With Electron/Node.js, how do I implement simple sequential code asynchronously? - javascript

I am working on a project where my Electron App interacts with a physical device using serial commands, via serialport. The app sends a string to the device, the device executes the command (which can take ~30s) and then sends back a string to signify completion and results from that operation.
My goal is to automate a series of actions. For that, basically the following needs to be done asynchronously, so that the render thread doesn't get blocked:
Start a loop
Send a string to the device
Wait until a specific response comes back
Tell the render thread about the response, so it can update the UI
Afterwards, repeat with the next string.
Actually, multiple different commands need to be sent in each loop cycle, and between each one the app has to wait for a specific string from the device.
This is kind of related to my last question, What's the correct way to run a function asynchronously in Electron?. From that, I know I should use web workers to run something asynchronously. However, my plan turned out to involve more problems than I anticipated, and I wanted to ask what would be a good way to implement this, having the whole plan in mind and not just a certain aspect of it.
I am especially not sure how to make the worker work with serialport. The serial device it needs to interact with is a child of the render process, so sending commands will probably be done over web worker messages. But I have no idea on how to make the worker wait for a specific response from the device.
(Since this question is of a more general nature, I am unsure whether I should provide some code snippets. If this is too general, I can try to write some pseudo code to make my problem more clear.)

I would go for a promise-based approach like this:
let promiseChain = Promise.resolve();

const waitForEvent = function () {
  return new Promise(resolve => {
    event.once("someEvent", eventData => {
      resolve(eventData);
    });
  });
};

while (someLoopCondition) {
  promiseChain = promiseChain
    .then(() => sendToSerialPort(someString))
    .then(waitForEvent)
    .then(result => {
      updateUI(result);
    });
}
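If your Electron/Node version supports async/await, the same pattern can also be written as a plain loop. Below is a minimal sketch, assuming the serialport package (v10+ style imports), a line-delimited protocol, and placeholder names for the port path, the command list, and updateUI:

// Minimal sketch, assuming serialport v10+ and a device that answers each
// command with one line; port path, commands, and updateUI are placeholders.
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');

const port = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 9600 });
const parser = port.pipe(new ReadlineParser({ delimiter: '\r\n' }));

// Resolve once the device sends a line containing the expected response.
function waitForResponse(expected) {
  return new Promise(resolve => {
    const onLine = line => {
      if (line.includes(expected)) {
        parser.off('data', onLine);
        resolve(line);
      }
    };
    parser.on('data', onLine);
  });
}

async function runSequence(commands) {
  for (const { command, expected } of commands) {
    port.write(command + '\n');                     // send the command string
    const result = await waitForResponse(expected); // wait for the device's reply
    updateUI(result);                               // e.g. an IPC message to the renderer
  }
}

The loop body awaits each response before sending the next command, so the sequence stays strictly ordered without blocking the process.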

Related

Handle Multiple Concurrent Requests for Express Server on Same Endpoint API

This question might be a duplicate, but I am still not getting the answer. I am fairly new to Node.js, so I might need some help. Many have said that Node.js is perfectly free to run incoming requests asynchronously, but the code below shows that if multiple requests hit the same endpoint, say /test3, the callback function will:
Print "test3"
Call setTimeout() to prevent blocking of event loop
Wait for 5 seconds and send a response of "test3" to the client
My question here is: if client 1 and client 2 call the /test3 endpoint at the same time, and assuming client 1 hits the endpoint first, client 2 has to wait for client 1 to finish before entering the event loop.
Can anybody here tell me whether it is possible for multiple clients to call a single endpoint and have the requests run concurrently rather than sequentially, something like a one-thread-per-connection analogy?
Of course, if I were to call another endpoint such as /test1 or /test2 while the code is still executing for /test3, I would still get the "test2" response from /test2 immediately.
app.get("/test1", (req, res) => {
console.log("test1");
setTimeout(() => res.send("test1"), 5000);
});
app.get("/test2", async (req, res, next) => {
console.log("test2");
res.send("test2");
});
app.get("/test3", (req, res) => {
console.log("test3");
setTimeout(() => res.send("test3"), 5000);
});
For those who have visited: it has got nothing to do with blocking of the event loop.
I have found something interesting. The answer to the question can be found here.
When I was using Chrome, the requests kept getting blocked after the first request. However, with Safari, I was able to hit the endpoint concurrently. For more details, look at the link below.
GET requests from Chrome browser are blocking the API to receive further requests in NODEJS
Run your application in cluster mode. Look up PM2.
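For illustration, a minimal sketch of that idea with Node's built-in cluster module (PM2 can manage this for you in cluster mode, e.g. pm2 start app.js -i max); the route is the one from the question:

// Minimal sketch of Node's cluster module: one worker per CPU core,
// each with its own event loop, all sharing the same listening port.
const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', worker => {
    console.log(`worker ${worker.process.pid} died, starting a new one`);
    cluster.fork();
  });
} else {
  const express = require('express');
  const app = express();

  app.get('/test3', (req, res) => {
    console.log(`test3 handled by worker ${process.pid}`);
    setTimeout(() => res.send('test3'), 5000);
  });

  app.listen(3000);
}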
This question needs more details to be answered and is clearly opinion-based, but since it rests on a strawman argument I will answer it.
First of all, we need to define "run concurrently". It is ambiguous: if we take the literal meaning, then in strict theory nothing RUNS CONCURRENTLY.
CPUs can only carry out one instruction at a time.
The speed at which the CPU can carry out instructions is called the clock speed, and it is controlled by a clock. With every tick of the clock, the CPU fetches and executes one instruction. The clock speed is measured in cycles per second, and 1 c/s is known as 1 hertz. This means that a CPU with a clock speed of 2 gigahertz (GHz) can carry out two billion (2000 million) cycles per second.
Can a CPU run multiple tasks "concurrently"?
Yes, you're right: nowadays computers, and even cell phones, come with multiple cores, which means the number of tasks running at the same time depends on the number of cores. But ask any expert, such as this Associate Staff Engineer (AKA me), and they will tell you it is very, very rare to find a server with more than one core. Why would you spend 500 USD on a multi-core server when you can spawn a whole bunch of ...nano or whatever option is available in the free trial... with Kubernetes?
Another thing: why would you handle/configure Node to be in charge of the routing? Let Apache and/or nginx worry about that.
As you mentioned, there is a thing called the event loop, which is a fancy way of naming a FIFO queue data structure.
So in other words: no, Node.js, like any other programming language out there, will not run your requests concurrently on a single core.
But it definitely depends on your infrastructure.

NodeJS Returning data to client browser

I think we need some help here. Thanks in advance.
I have been doing programming in .NET for desktop applications and have used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently, we switched over to NodeJS and find it pretty interesting. We could design a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I currently have to run a Timer on the client browser (when someone clicks a button) to check whether the file the Timer receives from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get the data from a server? We really don't want to block EventLoop. We run PowerShell scripts on NodeJS to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job so EventLoop is not blocked.
Here is an example of the code at NodeJs:
In the code below, can we insert something that won't block the EventLoop but will still send a response once the task is completed? As you can see, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the EventLoop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
//we check if the request is an AJAX one and if accepts JSON
if (request.xhr || request.accepts("json, html") === "json") {
var ThisAD = request.body.ThisAD
console.log(ThisAD);
ps.addCommand("./public/ps/GetUsers.PS1", [{
name: 'AllParaNow',
value: ScriptPara
}])
ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
ps.addCommand(`$rc`);
ps.invoke().then((output) => {
response.send({ message: output });
console.log(output);
});
}
});
Thank you.
The way you describe your problem isn't that clear. I had to read some of the comments in your initial question just to be sure I understood the issue. Honestly, you could just utilize various CSV NPM packages to read and write from your active directory with NodeJS.
I/O is non-blocking with NodeJS, so you're not actually blocking the EventLoop. NodeJS can handle multiple I/O requests at once: it hands the I/O off to the operating system (or to its worker thread pool) and continues execution on the main thread; when an I/O operation completes, its callback is queued and eventually run with the data, and program execution resumes from there. After you get the I/O data, you just send it back to the client through the response object. There should be no timers needed.
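Purely as an illustration of that non-blocking pattern, here is a minimal sketch that shells out to PowerShell through Node's child_process module and answers the HTTP request only when the script finishes; the paths, the powershell.exe invocation, and the way the parameter is passed are assumptions based on the question:

// Minimal sketch: run the PowerShell script as a child process so the
// event loop stays free, and answer the HTTP request when it finishes.
const { execFile } = require('child_process');
const fs = require('fs');
const path = require('path');

app.post('/LoadDomUsers', (request, response) => {
  const scriptPath = path.join(__dirname, 'public/ps/GetUsers.PS1');
  const csvPath = path.join(__dirname, 'public/TestData/AD/ADUsers.CSV');

  // execFile is asynchronous; Node keeps serving other requests meanwhile.
  execFile('powershell.exe', ['-File', scriptPath, request.body.ThisAD],
    (err, stdout, stderr) => {
      if (err) return response.status(500).send({ error: stderr || err.message });
      // Script has finished; read the CSV it produced and return it.
      fs.readFile(csvPath, 'utf8', (readErr, csv) => {
        if (readErr) return response.status(500).send({ error: readErr.message });
        response.send({ message: csv });
      });
    });
});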
So is the issue once the powershell script runs, you have to wait for that initial script to complete before being able to handle pending requests? I'm still a bit unclear...

Manually update web worker messages

I have a web worker in which I'm trying to run as little asynchronous code as possible.
I would like to have a while loop in my web worker while still allowing messages to be processed. Is there a way to manually update the event system in the browser? Or at least update the web worker's messages?
There appears to be something like this in Node.js (process._tickDomainCallback()) but so far I haven't found anything for web.
Using a setTimeout is not an option. I would like either a solution or a definitive answer that this is simply not possible.
// worker.js
self.onmessage = function(e) {
  console.log("Receive Message");
};

while (true) {
  UpdateMessages(); // Receive and handle incoming messages
  // Do other stuff
}
Not sure what your context is here. I had an issue where the browser-side Web Audio API event loop was getting interrupted at inopportune moments by WebSocket traffic coming in from my Node.js server, so I added a WebWorker middle layer to free the browser event loop from those interruptions.
The WebWorker handled all network traffic and populated a circular queue with data. When the browser event loop deemed itself available, it plucked data from this shared queue (using a Transferable Object buffer), and so it was never interrupted, since it was the one initiating calls to the WebWorker.
I feel your pain; however, this approach kept the event loop happy. Since a WebWorker is effectively on its own thread and, from the browser's point of view, is doing async work, there is no need to minimize async code.
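A minimal sketch of that shape (the message strings, the WebSocket, and consumeAudioData are placeholders): the worker buffers whatever arrives over the network, and the main thread asks for it only when it is ready, receiving the bytes as a transferred ArrayBuffer so nothing is copied.

// worker.js -- buffers network data; hands it over only when asked.
let pending = [];

self.onmessage = (e) => {
  if (e.data === 'give-me-data' && pending.length > 0) {
    // Concatenate queued chunks into one buffer and transfer it (zero-copy).
    const total = pending.reduce((n, b) => n + b.byteLength, 0);
    const out = new Uint8Array(total);
    let offset = 0;
    for (const buf of pending) {
      out.set(new Uint8Array(buf), offset);
      offset += buf.byteLength;
    }
    pending = [];
    self.postMessage(out.buffer, [out.buffer]);
  }
};
// Imagine a WebSocket living in the worker filling the queue:
// socket.onmessage = (msg) => pending.push(msg.data);

// main.js -- the browser side pulls data only when its own loop is free.
const worker = new Worker('worker.js');
worker.onmessage = (e) => consumeAudioData(e.data); // placeholder consumer
function pollWorker() {
  worker.postMessage('give-me-data');
  requestAnimationFrame(pollWorker); // ask again on the next frame
}
pollWorker();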

Node.js API to spawn off a call to another API

I created a Node.js API.
When this API gets called I return to the caller fairly quickly. Which is good.
But now I also want the API to call or launch a different API or function or something that will go off and run on its own. Kind of like calling a child process with child.unref(). In fact, I would use child.spawn(), but I don't see how to have spawn() call another API. Maybe that alone would be my answer?
Of this other process, I don't care if it crashes or finishes without error.
So it doesn't need to be attached to anything. But if it does remain attached to the Node.js console, that's icing on the cake.
I'm still thinking about how to detect, and what to do, if the spawned process somehow gets caught up running for a really long time. But I'm not ready to cross that bridge yet.
Your thoughts on what I might be able to do?
I guess I could child.spawn('node', [somescript])
What do you think?
I would have to explore if my cloud host will permit this too.
You need to specify exactly what the other spawned thing is supposed to do. If it is calling an HTTP API, with Node.js you should not launch a new process to do that. Node is built to run HTTP requests asynchronously.
The normal pattern, if you really need some stuff to happen in a different process, is to use something like a message queue, the cluster module, or other messaging/queue between processes that the worker will monitor, and the worker is usually set up to handle some particular task or set of tasks this way. It is pretty unusual to be spawning another process after receiving an HTTP request since launching new processes is pretty heavy-weight and can use up all of your server resources if you aren't careful, and due to Node's async capabilities usually isn't necessary especially for things mainly involving IO.
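If the thing you want to kick off really is just another HTTP API, a rough sketch of the in-process, fire-and-forget version looks like this (assuming an Express router is already set up; the target host and path are placeholders):

// Minimal sketch: answer the caller right away, then let the HTTP call to the
// other API run asynchronously in the same process -- no child process needed.
const https = require('https');

router.put('/test', (req, res) => {
  res.sendStatus(200); // respond to the original caller immediately

  // Fire-and-forget request to some other API (placeholder host/path).
  const request = https.request(
    { hostname: 'other-api.example.com', path: '/long-task', method: 'POST' },
    (response) => {
      let body = '';
      response.on('data', chunk => { body += chunk; });
      response.on('end', () => console.log('other API finished:', body));
    }
  );
  request.on('error', err => console.error('other API call failed:', err));
  request.end(JSON.stringify({ u: req.body.u }));
});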
This is from a test API I built some time ago. Note I'm even passing a value into the script as a parameter.
router.put('/test', function (req, res, next) {
  var u = req.body.u;
  var cp = require('child_process');
  var c = cp.spawn('node', ['yourtest.js', '"' + u + '"'], { detached: true });
  c.unref();
  res.sendStatus(200);
});
The yourtest.js script can be just about anything you want it to be. But I found I learned more by first treating the script as a Node.js console app: FIRST get your yourtest.js script to run without error by manually running/testing it from your console's command line (node yourtest.js yourParameterValue), THEN integrate it into the child.spawn() call.
var u = process.argv[2];
console.log('f2u', u);

function f1() {
  console.log('f1-hello');
}

function f2() {
  console.log('f2-hello');
}

setTimeout(f2, 3000); // wait 3 seconds before executing f2(). I do this just for troubleshooting; you can watch node.exe open and then close in Task Manager if node.exe runs long enough.
f1();

Is google apps script synchronous?

I'm a Java developer learning JavaScript and Google Apps Script simultaneously. Being a newbie, I learned the syntax of JavaScript, not how it actually worked, and I happily hacked away in Google Apps Script, writing code sequentially and synchronously, just like Java. All my code resembles this (grossly simplified to show what I mean):
function doStuff() {
  var url = 'https://myCompany/api/query?term<term&search';
  var json = getJsonFromAPI(url);
  Logger.log(json);
}

function getJsonFromAPI(url) {
  var response = UrlFetchApp.fetch(url);
  var json = JSON.parse(response);
  return json;
}
And it works! It works just fine! If I didn't keep on studying JavaScript, I'd say it works like clockwork. But JavaScript isn't clockwork; it's gloriously asynchronous, and from what I understand, this should not work at all. It would "compile", but logging the json variable should log undefined; instead, it logs the JSON with no problem.
NOTE:
The code is written and executed in the Google Sheet's script editor.
Why is this?
While Google Apps Script implements a subset of ECMAScript 5, there's nothing forcing it to be asynchronous.
While it is true that JavaScript's major power is its asynchronous nature, the Google developers appear to have given that up in favor of a simpler, more straightforward API.
UrlFetchApp methods are synchronous. They return an HttpResponse object, and they do not take a callback. That, apparently, is an API decision.
Please note that this hasn't really changed since the introduction of V8 runtime for google app scripts.
While we are on the latest and greatest version of ECMAScript, running a Promise.all(func1, func2) I can see that the code in the second function is not executed until the first one is completed.
Also, there is still no setTimeout() global function to use in order to branch the order of execution. Nor do any of the APIs provide callback functions or promise-like results. Seems like the going philosophy in GAS is to make everything synchronous.
I'm guessing that, from Google's point of view, processing two tasks in parallel (for example, two tasks that simply call Utilities.sleep(3000)) would require multiple threads on the server CPU, which may not be manageable and may be easy to abuse.
Whereas parallel processing on the client or on another company's server (e.g., Node.js) is up to that developer or user. (If it doesn't scale well, it's not Google's problem.)
However, there are some things that do use parallelism:
UrlFetchApp.fetchAll
UrlFetchApp.fetchAll() will asynchronously fetch many urls. Although this is not what you're truly looking for, fetching urls is a major reason to seek parallel processing.
I'm guessing Google's reasoning is that this is OK since fetchAll uses a web client and its own resources are already protected by quota.
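A minimal sketch of fetchAll (the URLs are placeholders): the requests are issued in parallel and you get back one HTTPResponse per request, in the same order.

function fetchInParallel() {
  var requests = [
    { url: 'https://example.com/api/one', muteHttpExceptions: true },
    { url: 'https://example.com/api/two', muteHttpExceptions: true }
  ];
  // fetchAll issues the requests in parallel and returns the responses in order.
  var responses = UrlFetchApp.fetchAll(requests);
  responses.forEach(function (response) {
    Logger.log(response.getContentText());
  });
}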
FirebaseApp getAllData
Firebase, I have found, is very fast compared to using a spreadsheet for data storage. You can get many things from the database at once using FirebaseApp's getAllData:
function myFunction() {
  var baseUrl = "https://samplechat.firebaseio-demo.com/";
  var secret = "rl42VVo4jRX8dND7G2xoI";
  var database = FirebaseApp.getDatabaseByUrl(baseUrl, secret);
  // paths of 3 different user profiles
  var path1 = "users/jack";
  var path2 = "users/bob";
  var path3 = "users/jeane";
  Logger.log(database.getAllData([path1, path2, path3]));
}
HtmlService - IFrame mode
HtmlService in IFrame mode allows full multi-tasking by going out to client-side script, where promises are truly supported, and making parallel calls back into the server. You can initiate this process from the server, but since all the parallel tasks' results are returned on the client, it's unclear how to get them back to the server. You could make another server call and send the results, but I'm thinking the goal would be to get them back to the script that called HtmlService in the first place, unless you go with a beginRequest and endRequest type architecture.
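To illustrate the round trip, here is a rough sketch of the client side (the server function names taskOne, taskTwo, and collectResults are invented for the example). This code lives inside the HTML file served by HtmlService; it fires two server calls in parallel, waits for both with Promise.all, and then hands the combined results back to the server with one more call.

// Client-side script inside the HtmlService page (sketch; server function names are invented).
function callServer(fnName) {
  return new Promise(function (resolve, reject) {
    google.script.run
      .withSuccessHandler(resolve)
      .withFailureHandler(reject)[fnName]();
  });
}

// Run two server functions in parallel, then send the combined results back to the server.
Promise.all([callServer('taskOne'), callServer('taskTwo')])
  .then(function (results) {
    google.script.run.collectResults(results);
  });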
tanaikech/RunAll
This is a library for running concurrent processing using only native Google Apps Script (GAS). It claims full support via a RunAll.Do(workers) method.
I'll update my answer if I find any other tricks.
