Data stream handlers and CLI incorporation in Node.js

I am trying to create an interactive CLI that can run serial commands. I have two files, serialcomms.js and cli.js. serialcomms.js contains the connection, handlers, and command functions; cli.js contains the commander setup.
My issue is that I can only call the send command once, because the listeners/handlers from the serialcomms file take over. What would be the best method to loop the CLI program so I can call the send command over and over again, while keeping the serial handlers running and writing their output to stdout? Would I need to use a child process? Or recursion, having the CLI call itself?
Example behavior I am expecting, with an echo bot on the other end of the serial line:
Send hello
hello
Send Bye
Bye
Behavior I am experiencing:
Send hello
hello
endless wait
Here is my serialcomms.js:
const SerialPort = require('serialport');
const ReadLine = require('@serialport/parser-readline');
let portName = process.argv[2] ? `/dev/pts/${process.argv[2]}` : '/dev/pts/6';
let baudRate = Number(process.argv[3]) || 115200;
let myPort = new SerialPort(portName, {baudRate: baudRate})
let parser = myPort.pipe(new ReadLine({ delimiter: '\n' }))
myPort.on('open', () => {
console.log(`port ${portName} open`)
})
parser.on('data', (data) => {
console.log(data)
})
myPort.on('close', () => {
console.log(`port ${portName} closed`)
})
myPort.on('error', (err) => {
console.error('port error: ' + err)
})
function send(data){
myPort.write(JSON.stringify(data)+'\n', function(err) {
if (err) {
return console.log('Error on write: ', err.message);
}
console.log(`${data} sent`);
});
}
module.exports = {
send
}
Here is my cli.js file:
const program = require('commander');
const {send} = require('./serialcomms');
program
.version('1.0.0')
.description('Serial Tester')
program
.command('send <msg>')
.alias('s')
.description('send a message over serial')
.action((msg)=>{
send(msg)
})
program.parse(process.argv)
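For reference (this is not from the original post, just a sketch of one common approach): instead of a child process or recursion, the process can be kept alive with Node's built-in readline module, dispatching each typed line while the serial handlers from serialcomms.js stay registered. The file name repl.js and the command words here are made up for illustration.
// repl.js (hypothetical): an interactive loop on top of serialcomms.js
const readline = require('readline');
const { send } = require('./serialcomms');

const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
rl.setPrompt('> ');
rl.prompt();

rl.on('line', (line) => {
  const [cmd, ...rest] = line.trim().split(/\s+/);
  if (cmd === 'send' || cmd === 's') {
    send(rest.join(' ')); // serial 'data' events keep printing via the parser handler
  } else if (cmd === 'exit') {
    rl.close(); // stop reading input (the serial port would still need closing)
    return;
  }
  rl.prompt();
});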

Related

How to pass stream input/output between a main process and a child process in Node.js

I'm trying to get two processes to "talk" to each other via stdio:
ping.js
import { readline } from '../readline';
import { sleep } from '../sleep';
import { spawn } from 'child_process';
const spawnPongProcess = () => {
const child = spawn('node',
['-r', 'esm', `${__dirname}/pong.js`],
{ stdio: 'pipe' });
child.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
child.on('close', (code) => {
console.error(`child process exited with code ${code}`);
process.stdin.resume();
process.stdout.resume();
});
child.on('error', (err) => {
console.error(`child process error: ${err}`);
});
process.stdout.pipe(child.stdin);
child.stdout.pipe(process.stdin);
return child;
};
export const ping = async () => {
const child = spawnPongProcess();
await sleep(1000);
console.log('ping');
let pong = readline();
console.error(`Ping received: ${pong}`);
child.kill();
};
I pipe the parent process' stdout to the child process stdin and the child process stdout to the parent process stdin in an effort to allow the processes to communicate via stdio.
pong.js
import { readline } from '../readline';
import { sleep } from '../sleep';
const pong = async () => {
console.log(`Pong initiated and waiting for input.`);
let ping = readline();
console.log(`Pong received: ${ping}`);
process.exit();
};
pong();
readline.js
import { question } from 'readline-sync';
export const readline = question;
sleep.js
export const sleep = (ms) => {
return new Promise((resolve) => setTimeout(resolve, ms));
};
The output is:
$ node -r esm src/index.js
Pong initiated and waiting for input.
ping
It appears that the output from the parent process (ping) is not getting through to the child process (pong). Any ideas on how to make it work?
You piped your process's stdout (a Writable) into the child's stdin (also a Writable), and vice versa. Data arrives on the parent's stdin (a Readable), so that is what should be piped into the child, and the child's stdout should be piped into the parent's stdout:
process.stdin.pipe(child.stdin);
child.stdout.pipe(process.stdout);
Your code doesn't throw because when stdout is attached to a terminal it is actually a Duplex stream, so piping from it is allowed.
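Applied to the original spawnPongProcess, the corrected piping would look roughly like this (a sketch only, assuming the same imports as ping.js above):
const spawnPongProcess = () => {
  const child = spawn('node', ['-r', 'esm', `${__dirname}/pong.js`], { stdio: 'pipe' });

  child.stderr.on('data', (data) => console.error(`stderr: ${data}`));
  child.on('close', (code) => console.error(`child process exited with code ${code}`));
  child.on('error', (err) => console.error(`child process error: ${err}`));

  // The parent's stdin (Readable) feeds the child's stdin (Writable) ...
  process.stdin.pipe(child.stdin);
  // ... and the child's stdout (Readable) feeds the parent's stdout (Writable).
  child.stdout.pipe(process.stdout);

  return child;
};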

How do I handle asynchronous calls when running newman (Postman) collections with node?

I could do this with bash, but I'm trying to learn Node and would like to do it from there. How do I make the newman run call synchronous? I don't really understand the use of async/await (if that is what is required here). I have the following script that loops over a bunch of collection files (each containing multiple requests) and calls newman run on each of them:
// node imports
const fs = require('fs');
const newman = require('newman');
// test variables
const testFolder = './api-tests/';
// read all files in the test folder
fs.readdirSync(testFolder).forEach(file => {
console.log('Running file: ' + file);
// run newman using the file
newman.run({
collection: require(testFolder + file),
delayRequest: 500,
iterationData: [
{
'host': 'localhost',
'port': '8080'
}
],
reporters: ['cli', 'html']
}, (err, summary) => {
if (err) {
throw err;
}
console.log(file + ' run complete');
});
});
Newman starts executing every file immediately, rather than waiting for each run to finish before the loop moves on to the next file.
Thanks.
You can use deasync: https://github.com/abbr/deasync
var done = false;
fs.readdirSync(testFolder).forEach(file => {
newman.run({
...
}).on('start', function (err, args) { // on start of run, log to console
console.log('running a collection...');
}).on('done', function (err, summary) {
...
done = true;
});
require('deasync').loopWhile(function(){return !done;});
done = false;
});
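Alternatively, if you'd rather avoid deasync, newman.run can be wrapped in a Promise and awaited, which makes the loop run one collection at a time. A sketch (not from the original answer; it reuses the option values from the question):
const fs = require('fs');
const newman = require('newman');

const testFolder = './api-tests/';

// Wrap the callback-style newman.run in a Promise so it can be awaited.
function runCollection(file) {
  return new Promise((resolve, reject) => {
    newman.run({
      collection: require(testFolder + file),
      delayRequest: 500,
      iterationData: [{ 'host': 'localhost', 'port': '8080' }],
      reporters: ['cli', 'html']
    }, (err, summary) => (err ? reject(err) : resolve(summary)));
  });
}

async function runAll() {
  // A plain for...of loop awaits each run before starting the next one.
  for (const file of fs.readdirSync(testFolder)) {
    console.log('Running file: ' + file);
    await runCollection(file);
    console.log(file + ' run complete');
  }
}

runAll().catch((err) => { throw err; });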

How to synchronously load a csv file into memory before handling HTTP requests

I am trying to write a simple express/node.js app that responds to GET requests using data found in a csv file. I would like to read this csv file to generate a javascript object (essentially a key-value mapping), and then make that generated map available for the HTTP request handling logic in the controller.
I wrote a module that reads the csv files and exports the desired objects, but I'm not sure how to ensure that:
1. this operation completes and the objects actually exist before HTTP requests are handled, and
2. the file operation is performed only a single time when the server starts up, not once per request, which would incur massive overhead.
How can I organize my code to meet these goals in the context of an express app?
This is how I am processing the CSV file:
var myMap = {};
fs.createReadStream('filename.csv')
.pipe(csv())
.on('data', (row) => {
// Build javascript object
myMap[row['key']] = row['value'];
})
.on('end', () => {
console.log('Done.');
});
// Does this work?
module.exports = myMap;
How about ensuring the http server only starts listening after the file is loaded into memory:
// server.js
var myMap = {};
function readCsv(cb){
fs.createReadStream('filename.csv')
.pipe(csv())
.on('data', (row) => {
// Build javascript object
myMap[row['key']] = row['value'];
})
.on('end', () => {
console.log('Done.');
cb();
});
}
var app = express();
module.exports = Object.freeze({
server: http.createServer(app),
init(){
readCsv(() => {
this.server.listen(80)
})
}
})
Something like that.
You can also utilize a Promise:
// server.js
var myMap = {};
function readCsv(){
return new Promise((resolve, reject) => {
fs.createReadStream('filename.csv')
.pipe(csv())
.on('data', (row) => {
// Build javascript object
myMap[row['key']] = row['value'];
})
.on('end', () => {
console.log('Done.');
resolve();
})
.on('error', reject)
})
}
var app = express();
module.exports = Object.freeze({
server: http.createServer(app),
init(){
return readCsv().then(() => {
this.server.listen(80)
})
}
})
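A consumer of that module (a hypothetical index.js, not part of the original answer) would then wait for init() to resolve before relying on the server being up:
// index.js (hypothetical)
const server = require('./server');

server.init().then(() => {
  console.log('CSV loaded into memory, server listening on port 80');
});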
I would look for a more synchronous-style way to read the file and then handle the HTTP request. Here is sample code of what it could look like:
import fs from 'fs';
async function processCSV() {
try {
let map = await readCsv();
//handle http request in another function with same async await way
let http = await processHttpRequest(map);
// process the http response
} catch (e) {
console.log('e', e);
}
}
function readCsv()
{
return new Promise((resolve, reject) => {
let myMap = {};
fs.createReadStream('filename.csv')
.pipe(csv())
.on('data', (row) => {
// Build javascript object
myMap[row['key']] = row['value'];
})
.on('end', () => {
console.log('Done.');
resolve(myMap);
})
.on('error', reject);
});
}
async function processHttpRequest(map)
{
try
{
let reqres = await httpRequest(map); // your own function that performs the HTTP request
}
catch (e)
{
}
}
processCSV();
In order to meet both of your goals, you can include the code in the app.js file. app.js only runs when the express server starts; it doesn't reload on page refresh. You can run app.listen after the read stream ends.
var myMap = {};
fs.createReadStream('filename.csv')
.pipe(csv())
.on('data', (row) => {
// Build javascript object
myMap[row['key']] = row['value'];
})
.on('end', () => {
app.listen(port, () => console.log(`Example app listening on port ${port}!`));
});
However, since I don't think you're going to have a lot of data, it's better to use synchronous (blocking) methods for both the csv parser and the file reader. This just makes it easier to understand. I use csv-parse below.
const express = require('express')
const fs = require('fs')
const parse = require('csv-parse/lib/sync')
const app = express()
const port = 3000
/* In this example assume myMap will be:
 * "key_1","key_2"
 * "value 1","value 2"
 */
var myMap = fs.readFileSync('sample.csv', 'utf8');
/* parsing the csv will return:
 * [ { key_1: "value 1", key_2: "value 2" } ]
 */
const records = parse(myMap, {
columns: true,
skip_empty_lines: true
})
app.get('/', (req, res) => res.send('Hello World!' + records[0].key_1))
app.listen(port, () => console.log(`Example app listening on port ${port}!`))
Update: use https://csv.js.org/parse/ instead; the package below is deprecated and no longer maintained.
Deprecated:
Hi, I have created an npm package to read CSV synchronously or as a promise:
https://www.npmjs.com/package/csv-parser-sync-plus-promise
Description:
csv-parser-sync-plus-promise
A module to read csv synchronously or as promise
Features
Read any csv synchronously or as a promise. The choice is yours.
Usage
let parser = require('csv-parser-sync-plus-promise')
// for sync
let a=parser.readCsvSync('<filepath>')
// for promise
let b=parser.readCsvPromise('<filepath>')
Note: <filepath> can be either a fully qualified or a relative path.
Errors
All errors will be printed as console.error and the process will exit with exit code 222

How do you create a terminal instance within a NodeJS child process?

I am setting up a discord channel to function as an SSH terminal. A NodeJS server will provide the connection. A custom command will spawn a new terminal instance, which can then be used as a shell.
I don't know how to spawn a terminal within a child process. I have tried using the screen and bash commands to no avail.
I am using CentOS 7.
// Code For Discord
var $discord = {
currentInterface: null,
send: (data) => {
/* some code that sends data to a discord channel */
},
receive: (data) => {
// Send Data To Terminal
if ($discord.currentInterface) {
$discord.currentInterface.send(data);
} else {
$discord.send('**Error:** Terminal has not been spawned.');
}
},
command: (name, args) => {
// Receive Discord Commands
switch (name) {
case 'spawn':
$discord.currentInterface = $interface();
break;
}
}
};
// Create Interface
var $interface = function () {
// Define object
let x = {
terminal: child_process.spawn('screen'),
send: (data) => {
// Send Input to Terminal
x.process.stdin.write(data + '\n');
},
receive: (data) => {
// Send Output to Discord
$discord.send(data);
}
};
// Process Output
x.terminal.on('stdout', (data) => {
x.receive(data);
});
// Process Errors
x.terminal.on('stderr', (error) => {
x.receive(`**Error:**\n${error}`);
});
// Return
return x;
};
The problem lies with creating the terminal itself. How do you create an SSH-style shell within a child process?
After realizing how much of an idiot I really am, I found a solution...
// Import Modules
const fs = require('fs');
const child_process = require('child_process');
// Create Interface
var interface = {
terminal: child_process.spawn('/bin/sh'),
handler: console.log,
send: (data) => {
interface.terminal.stdin.write(data + '\n');
},
cwd: () => {
let cwd = fs.readlinkSync('/proc/' + interface.terminal.pid + '/cwd');
interface.handler({ type: 'cwd', data: cwd });
}
};
// Handle Data
interface.terminal.stdout.on('data', (buffer) => {
interface.handler({ type: 'data', data: buffer });
});
// Handle Error
interface.terminal.stderr.on('data', (buffer) => {
interface.handler({ type: 'error', data: buffer });
});
// Handle Closure
interface.terminal.on('close', () => {
interface.handler({ type: 'closure', data: null });
});
Usage...
interface.handler = (output) => {
let data = '';
if (output.data) data += ': ' + output.data.toString();
console.log(output.type + data);
};
interface.send('echo Hello World!');
// Returns: data: Hello World!
interface.send('cd /home');
interface.cwd();
// Returns: cwd: /home
interface.send('abcdef');
// Returns: error: bin/sh: line 2: abcdef: command not found
interface.send('exit');
// Returns: exit
I'd take a look at the documentation for child_process.execFile. There's an option to enable a shell, but it's disabled by default.
There's also this approach if you want to try setting up a batch script. It's set up for Windows and the answer isn't set up for passing arguments, but you should be able to adapt it fairly easily.
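For reference, here is roughly what that shell option looks like (my own minimal example, not from the linked answer):
const { execFile } = require('child_process');

// shell defaults to false, which executes the file directly.
// With shell: true the command string is run through /bin/sh
// (cmd.exe on Windows), so shell syntax such as && becomes available.
execFile('echo hello && echo done', { shell: true }, (err, stdout, stderr) => {
  if (err) return console.error(err);
  process.stdout.write(stdout); // "hello\ndone\n"
});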

Calling child_process with Node.js vs calling child process from C and creating a C++ bind to call from node.js

I would like to call pdftotext to extract the content of 100,000 files (and I need it to be fast), so which of these two implementations would be the fastest?
Implementation 1:
Create a child_process from node.js, for every extraction:
import { spawn } from 'child_process'
export default (file) => new Promise((resolve, reject) => {
const command = 'pdftotext'
const args = ['-layout', '-enc', 'UTF-8', file, '-']
const process = spawn(command, args)
const stdout = []
const stderr = []
process.stdout.on('data', (data) => {
stdout.push(data)
})
process.stderr.on('data', (data) => {
stderr.push(data)
})
process.on('error', (error) => {
if (error.code === 'ENOENT')
error.message = 'pdftotext is not installed, so will be unable to extract the file content'
reject(error)
})
process.on('close', () => {
if (stderr.length)
return reject(stderr.map(Error))
resolve(stdout.join(''))
})
})
Implementation 2:
Create a child_process from C, and create a C++ binding to call from node.js
-- Without code because I'm still learning how to do it --
Most likely the process-invocation code will have a negligible impact on performance; the speed of document processing depends on the pdftotext implementation and on disk I/O. So I guess there is no point in bothering with a custom process launcher.
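If it helps, here is a rough sketch of keeping that disk I/O bounded by running only a handful of pdftotext processes at a time (my own addition, not from the answer above; extractText stands in for the Promise-returning function of Implementation 1, and the concurrency value is a guess to be tuned):
import extractText from './extract-text' // hypothetical path to Implementation 1

const CONCURRENCY = 8 // assumption: tune to your CPU and disk

async function extractAll(files) {
  const results = []
  // Work through the file list in fixed-size batches so only a few
  // pdftotext child processes are alive at any one time.
  for (let i = 0; i < files.length; i += CONCURRENCY) {
    const batch = files.slice(i, i + CONCURRENCY)
    results.push(...(await Promise.all(batch.map(extractText))))
  }
  return results
}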
