Passing a CSV file as a command line argument in Node.js - javascript

I can't for the life of me think of a way to do this. I've previously worked with importing CSV files into js files, but for this challenge I've got to create a js file that is executable from a shell, with the data file passed as input.
Any ideas how this can be done?
~The challenge description~
The program must be executable from a shell and take the CSV file as input. For example:
node score_calculator.js data.csv

You can take the file path as a command line argument when running your Node.js app, like:
node myScript.js pathToCsvFile
Now you'll have to get this path in your code as:
var filePath = process.argv[2];
and use that path to read your file:
const fs = require('fs');

fs.readFile(filePath, 'utf8', (err, data) => {
  if (err) throw err;
  console.log(data);
});
Read more about file handling in the Node.js fs documentation.
Hope this helps
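Putting the two together, a minimal score_calculator.js could look roughly like this (a sketch; the naive comma split is an assumption, adjust the parsing to your CSV's actual format):

const fs = require('fs');

const filePath = process.argv[2];
if (!filePath) {
  console.error('Usage: node score_calculator.js <path-to-csv>');
  process.exit(1);
}

// Read the whole file as text and split it into rows and columns.
// This simple split assumes no quoted fields containing commas.
const rows = fs.readFileSync(filePath, 'utf8')
  .trim()
  .split('\n')
  .map(line => line.split(','));

console.log(rows);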

Sounds like you are trying to figure out how to pass and read parameters.
You could do node score_calculator.js data.csv
then in your js:
const csv = process.argv[2];
However, if you are passing in parameters I would recommend using minimist; then you can do
node score_calculator.js --file=data.csv
and your js file would be:
const argv = require('minimist')(process.argv.slice(2));
const csv = argv.file;
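As a side note, newer Node.js versions (16.17+ / 18.3+) also ship a built-in argument parser, so minimist isn't strictly required; a small sketch of the same thing with util.parseArgs:

const { parseArgs } = require('node:util');

// node score_calculator.js --file=data.csv
const { values } = parseArgs({
  options: {
    file: { type: 'string' },
  },
});
const csv = values.file;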

Related

Node.JS spawning Python to process binary image gives error 'node error spawn ENAMETOOLONG'

I'm trying to save an image by spawning a Python process from a Node.js script. The image is passed as binary data from the Node.js script to the Python process.
The script index.js spawns a Python script, my_py.py, then passes it the binary image data. The Python script my_py.py captures the binary image data and saves it to the directory assets/media.
The problem is the image isn't saved to the directory and I get the error spawn ENAMETOOLONG.
Could you please help me spot the problem and to fix the code?
Thanks in advance!
index.js:
const spawn = require("child_process").spawn;
const fs = require('fs');

let params = {
  "image": readDataset()
}

// Launch python app
const pythonProcess = spawn('py', ['my_py.py', JSON.stringify(params)]);

// Print result
pythonProcess.stdout.on("data", (data) => {
  console.log(data.toString());
});

// Read errors
pythonProcess.stderr.on("data", (data) => {
  console.log(data.toString());
});

function readDataset() {
  try {
    return fs.readFileSync('color.png', 'binary');
  }
  catch (err) {
    return err;
  }
}
my_py.py:
import sys, json
params = json.loads(sys.argv[1]) # Load passed arguments
import numpy as np
import cv2
import os
fileName = "image.png" # file name
fileData = params['image'] # Capture file
# Convert Image to Numpy as array
img = np.array(fileData)
# Save file to local directory
cv2.imwrite(os.path.join('assets/media/', f'{fileName}'), img)
cv2.waitKey(0)
# Results must return valid JSON Object
print("File saved!")
sys.stdout.flush()
The error says it all: the command you are constructing is too long.
The thing you need to be aware of is that operating systems limit how long a command can be. For Linux this is around 4096 bytes (though you can modify this value), for Windows it is 8191 bytes, and for Mac OS it is around 250k bytes.
Part of the reason for this is that these OSes are written in C/C++, and in C/C++ code, not enforcing a buffer size limit is an invitation to buffer overruns (the infamous stack overflow or underflow bug!). Truly unlimited input would also result in slower code, because you could no longer use simple arrays for the input buffer and would need more complicated data structures.
Additionally, not having any limit on command length is a vector for DoS attacks: if you run an OS that has no command size limit and I know for sure you have 32 GB of RAM, all I need to do to crash your system is construct a 32 GB command!
TLDR
The correct way to pass large data between processes is the same thing you already do over the internet: upload/download it as a stream. The simplest implementation here is to pass the data via the stdin of the Python process you spawned:
const spawn = require("child_process").spawn;
const fs = require('fs');

let params = {
  "image": readDataset()
}

// Launch python app
const pythonProcess = spawn('py', ['my_py.py']);

// Pass image data
pythonProcess.stdin.write(JSON.stringify(params) + '\n');

// Print result
pythonProcess.stdout.on("data", (data) => {
  console.log(data.toString());
});

// Read errors
pythonProcess.stderr.on("data", (data) => {
  console.log(data.toString());
});

function readDataset() {
  try {
    return fs.readFileSync('color.png', 'base64');
  }
  catch (err) {
    return err;
  }
}
In the code above I end the JSON "packet" with a newline so that the Python code can read until end of line to get a single packet. You can use any convention you like; for example, I also often use the nul character (0x00) to mark the end of a packet, and HTTP uses two newlines ('\n\n') to mark the end of its headers, etc.
In any case, I read the image file as base64 because raw binary data is invalid in JSON. In your Python code you can base64-decode it to get the image back. Additionally, base64 does not include the newline character ('\n') in its alphabet, so converting the image to base64 ensures you don't get a newline inside your JSON data.
To get the image just read from stdin:
import sys, json, base64

# Read one newline-terminated JSON "packet" from stdin
params = json.loads(sys.stdin.readline())

import numpy as np
import cv2
import os

fileName = "image.png" # file name
fileData = base64.b64decode(params['image']) # decode the base64 payload back to PNG bytes

# Decode the PNG bytes into a NumPy image array
img = cv2.imdecode(np.frombuffer(fileData, np.uint8), cv2.IMREAD_COLOR)

# Save file to local directory
cv2.imwrite(os.path.join('assets/media/', fileName), img)

# Results must return valid JSON Object
print("File saved!")
sys.stdout.flush()
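If the Python side ever needs to send back more than a single line, the same newline-delimited convention works in the other direction too. A sketch of reading line-delimited JSON replies on the Node side (assuming the Python script prints one JSON object per line):

const { spawn } = require('child_process');
const readline = require('readline');

const pythonProcess = spawn('py', ['my_py.py']);

// Read the child's stdout one line at a time; each line is one JSON "packet"
const rl = readline.createInterface({ input: pythonProcess.stdout });

rl.on('line', (line) => {
  const reply = JSON.parse(line);
  console.log('Python replied:', reply);
});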

Node.js 'fs' throws an ENOENT error after adding auto-generated Swagger server code

Preamble
To start off, I'm not a developer; I'm just an analyst / product owner with time on their hands. While my team's actual developers have been busy finishing off projects before year-end I've been attempting to put together a very basic API server in Node.js for something we will look at next year.
I used Swagger to build an API spec and then used the Swagger code generator to get a basic Node.js server. The full code is near the bottom of this question.
The Problem
I'm coming across an issue when writing out to a log file using the fs module. I know that the ENOENT error is usually down to just specifying a path incorrectly, but the behaviour doesn't occur when I comment out the Swagger portion of the automatically generated code. (I took the logging code directly out of another tool I built in Node.js, so I'm fairly confident in that portion at least...)
When executing npm start, a few debugging items write to the console:
"Node Server Starting......
Current Directory:/mnt/c/Users/USER/Repositories/PROJECT/api
Trying to log data now!
Mock mode: disabled
PostgreSQL Pool created successfully
Your server is listening on port 3100 (http://localhost:3100)
Swagger-ui is available on http://localhost:3100/docs"
but then fs throws an ENOENT error:
events.js:174
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '../logs/logEvents2021-12-24.log'
Emitted 'error' event at:
at lazyFs.open (internal/fs/streams.js:277:12)
at FSReqWrap.args [as oncomplete] (fs.js:140:20)
Investigating
Now normally, from what I understand, this would just mean I've got the paths wrong. However, the file has actually been created, and the first line of the log file has been written just fine.
My next thought was that I must've set the fs flags incorrectly, but it was set to 'a' for append:
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', {flags: 'a'}, (err) => {
  console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
Removing Swagger Code
Now here's the weird bit: if I remove the Swagger code, the log files write out just fine and I don't get the fs exception!
This is the specific Swagger code:
// swaggerRouter configuration
var options = {
  routing: {
    controllers: path.join(__dirname, './controllers')
  },
};

var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();

// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
  console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
  console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
When I comment out this code, the log file writes out just fine.
The only thing I can think of that might be happening is that Swagger is somehow modifying (?) the app's working directory, so that fs no longer finds the same path?
Full Code
'use strict';

var path = require('path');
var fs = require('fs');
var http = require('http');
var oas3Tools = require('oas3-tools');
var serverPort = 3100;

// I specifically tried using path.join, which I found when investigating this issue,
// and referencing the app path, but to no avail
const __logdir = path.join(__dirname, './logs');

// These are date and time functions I use to add timestamps to the logs
function dateNow() {
  var dateNow = new Date().toISOString().slice(0, 10).toString();
  return dateNow
}
function rightNow() {
  var timeNow = new Date().toTimeString().slice(0, 8).toString();
  return "[" + timeNow + "] "
};

console.info("Node Server Starting......");
console.info("Current Directory: " + __dirname)

// Here I create the WriteStreams
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', {flags: 'a'}, (err) => {
  console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
var errorsFile = fs.createWriteStream(__logdir + "/errorEvents" + dateNow() + '.log', {flags: 'a'}, (err) => {
  console.error('Could not write new Error Log File to location: %s \nWith error description: %s', __logdir, err);
});

// And create an additional console to write data out:
const Console = require('console').Console;
var logOut = new Console(logsFile, errorsFile);

console.info("Trying to log data now!") // Debugging logging
logOut.log("========== Server Startup Initiated ==========");
logOut.log(rightNow() + "Server Directory: " + __dirname);
logOut.log(rightNow() + "Logs directory: " + __logdir);

// Here is the Swagger portion that seems to create the behaviour.
// It is unedited from the Swagger Code-Gen tool

// swaggerRouter configuration
var options = {
  routing: {
    controllers: path.join(__dirname, './controllers')
  },
};

var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();

// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
  console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
  console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
In case it helps, this is the project's file structure. I am running this project within a WSL instance in VSCode on Windows, same as I have with other projects using fs.
Is anyone able to help me understand why fs can write the first log line but then break once the Swagger code gets going? Have I done something incredibly stupid?
Appreciate the help, thanks!
Edit: Tried to fix broken images.
Found the problem with some help from a friend. The issue boiled down to a lack of understanding of how the Swagger module works in the background, so this will likely be eye-rollingly obvious to most, but keeping this post around in case anyone else comes across this down the line.
So it seems that as part of the Swagger initialisation, any scripts within the utils folder will also be executed. I would not have picked up on this if it wasn't pointed out to me that in the middle of the console output there was a reference to some PostgreSQL code, even though I had taken all references to it out of the main index.js file.
That's when I realised that the error wasn't actually being generated from the code posted above: it was being thrown from code in that folder.
So I guess the answer is don't add stuff to the utils folder, but if you do, always add a bunch of console logging...
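For anyone debugging a similar crash: the stack trace above is an unhandled 'error' event on a write stream, so attaching an explicit handler (and creating the logs directory up front) makes the offending path show up immediately. A small sketch, not part of the original code:

const fs = require('fs');
const path = require('path');

const __logdir = path.join(__dirname, 'logs');

// Make sure the directory exists before opening streams in it
fs.mkdirSync(__logdir, { recursive: true });

const logsFile = fs.createWriteStream(path.join(__logdir, 'logEvents.log'), { flags: 'a' });

// Without this handler, a failed open crashes the process with "Unhandled 'error' event"
logsFile.on('error', (err) => {
  console.error('Could not write to %s: %s', __logdir, err.message);
});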

Creating a new file inside a path in nodejs

I have a path in the file system, say
/home/user/lannister/wines/red/knowthings
My Node.js code resides inside the lannister folder. What the code is supposed to do is create a blank file called debtslist.txt inside the knowthings folder, so that the file path becomes
/home/user/lannister/wines/red/knowthings/debtslist.txt
I tried several ways of doing it, including:
const filePath = 'wines/red/knowthings/debtslist.txt';
fs.openSync(filePath, 'w');
and
const filePath = 'wines/red/knowthings/debtslist.txt';
fs.writeFile(filePath, '', function (err) {
  if (err) throw err;
});
But every time I get this error:
Error: ENOENT: no such file or directory, open 'wines/red/knowthings/debtslist.txt'
Am I specifying the path in const filePath in the wrong way?
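For what it's worth, ENOENT here usually means either that the relative path is being resolved against the process's current working directory (process.cwd()) rather than the script's own folder, or that an intermediate directory doesn't exist. A sketch of a more defensive version, assuming the script lives in the lannister folder as described:

const fs = require('fs');
const path = require('path');

// Resolve relative to the location of this script, not the shell's cwd
const filePath = path.join(__dirname, 'wines/red/knowthings/debtslist.txt');

// Create any missing intermediate directories, then the empty file
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.writeFileSync(filePath, '');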

How to read a file on Meteor's backend?

For some reason, I need to modify my MongoDB data by brute force.
The expected data is in a file, and I need to update the MongoDB values with what I read out of that file. With Node.js, I wrote code like this:
const fs = require('fs');

fs.open('./f.csv', 'r', (err, fd) => {
  if (!err) {
    fs.readFile('./server/f.csv', 'utf8', (err, data) => { console.log(data); });
  }
});
But now it has difficulty finding the file; the execution throws an error:
{ Error: ENOENT: no such file or directory, open './f.csv' errno: -2, code: 'ENOENT', syscall: 'open', path: './f.csv' }
I have tried placing the file in Meteor's public folder and in the server folder (which is the Meteor backend), but to no avail. So how can I make the code find the file on Meteor's backend?
Any suggestion is welcome.
The easiest solution is to put the file in /private and access it using the Assets module:
https://docs.meteor.com/api/assets.html
Example: If you put the file in /private/f.csv
const data = Assets.getText('f.csv');
console.log(data)
// ... Do something with that data
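From there, parsing the CSV text and writing it into the database could look roughly like this (the collection name, column layout, and update logic below are invented for illustration):

// Hypothetical collection; replace with the one you actually need to update
const MyCollection = new Mongo.Collection('myCollection');

Meteor.startup(() => {
  // Assets.getText reads from the app's /private folder on the server
  const rows = Assets.getText('f.csv')
    .trim()
    .split('\n')
    .map(line => line.split(','));

  // Assumes two columns per row: key, value
  rows.forEach(([key, value]) => {
    MyCollection.update({ key }, { $set: { value } }, { upsert: true });
  });
});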

fswebcam: getting a dataURI via Node.js

I have fswebcam running on a Raspberry Pi. Using the command line, this saves JPG images.
I now want to receive these images in a Node.js application, and send them on to be used in a browser via dataURI.
On Node.js, I do:
var exec = require('child_process').exec;
exec("fswebcam -d /dev/video0 -r 160x120 --no-banner --save '-'", function(err, stdout, stderr) {
var imageBase64 = new Buffer(stdout).toString('base64');
I then send imageBase64 to the browser.
In the browser, setting the received data as the data URI fails:
image.src = "data:image/jpg;base64," + imageBase64;
Doing the above with a data URI created (via an online generator) from a JPG that fswebcam saved to disk works fine.
What am I not seeing here regarding formats and encodings?
The Content-Type should probably be image/jpeg and not image/jpg.
Also, new Buffer(stdout) is deprecated in favour of Buffer.from(), and it is only meaningful if stdout really is a Buffer; with exec's default options stdout is a UTF-8 string (see the answer below), so the binary data is already mangled before you convert it to base64.
Lastly, if it's the data itself that is bad, you can double-check your base64-encoded output with an online base64 decoder, or by writing stdout to disk and using the file command on it to ensure it's intact.
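A quick way to do that last check (a sketch, reusing the exec call from the question):

const { exec } = require('child_process');
const fs = require('fs');

exec("fswebcam -d /dev/video0 -r 160x120 --no-banner --save '-'", (err, stdout) => {
  if (err) throw err;
  // Dump the raw stdout to disk, then run `file fswebcam-debug.jpg` on it
  // to see whether it is still a valid JPEG.
  fs.writeFileSync('fswebcam-debug.jpg', stdout);
});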
A little late, but as I had the same problem while playing with fswebcam from Node: the correct way would be to either use spawn and listen to "data" events on the spawned child's stdout stream, or, if you use exec, pass the encoding as "buffer" or null, so that the stdout argument to the callback is again a Buffer instance; otherwise it is by default a UTF-8 encoded string, as stated in the exec docs.
const child_process = require('child_process');

child_process.exec("fswebcam -", { encoding: "buffer" }, (error, stdout, stderr) => {
  // stdout is a Buffer now
});
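The spawn variant mentioned above could look roughly like this (a sketch, reusing the fswebcam flags from the question): collect the "data" chunks as Buffers and concatenate them once the stream ends.

const { spawn } = require('child_process');

const cam = spawn('fswebcam', ['-d', '/dev/video0', '-r', '160x120', '--no-banner', '--save', '-']);

const chunks = [];
cam.stdout.on('data', (chunk) => chunks.push(chunk)); // each chunk is a Buffer

cam.on('close', () => {
  const imageBase64 = Buffer.concat(chunks).toString('base64');
  // send "data:image/jpeg;base64," + imageBase64 to the browser
});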
