How to write to a file using node - javascript

I'm trying to write out some HTML to a file as follows, but I keep getting an error:
var date = new Date(), year = date.getFullYear(), month = date.getMonth() + 1, day = date.getDate();
var folderName = year + '-' + month + '-' + day;
var path = "public/" + folderName;

fs.ensureDir(path)
  .then(() => { console.log('success!') })
  .catch(err => { console.error(err) });

path = path + '/' + date.getTime() + ".html";
let html = await page.evaluate(() => document.body.innerHTML);
require('fs').write(path, html, "w");
The error:
UnhandledPromiseRejectionWarning: TypeError: First argument must be
file descriptor
at Object.fs.write
How can I resolve this?

You should use fs.writeFile() instead of fs.write().
Note that if there is a file present with the same name it will be replaced with the new one. But given your naming convention this should never happen.

Looking at your example, you should use fs.writeFile:
const fs = require('fs');

fs.writeFile('YOUR_PATH', 'YOUR_HTML_TEXT', (err) => {
  if (err) {
    return console.error(err);
  }
  console.log('file created');
});
Just to add more info, the function you were trying to use,
fs.write(fd, string[, position[, encoding]], callback)
expects a file descriptor, which is an identifier returned, for example, by fs.open:
fs.open(path, flags[, mode], callback)
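If you really wanted to stay with fs.write, a minimal sketch of that approach (reusing the path and html variables from the question) would be to open the file first and write through the descriptor:

const fs = require('fs');

// Open the file to obtain a file descriptor; 'w' creates the file
// (or truncates it if it already exists), then write and close.
fs.open(path, 'w', (err, fd) => {
  if (err) {
    return console.error(err);
  }
  fs.write(fd, html, (err) => {
    if (err) {
      console.error(err);
    }
    fs.close(fd, () => {});
  });
});

In practice, fs.writeFile (or await fs.promises.writeFile(path, html)) does the open/write/close for you and is the simpler choice.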

Related

Node/Javascript loop synchronous

I have followed multiple posts and guides about using async/await, tried multiple synchronous request libraries, and tried promises, then blocks, callbacks, and regular loops as opposed to forEach loops, and I have run out of ideas.
This code calls the Zoom API to get a list of cloud recordings that I would like to download. Each user can have 100-200 and they are large files, so there is a limit to how many connections either my end or Zoom's end can handle without getting an understandable "RequestError: socket hang up".
Because of how the Zoom API works, you can only get results for one month at a time. So I am looping over users, then looping over months, then calling the individual download URLs to stream each recording to a file on my workstation.
All I want this code to do is process the recordings for a single user in a single month, and WAIT until they have all downloaded, before moving on to the next month and then eventually the next user.
Can anyone suggest how I might be able to accomplish that?
const fs = require("fs");
const path = require("path");
const http = require("follow-redirects").http;
const https = require("follow-redirects").https;
var syncrequest = require("sync-request");
const got = require("got");
const stream = require("stream");
const { promisify } = require("util");
(async function getFiles() {
var token = "TOKEN_HERE";
var zoomUserIDs;
var users = ["EMAILS_HERE","EMAILS_HERE"];
var dates = [
[9, 2020],
[10, 2020],
[11, 2020],
[12, 2020],
[1, 2021],
[2, 2021],
[3, 2021],
[4, 2021],
[5, 2021],
[6, 2021]
];
var path_to_save = "";
var folder_name = "";
//Loop over list of users
for (const user of users) {
console.log(user);
folder_name = "output/" + user;
if (!fs.existsSync(folder_name)) {
fs.mkdir(path.join("", folder_name), err => {
if (err) {
return console.error(err);
}
console.log(user + " Directory created successfully!");
});
}
//Loop over months
for (const date of dates) {
var url =
"https://api.zoom.us/v2/users/" +
user +
"/recordings?from=" +
date[1] +
"-" +
date[0] +
"-01&to=" +
date[1] +
"-" +
date[0] +
"-31&page_size=100";
//Call Zoom API
const res = await got.get(url, {
responseType: "json",
headers: {
Authorization: "Bearer" + token
}
});
//Get individual file url
var reponse = res.body;
if (reponse.meetings) {
for (const meeting of reponse.meetings) {
for (const recording of meeting.recording_files) {
if (
recording.recording_type == "shared_screen_with_speaker_view" ||
recording.recording_type == "shared_screen" ||
recording.recording_type == "active_speaker"
) {
var path_to_zoom_recording = recording.download_url + "?access_token=" + token;
//Dowload file
const pipeline = promisify(stream.pipeline);
path_to_save =
folder_name +
"/" +
meeting.topic.replaceAll("/", "_") +
"_" +
meeting.start_time.replaceAll(":", "-") +
".mp4";
if (!fs.existsSync(path_to_save)) {
(async () => {
console.log(
meeting.topic + " -- " + meeting.start_time + " -- START"
);
await pipeline(
got.stream(path_to_zoom_recording),
fs.createWriteStream(path_to_save)
).then(() =>
console.log(
meeting.topic + " -- " + meeting.start_time + " -- END"
)
);
})();
}
}
}
}
}
}
}
})();
I think if you get rid of the first and last line of this section you will be most of the way there
(async () => {
  console.log(
    meeting.topic + " -- " + meeting.start_time + " -- START"
  );
  await pipeline(
    got.stream(path_to_zoom_recording),
    fs.createWriteStream(path_to_save)
  ).then(() =>
    console.log(
      meeting.topic + " -- " + meeting.start_time + " -- END"
    )
  );
})();
The way you have it, your upper level isn't awaiting the pipeline.
Also, you have this issue near the top:
for (const user of users) {
  console.log(user);
  folder_name = "output/" + user;
  if (!fs.existsSync(folder_name)) {
    fs.mkdir(path.join("", folder_name), err => {
      if (err) {
        return console.error(err);
      }
      console.log(user + " Directory created successfully!");
    });
  }
The fs.mkdir call is also not awaited. You can wrap it in promisify and then await it too, just like you did further down with another function.
It seems odd to mix async/await with existsSync, but it probably won't hurt, since you only have one thing happening at a time. It would be nice to also promisify fs.exists and await that.
In newer Node.js there are pre-promisified versions of all these methods; you import them from 'fs/promises', I think: https://nodejs.org/api/fs.html#fs_promise_example
Also, for what it's worth, you only have to promisify a function once, so const pipeline = promisify(stream.pipeline); shouldn't really be inside the loop. Node.js probably doesn't mind doing it every iteration, though, so that is not the issue you are looking for.
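Putting those suggestions together, here is a minimal sketch of the inner download step, assuming the same meeting, recording, token and folder_name values as in the question (downloadRecording is just an illustrative helper name, not from the original code). It awaits both mkdir and pipeline directly instead of wrapping them in a fire-and-forget IIFE:

const fs = require("fs");
const fsp = require("fs/promises");
const stream = require("stream");
const got = require("got");
const { promisify } = require("util");

const pipeline = promisify(stream.pipeline); // promisify once, outside the loops

async function downloadRecording(meeting, recording, token, folder_name) {
  // recursive: true makes this a no-op if the folder already exists
  await fsp.mkdir(folder_name, { recursive: true });

  const path_to_save =
    folder_name + "/" +
    meeting.topic.replaceAll("/", "_") + "_" +
    meeting.start_time.replaceAll(":", "-") + ".mp4";

  console.log(meeting.topic + " -- " + meeting.start_time + " -- START");
  // awaiting here means the outer for...of loops pause until this file is done
  await pipeline(
    got.stream(recording.download_url + "?access_token=" + token),
    fs.createWriteStream(path_to_save)
  );
  console.log(meeting.topic + " -- " + meeting.start_time + " -- END");
}

Called with await downloadRecording(...) from inside the month loop, each file finishes before the next one starts.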

Wait for fetch response to continue in for loop. Javascript Nodejs

I have a function that connects to a SOAP web service. Unfortunately the web service only supports a very limited number of connections. I have an array of items to search for in the web service; if I use a for or forEach loop, about 70% of the cases complete with no error, but in the other 30% the web service responds with an error. This happens when the maximum number of connections is exceeded, because the loop does not wait for the web service's response and keeps opening new connections.
Here's my code:
var promiseArray = [];
for (var i = 0; i < result.length; i++) {
  let m = result[i].id
  let xml = '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:tem="http://tempuri.org/">' +
    '<soapenv:Header/>' +
    '<soapenv:Body>' +
    '<tem:EjecutarConsultaXML>' +
    '<!--Optional:-->' +
    '<tem:pvstrxmlParametros>' +
    '<![CDATA[' +
    '<Consulta><NombreConexion>USERNAME</NombreConexion>' +
    '<IdConsulta>QUERY</IdConsulta>' +
    '<Parametros>' +
    '<doc>' + m + '</doc>' +
    '</Parametros>' +
    '</Consulta>' +
    ']]>' +
    '</tem:pvstrxmlParametros>' +
    '</tem:EjecutarConsultaXML>' +
    '</soapenv:Body>' +
    '</soapenv:Envelope>';
  const options = {
    explicitArray: true
  };
  promiseArray.push(new Promise(async (resolve, reject) => {
    await axios.post(url, xml, {
        headers: {
          'Content-Type': 'text/xml;charset=UTF-8'
        }
      })
      .then((data) => {
        xml2js.parseString(data.data, options, (err, result) => {
          var temp = (result['soap:Envelope']['soap:Body'][0]['EjecutarConsultaXMLResponse'][0]['EjecutarConsultaXMLResult'][0]['diffgr:diffgram'][0]['NewDataSet'][0]['Resultado'])
          resolve({
            doc: m,
            state: temp[0].f430_ind_estado[0]
          })
        });
      })
      .catch((err) => {
        console.log(err)
      });
  }))
}
res.send(await Promise.all(promiseArray))
There are several issues with your code within the call to promiseArray.push():
There is no need to create a new Promise() since axios already provides one; this is actually an antipattern.
There is no need for async/await in that call for the same reason.
Mixing Promises and functions that use callbacks usually doesn't turn out too well.
You have no error checking in your code for the case where the XML parser fails.
The options object is not required, as explicitArray: true is the default.
Changes:
Removed all the extra/unneeded Promise code
Replaced xml2js.parseString with xml2js.parseStringPromise
Changed resolve to return
Since you're simply console.log()-ing the error, removed unnecessary boilerplate
Everything else is OK as written. Please let me know if I've missed something.
promiseArray.push(
  axios.post(url, xml, {
      headers: {
        'Content-Type': 'text/xml;charset=UTF-8'
      }
    })
    .then(data => data.data)
    .then(xml2js.parseStringPromise)
    .then(result => {
      var temp = result['soap:Envelope']['soap:Body'][0]['EjecutarConsultaXMLResponse'][0]['EjecutarConsultaXMLResult'][0]['diffgr:diffgram'][0]['NewDataSet'][0]['Resultado'];
      return {
        doc: m,
        state: temp[0].f430_ind_estado[0]
      };
    })
    .catch(console.log)
);
Just do it one by one using async/await; this means you have to use parseStringPromise instead.
var response = [];
for (var i = 0; i < result.length; i++) {
  let m = result[i].id
  let xml = '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:tem="http://tempuri.org/">' +
    '<soapenv:Header/>' +
    '<soapenv:Body>' +
    '<tem:EjecutarConsultaXML>' +
    '<!--Optional:-->' +
    '<tem:pvstrxmlParametros>' +
    '<![CDATA[' +
    '<Consulta><NombreConexion>USERNAME</NombreConexion>' +
    '<IdConsulta>QUERY</IdConsulta>' +
    '<Parametros>' +
    '<doc>' + m + '</doc>' +
    '</Parametros>' +
    '</Consulta>' +
    ']]>' +
    '</tem:pvstrxmlParametros>' +
    '</tem:EjecutarConsultaXML>' +
    '</soapenv:Body>' +
    '</soapenv:Envelope>';
  const options = {
    explicitArray: true
  };
  try {
    var { data } = await axios.post(url, xml, { // extract data from data.data
      headers: {
        'Content-Type': 'text/xml;charset=UTF-8'
      }
    })
    var xmlObject = await xml2js.parseStringPromise(data)
    var temp = (xmlObject['soap:Envelope']['soap:Body'][0]['EjecutarConsultaXMLResponse'][0]['EjecutarConsultaXMLResult'][0]['diffgr:diffgram'][0]['NewDataSet'][0]['Resultado'])
    response.push({
      doc: m,
      state: temp[0].f430_ind_estado[0]
    }) // push item to the response array
  } catch (error) {
    console.log(error);
  }
}
res.send(response) // send the collected response array to the client
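If strictly sequential requests turn out to be too slow, a middle ground is to send a small batch at a time and wait for each batch with Promise.all. This is only a sketch of that idea, assuming a buildPromise(item) helper that returns one of the axios/parseStringPromise chains shown above (the helper name and batch size are illustrative, not from the original code):

const batchSize = 5; // stay under the web service's connection limit (illustrative value)
var response = [];
for (let i = 0; i < result.length; i += batchSize) {
  // take the next slice of items and fire their requests together
  const batch = result.slice(i, i + batchSize).map(item => buildPromise(item));
  // wait for the whole batch to settle before starting the next one
  const results = await Promise.all(batch);
  response.push(...results);
}
res.send(response);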

Write javascript array elements to file

I am trying to have a Node.js script write some coordinates to a CSV file for use in a Newman CLI script. I have the following:
const axios = require('axios');
var move_decimal = require('move-decimal-point');

var sLat = 45.029830;
var sLon = -93.400891;
var eLat = 45.069523;
var eLon = -94.286001;
var arrLatLon = []

axios.get('http://router.project-osrm.org/route/v1/driving/' + sLon + ',' + sLat + ';' + eLon + ',' + eLat + '?steps=true')
  .then(function (response) {
    for (let i = 0; i < response.data.routes[0].legs.length; i++) {
      //console.log(response.data)
      for (let ii = 0; ii < response.data.routes[0].legs[i].steps.length; ii++) {
        //console.log('leg ' + i + " - step " + ii + ": " + response.data.routes[0].legs[i].steps[ii].maneuver.location[1] + "," + response.data.routes[0].legs[i].steps[ii].maneuver.location[0]);
        // Declaring Latitude as 'n' & Longitude as 'nn' for decimal calculations
        var n = response.data.routes[0].legs[i].steps[ii].maneuver.location[1]
        var nn = response.data.routes[0].legs[i].steps[ii].maneuver.location[0]
        // Latitude calculations to make 'lat' values API friendly
        var y = move_decimal(n, 6)
        var p = Math.trunc(y);
        // Longitude calculations to make 'lon' values API friendly
        var yy = move_decimal(nn, 6)
        var pp = Math.trunc(yy);
        arrLatLon.push(p + "," + pp);
      }
      console.log(arrLatLon)
    }
  })
I have been looking through and trying numerous tutorials and code snippets for writing the array elements from arrLatLon to an output file on my local machine, but none have been successful. The current code outputs the lat,lon values correctly; console.log(arrLatLon) outputs:
[ '45029830,-93400894',
'44982812,-93400740',
'44977444,-93400530',
'44973116,-93410884',
'44971101,-93450400',
'45035514,-93766885',
'45035610,-93766886',
'45081631,-94286752',
'45070849,-94282026' ]
any help would be greatly appreciated. Thanks.
With Node.js you can easily write files using the fs module:
const fs = require('fs');

fs.writeFile("/tmp/test", "Hey there!", function(err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});
In your case you can simply do something like:
const fs = require('fs');

// Convert your array into a string in which every value is
// separated by a newline character
const output = arrLatLon.join("\n");

// write the output to /tmp/test
fs.writeFile("/tmp/test", output, function(err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});
Let me refer you to this question for more information: Writing files in Node.js.
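One detail worth spelling out: since the coordinates are only available inside the axios .then() callback, the write has to happen there (after the loops), not at the top level of the script. A minimal sketch, reusing the arrLatLon array from the question; the out.csv path is just an illustrative choice:

const fs = require('fs');

axios.get(url) // the same OSRM request as above
  .then(function (response) {
    // ... the existing loops that fill arrLatLon ...
    fs.writeFile("out.csv", arrLatLon.join("\n"), function (err) {
      if (err) {
        return console.log(err);
      }
      console.log("The file was saved!");
    });
  });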

Node.js with Restler to return a value?

This is very early in my Node and JavaScript learning. Ideally, what I am attempting to do is create a small module that queries a specific type of REST endpoint and returns a specific feature based on an attribute query. The module is correctly logging out the result, but I am struggling to get the .findById function to return this result. Although I am aware it has something to do with how the callbacks are working, I am not experienced enough to sort it out yet. Any help, advice, and direction towards explaining the solution is greatly appreciated.
// import modules
var restler = require('restler');

// utility for padding zeros so the queries work
function padZeros(number, size) {
  var string = number + "";
  while (string.length < size) string = "0" + string;
  return string;
}

// create feature service object
var FeatureService = function (url, fields) {
  // save the parameters
  this.restEndpoint = url;
  this.fields = fields;
  var self = this;

  this.findById = function (idField, value, padZeroLength) {
    var options = {
      query: {
        where: idField + '=\'' + padZeros(value, padZeroLength) + '\'',
        outFields: this.fields,
        f: "pjson"
      },
      parsers: 'parsers.json'
    };
    var url = this.restEndpoint + '/query';
    restler.get(url, options).on('complete', function (result) {
      if (result instanceof Error) {
        console.log('Error:', result.message);
      } else {
        console.log(result); // this log result works
        self.feature = JSON.parse(result);
      }
    });
    return self.feature;
  };
};

var restEndpoint = 'http://services.arcgis.com/SgB3dZDkkUxpEHxu/ArcGIS/rest/services/aw_accesses_20140712b/FeatureServer/1';
var fields = 'nameRiver,nameSection,nameSectionCommon,difficulty,diffMax';
var putins = new FeatureService(restEndpoint, fields);
var feature = putins.findById('awid_string', 1143, 8);
console.log(feature); // this log result does not
//console.log('River: ' + feature.attributes.nameRiver);
//console.log('Section: ' + feature.attributes.nameSection + ' (' + feature.attributes.nameSectionCommon + ')');
//console.log('Difficulty: ' + feature.attributes.difficulty);
So, I sorted out how to insert a callback from a previous thread. It appears it is just passed in as an argument and called with the expected parameters. However, I now wonder if there is a better way to accept parameters, possibly in the form of an options object. Any advice in this regard?
// import modules
var restler = require('restler');

// utility for padding zeros so the queries work
function padZeros(number, size) {
  var string = number + "";
  while (string.length < size) string = "0" + string;
  return string;
}

// create feature service object
var FeatureService = function (url, fields) {
  // save the parameters
  this.restEndpoint = url;
  this.fields = fields;
  var self = this;

  // find and return single feature by a unique value
  this.findById = function (idField, value, padZeroLength, callback) {
    // query options
    var options = {
      query: {
        where: idField + '=\'' + padZeros(value, padZeroLength) + '\'',
        outFields: this.fields,
        f: "pjson"
      },
      parsers: 'parsers.json'
    };
    var url = this.restEndpoint + '/query';
    restler.get(url, options)
      .on('success', function (data, response) {
        var dataObj = JSON.parse(data).features[0];
        console.log(dataObj);
        callback(dataObj);
      })
      .on('fail', function (data, response) {
        console.log('Error:', data.message);
      });
    return self.feature;
  };
};

var restEndpoint = 'http://services.arcgis.com/SgB3dZDkkUxpEHxu/ArcGIS/rest/services/aw_accesses_20140712b/FeatureServer/1';
var fields = 'nameRiver,nameSection,nameSectionCommon,difficulty,diffMax';
var putins = new FeatureService(restEndpoint, fields);
putins.findById('awid_string', 1143, 8, function (dataObject) {
  console.log('River: ' + dataObject.attributes.nameRiver);
  console.log('Section: ' + dataObject.attributes.nameSection + ' (' + dataObject.attributes.nameSectionCommon + ')');
  console.log('Difficulty: ' + dataObject.attributes.difficulty);
});
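On the follow-up about accepting parameters: one common pattern is to pass a single options object plus a Node-style error-first callback, so that new parameters can be added later without changing the signature. A hedged sketch of what findById could look like in that style, inside the same FeatureService constructor (the option names are illustrative, not part of Restler or the ArcGIS API):

// Sketch only: findById taking an options object instead of positional arguments.
this.findById = function (opts, callback) {
  // opts = { idField: 'awid_string', value: 1143, padZeroLength: 8 }
  var options = {
    query: {
      where: opts.idField + '=\'' + padZeros(opts.value, opts.padZeroLength) + '\'',
      outFields: self.fields,
      f: 'pjson'
    }
  };
  restler.get(self.restEndpoint + '/query', options)
    .on('success', function (data) {
      callback(null, JSON.parse(data).features[0]);
    })
    .on('fail', function (data) {
      callback(new Error(data.message));
    });
};

// usage:
// putins.findById({ idField: 'awid_string', value: 1143, padZeroLength: 8 }, function (err, feature) { ... });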

Renaming files fails when 2 are added simultaneously

The script below works fantastically when monitoring a folder for new .ogg files. It successfully creates a folder named after the new file and then renames the file according to its created date.
However, issues arise when I add multiple files at the same time: the script attempts to create a folder that already exists, suggesting it is mixing up the two filenames somehow. Has anyone any suggestions as to what I might be doing incorrectly? I presume it's a simple code-structure problem, although I'm not able to work out why.
var baseDir = './',
  path = require('path'),
  fs = require('fs');

// watch the directory for new files
fs.watch(baseDir, function(event, file) {
  var ext = path.extname(file)
  basename = path.basename(file).substring(0, path.basename(file).length - ext.length);
  // check it wasnt a delete action
  fs.exists(baseDir + file, function(exists) {
    // check we have the right file type
    if (exists && ext === '.ogg') {
      // get the created date
      fs.stat(baseDir + file, function (err, stats) {
        if (err)
          throw err;
        var year = stats.ctime.getFullYear();
        var month = stats.ctime.getMonth() + 1;
        var day = stats.ctime.getDate();
        var hour = stats.ctime.getHours();
        var sec = stats.ctime.getSeconds();
        if (month < 10) {
          month = '0' + month;
        }
        if (day < 10) {
          day = '0' + day;
        }
        if (hour < 10) {
          hour = '0' + hour;
        }
        if (sec < 10) {
          sec = '0' + sec;
        }
        var name = year + '' + month + '' + day + '' + hour + '' + sec;
        // does the basename directory exist?
        fs.exists(baseDir + '/' + basename, function(exists) {
          // if the directory doesnt exist
          if (!exists) {
            // make the directory
            fs.mkdir(baseDir + '/' + basename, 0777, function (err, stats) {
              if (err)
                throw err;
              moveFile(file, basename, name, ext);
            });
          } else {
            moveFile(file, basename, name, ext);
          }
        });
      });
    }
  });
});

function moveFile(file, basename, name, ext) {
  // move the file to the new directory
  fs.rename(baseDir + file, baseDir + '/' + basename + '/' + name + ext, function (err) {
    if (err)
      throw err;
    // console.log('Rename complete');
  });
}
Ok, so I had a few extra minutes and decided to have a look for you. I refactored your code a little, but the basic structure should be easy to recognize.
var baseDir = './test',
  path = require('path'),
  fs = require('fs');

// watch the directory for new files
fs.watch(baseDir, function(event, file) {
  var ext = path.extname(file),
    basename = path.basename(file).substring(0, path.basename(file).length - ext.length);
  // check it wasnt a delete action
  // check we have the right file type
  var filePath = path.join(baseDir, file);
  if (fs.existsSync(filePath) && ext === '.ogg') {
    // get the created date
    var stats = fs.statSync(filePath);
    var name = getName(stats);
    // if the directory doesnt exist
    var baseDirPath = path.join(baseDir, basename);
    if (!fs.existsSync(baseDirPath)) {
      // make the directory
      fs.mkdirSync(baseDirPath, 0777);
    }
    moveFile(file, basename, name, ext);
  }
});

function getName(stats) {
  var year = stats.ctime.getFullYear();
  var month = stats.ctime.getMonth() + 1;
  var day = stats.ctime.getDate();
  var hour = stats.ctime.getHours();
  // need minutes!
  var minutes = stats.ctime.getMinutes();
  var sec = stats.ctime.getSeconds();
  if (month < 10) {
    month = '0' + month;
  }
  if (day < 10) {
    day = '0' + day;
  }
  if (hour < 10) {
    hour = '0' + hour;
  }
  if (minutes < 10) {
    minutes = '0' + minutes;
  }
  if (sec < 10) {
    sec = '0' + sec;
  }
  // missing the minute, previously
  return year + '' + month + '' + day + '' + hour + '' + minutes + '' + sec;
}

function moveFile(file, basename, name, ext) {
  // move the file to the new directory
  var src = path.join(baseDir, file),
    dest = path.join(baseDir, basename, name + ext);
  console.log("Moving ", src, "-", dest);
  fs.renameSync(src, dest);
}
Some tips/corrections:
Stick with the synchronous fs methods that end in Sync when working on simple scripts like this. While Node.js is famous for its asynchronous ability, that is a bit of a premature optimization IMO. If you need to embed this in a high-performance webserver, for instance, optimize at that point, not before.
You were missing a minutes variable when you created the new filename. That had a pretty good chance of causing a name collision, so I corrected it.
Try to use the path library (like path.join) more to your advantage, as manually joining strings for paths can often lead to brittle code.
There are still several edge cases where this can crash, for example creating a file without an extension whose name matches a directory that will be created for another file (files can't become directories, and you can't move a file inside another file). If you plan to go into a production environment, you will want to harden the code with at least a few unit tests.
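For what it's worth, on newer Node.js versions the exists-then-mkdir dance can be avoided entirely; this is just a sketch of that variation using the variables from the refactor above, not part of it:

// mkdirSync with recursive: true is a no-op when the directory already exists,
// so two files arriving at once can no longer race each other into an error.
fs.mkdirSync(path.join(baseDir, basename), { recursive: true });
moveFile(file, basename, name, ext);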
Cheers,
Dan
