I have some markdown files inside a /markdown folder and I am trying to read the content of these files. I can see the file names inside the array, but when I try to read them, I get back no data and no error. What needs to be done here?
app.get("/", async(req, res) => {
const mdPath = "...path"
const data = await fs.readdirSync(mdPath);
console.log(data) // Return Array of files
for (let i = 0; i <= data.length; i++) {
const fileContent = fs.readFileSync(i, "utf-8");
return fileContent;
}
})
You should use the path module (for example path.join()) to handle the filesystem side more reliably.
This could work for your case:
const fs = require('fs') // load the nodejs fs lib
const path = require('path') // load the nodejs path lib

const mdPath = 'md' // name of the local dir

const data = fs.readdirSync(path.join(__dirname, mdPath)) // join the paths and let fs read the dir
console.log('file names', data) // Return Array of files

for (let i = 0; i < data.length; i++) {
    console.log('file name:', data[i]) // we get each file name
    const fileContent = fs.readFileSync(path.join(__dirname, mdPath, data[i]), 'utf-8') // join dir name, md folder path and file name, and read the content
    console.log('content:\n' + fileContent) // log its content
}
I created a folder ./md, containing the files one.md, two.md, three.md. The code above logs their content just fine.
>> node .\foo.js
file names [ 'one.md', 'three.md', 'two.md' ]
file name: one.md
content:
# one
file name: three.md
content:
# three
file name: two.md
content:
# two
Note that there is no error handling for anything that could go wrong with reading files.
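If you do want error handling and non-blocking reads, a minimal sketch using fs.promises (same ./md folder layout assumed) could look like this:

const { promises: fs } = require('fs')
const path = require('path')

const mdPath = 'md' // name of the local dir, as above

async function readAllMarkdown() {
    const dir = path.join(__dirname, mdPath)
    try {
        const files = await fs.readdir(dir)
        for (const file of files) {
            const content = await fs.readFile(path.join(dir, file), 'utf-8')
            console.log('content of', file, ':\n' + content)
        }
    } catch (err) {
        // readdir or readFile failed (missing dir, permissions, ...)
        console.error('reading markdown failed:', err)
    }
}

readAllMarkdown()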
I'm currently trying to download an image from GitHub locally. Everything seems to work, the fetch goes through with a 200 OK response; however, I don't understand how to store the image itself:
const rawGitLink = "https://raw.githubusercontent.com/cardano-foundation/CIPs/master/CIP-0001/CIP_Flow.png"
const folder = "/Folder"
const imageName = "/Test"

const imageResponse = await axios.get(rawGitLink)

fs.writeFileSync(___dirname + folder + imageName, imageResponse, (err) => {
    //Error handling
})
Four problems had to be fixed:
The image name must include the .png extension in this case
The response must be requested in the correct format, as a buffer, for an image
You must write the response data, not the response object itself
__dirname only needs two underscores
const rawGitLink = "https://raw.githubusercontent.com/cardano-foundation/CIPs/master/CIP-0001/CIP_Flow.png"
const folder = "/Folder"
const imageName = "/Test.png"
const imageResponse = await axios.get(rawGitLink, { responseType: 'arraybuffer' });
fs.writeFileSync(__dirname + folder + imageName, imageResponse.data)
Axios returns a special object: https://github.com/axios/axios#response-schema
const { promises: fs } = require('fs') // you can use fs.promises instead of the sync API
let { data } = await axios.get(...)
await fs.writeFile(filename, data)
As @Leau said, you should include the extension in the filename.
Another suggestion is to use the path module to create the filename:
filename = path.join(__dirname, "/Folder", "Test.png")
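Putting the pieces together, a minimal untested sketch (the folder and file names are just examples) could look like this:

const axios = require('axios')
const { promises: fs } = require('fs')
const path = require('path')

const rawGitLink = "https://raw.githubusercontent.com/cardano-foundation/CIPs/master/CIP-0001/CIP_Flow.png"

async function downloadImage() {
    // request the binary body as a buffer
    const { data } = await axios.get(rawGitLink, { responseType: 'arraybuffer' })
    // make sure the target folder exists, then build the path with the .png extension
    const folder = path.join(__dirname, 'Folder')
    await fs.mkdir(folder, { recursive: true })
    await fs.writeFile(path.join(folder, 'Test.png'), data)
}

downloadImage().catch(console.error)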
I have an Azure function and a file called configAPI.json which are located in the same folder.
I want to read the latter with the following code, based on this post: How can i read a Json file with a Azure function-Node.js. The code isn't working at all: when I try to see if there's any content in the configAPI variable, I get undefined:
module.exports = async function (context, req) {
    const fs = require('fs');
    const path = context.executionContext.functionDirectory + '//configAPI.json';
    configAPI = fs.readFile(path, 'utf-8', function (err, data) {
        if (err) {
            context.log(err);
        }
        var result = JSON.parse(data);
        return result
    });
    for (let file_index = 0; file_index < configAPI.length; file_index++) {
        // do something
    }
    context.log(configAPI);
}
What am I missing in the code to make sure I can read the file and use it in a variable in my loop?
functionDirectory gives you the path to your functions app, and under that you have your single function's folder.
I think you should do:
const path = context.executionContext.functionDirectory + '\\configAPI.json';
In case you want to parse your JSON file, you should have:
const file = JSON.parse(fs.readFileSync(context.executionContext.functionDirectory + '\\configAPI.json'));
PS: context also has a functionName variable, so another option to experiment with would be:
const path = context.executionContext.functionDirectory + '\\' + context.executionContext.functionName + '\\configAPI.json';
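Putting that together, a minimal sketch of the whole function (untested, and assuming configAPI.json holds an array) could be:

const fs = require('fs');
const path = require('path');

module.exports = async function (context, req) {
    const filePath = path.join(context.executionContext.functionDirectory, 'configAPI.json');
    let configAPI;
    try {
        // readFileSync returns the content directly, so configAPI is defined right here
        configAPI = JSON.parse(fs.readFileSync(filePath, 'utf-8'));
    } catch (err) {
        context.log(err);
        return;
    }
    for (let file_index = 0; file_index < configAPI.length; file_index++) {
        // do something
    }
    context.log(configAPI);
};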
I am trying to parse a YAML file. I was able to parse the file properly, but the comments in the YAML file are not getting read. Is there any way to do it? Attaching the parser code (which reads config.json) below for reference.
var fs = require('fs');
var path = require('path');
var yaml = require('js-yaml')

var fname = "config.json"
var jPath = path.join(__dirname, "..", "ConfigGen", "Config", fname);
var jsString = fs.readFileSync(jPath, 'utf8')

// Get path for files from Config file
var tType = "cto" // Get this from input
var pth = JSON.parse(jsString)[tType] // perform error handling
var cType = "jbod" // Get this from input

// Use that path
fs.readdir(pth, function (err, files) {
    files.forEach(function (file) {
        fName = cType + "_" + tType + "_uut.yaml-example";
        if (file == fName) {
            var flContent = fs.readFileSync(path.join(pth, file), "utf8")
            // return path.join from here and use the next part in a separate function
            var data = yaml.safeLoad(flContent)[0][0]
            console.log(data)
            for (var index in data) {
                var prefix = index
                for (idx in data[index]) {
                    //console.log(prefix, idx, data[prefix][idx])
                }
            }
        }
    })
})
Reiterating flyx's comment, according to the YAML spec on comments:
Comments are a presentation detail and must not be used to convey content information.
So, assuming you're not going to be able to correlate the comments to any adjacent fields, you can just read the whole file as a string and match any characters after a #.
You can read the file and extract the comments with a regex like this:
var { promises: fs } = require('fs');

(async () => {
    let path = "./config.yaml"
    let file = await fs.readFile(path, "utf8")
    let matches = file.matchAll(/#.*/g)
    let comments = [...matches].map(m => m[0])
    console.log(comments)
})()
If you have a yaml file that looks like this:
# block comment
env: dev
prop: value # inline comment
It will log the following:
[ '# block comment', '# inline comment' ]
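Note that a bare regex will also pick up a # inside a quoted string value, so treat this as a rough heuristic. If you additionally need to know where each comment sits, a small variation on the same idea (same config.yaml assumed) can record line numbers:

var { promises: fs } = require('fs');

(async () => {
    let file = await fs.readFile("./config.yaml", "utf8")
    file.split('\n').forEach((line, i) => {
        let match = line.match(/#.*/)
        if (match) {
            // report the 1-based line number together with the comment text
            console.log(i + 1, match[0])
        }
    })
})()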
I have a directory full of txt files containing JSON content. I would like to read the whole directory and rename the files according to the value of the JSON tag label.
I know how to read a single file using the below code but how do you read a whole directory?
function readTextFile(file) {
    var rawFile = new XMLHttpRequest();
    rawFile.open("GET", file, false);
    rawFile.onreadystatechange = function () {
        if (rawFile.readyState === 4) {
            if (rawFile.status === 200 || rawFile.status == 0) {
                var allText = rawFile.responseText;
                alert(allText);
            }
        }
    }
    rawFile.send(null);
}
This code gives you the list of files in your folder:
var fs = require('fs');
var files = fs.readdirSync('/assets/photos/');
Then you can iterate over this list and apply your code to each file.
Using the node filesystem (fs) module you can do what you want assuming it's all locally accessible and you have permissions. Here's a way it could work:
const fs = require("fs");
const dir = "/path/to/the/directory";
// get the directory contents
const files = fs.readdirSync(dir);
for (const file of files) {
// for each make sure it's a file (not a subdirectory)
const stat = fs.statSync(file);
if (stat.isFile()) {
// read in the file and parse it as JSON
const rawdata = fs.readFileSync(file);
try {
const json = JSON.parse(rawdata);
if (json.label) {
// build the new filename using 'label'
const newfile = `${dir}/${label}.json`;
fs.renameSync(file, newfile)
}
}
catch (err) {
console.log(`Error working with ${file}. Err: ${err}`);
}
}
}
That's the idea. Additional error checking can be done for safety, like making sure the new filename doesn't already exist, as sketched below.
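For example, a minimal pre-check (a hypothetical addition just before the renameSync call above) could be:

// skip the rename if the target name is already taken
if (fs.existsSync(newfile)) {
    console.log(`Skipping ${file}: ${newfile} already exists`);
} else {
    fs.renameSync(fullpath, newfile);
}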
I have an API method that, when called and passed an array of file keys, downloads them from S3. I'd like to stream the files rather than download them to disk, then zip them and return the archive to the client.
This is what my current code looks like:
reports.get('/xxx/:filenames ', async (req, res) => {
    var AWS = require('aws-sdk');
    var s3 = new AWS.S3();
    var str_array = filenames.split(',');
    for (var i = 0; i < str_array.length; i++) {
        var filename = str_array[i].trim();
        localFileName = './' + filename;
        var params = {
            Bucket: config.reportBucket,
            Key: filename
        }
        s3.getObject(params, (err, data) => {
            if (err) console.error(err)
            var file = require('fs').createWriteStream(localFileName);
            s3.getObject(params).createReadStream().pipe(file);
            console.log(file);
        })
    }
});
How would I stream the files rather than downloading them to disk and how would I zip them to return that to the client?
The main problem is zipping multiple files.
More specifically, downloading them from AWS S3 in bulk.
I've searched through the AWS SDK and didn't find bulk S3 operations.
Which brings us to one possible solution:
Load the files one by one and store them in a folder
Zip the folder (with a package like zip-folder)
Send the zipped folder
This is a raw and untested example, but it might give you the idea:
// Always import packages at the beginning of the file.
const AWS = require('aws-sdk');
const fs = require('fs');
const zipFolder = require('zip-folder');

const s3 = new AWS.S3();

reports.get('/xxx/:filenames', async (req, res) => {
    const filesArray = req.params.filenames.split(',');
    for (const fileName of filesArray) {
        const key = fileName.trim();
        const params = {
            Bucket: config.reportBucket,
            Key: key
        }
        // Wrap the stream in a Promise, so we know when the write has finished.
        await new Promise((resolve, reject) => {
            const fileStream = fs.createWriteStream('./' + key);
            s3.getObject(params).createReadStream()
                .pipe(fileStream)
                .on('finish', resolve)
                .on('error', reject);
        });
    }
    // After that all required files are in some target folder.
    // Now you need to compress the folder and send it back to the user.
    // We wrap the callback in a promise, to make the code look "sync".
    await new Promise((resolve, reject) =>
        zipFolder('/path/to/the/folder', '/path/to/archive.zip', (err) => err ? reject(err) : resolve()));
    // And now you can send the zipped folder to the user (also using streams).
    fs.createReadStream('/path/to/archive.zip').pipe(res);
});
Attention: you could run into problems with async behaviour due to the nature of streams, so please check first of all that every file has been stored in the folder before zipping.
Just a mention: I've not tested this code, so if any questions appear, let's debug together.
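For what it's worth, if you want to avoid the disk entirely, as the question asks, here is an alternative sketch (also untested, assuming the archiver package and an Express-style res) that streams each S3 object straight into a zip piped to the response:

const AWS = require('aws-sdk');
const archiver = require('archiver');

const s3 = new AWS.S3();

reports.get('/xxx/:filenames', async (req, res) => {
    const filenames = req.params.filenames.split(',');
    // name the download and pipe the zip output straight to the HTTP response
    res.attachment('reports.zip');
    const archive = archiver('zip');
    archive.on('error', (err) => res.status(500).end(err.message));
    archive.pipe(res);
    for (const name of filenames) {
        const key = name.trim();
        // append each S3 read stream under its original key name
        archive.append(
            s3.getObject({ Bucket: config.reportBucket, Key: key }).createReadStream(),
            { name: key }
        );
    }
    // finalize() tells archiver that no more entries will be added
    archive.finalize();
});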