I want to browse directories for JSON files and merge all the JSON files into one JSON file per directory.
My JavaScript skill is at a beginner level.
The JavaScript code needs to be run by calling it in the terminal, not in the browser.
You can make use of the glob npm package to retrieve all the JSON file paths in the directories.
glob: https://www.npmjs.com/package/glob
You can simply go with:
const glob = require("glob");
const fs = require('fs-extra');

async function processJsonData() {
  let parentDirectoryPath = '<specify path here>';
  let data = {};

  // this will return paths for all the json files one level below the parent directory
  let jsonFilePaths = glob.sync(`${parentDirectoryPath}/*/*.json`);

  for (const jsonFilePath of jsonFilePaths) {
    // read and parse each json file, merging its keys into data
    let content = await fs.readJson(jsonFilePath);
    Object.assign(data, content);
  }

  // here specify your output file path
  await fs.writeJson('<specify output file path here>', data, { spaces: 2 });
}
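To actually run this from the terminal, call the function at the end of the script and execute the file with Node (the file name merge-json.js here is just an example):

// call the function and surface any errors
processJsonData()
  .then(() => console.log('Merged JSON written.'))
  .catch(console.error);

Then run node merge-json.js in the terminal.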
PS: I am a new contributor; I might have misread the question.
My question is: is it possible to set things up so that, when I have a script and a text file, I don't need to specify the text file's location every time I want to open it? For example, when I send the project to someone else, they shouldn't need to change the let file = C:\Users\etc... line.
My code:
const { readFileSync } = require('fs');

let file = 'C:\\Users\\eeroj\\Desktop\\word-counter\\TextFile2.txt';

function countRepeatedWords(sentence) {
  const words = sentence.split(" ").filter(word => !!word);
  const wordMap = {};
  for (let word of words) {
    const key = word.trim().toLowerCase();
    const currentWordCount = wordMap[key];
    wordMap[key] = (currentWordCount ?? 0) + 1;
  }
  // sort the wordMap alphabetically by word and return it
  const sortedEntries = Object.entries(wordMap).sort(([a,], [b,]) => a.localeCompare(b));
  const sortedWordMap = Object.fromEntries(sortedEntries);
  return sortedWordMap;
}

const words = readFileSync(file).toString();
console.log(countRepeatedWords(words));
Yes you can, through the path module (built into Node.js) and the __dirname variable. __dirname automatically gives you the path of the folder containing the current script, and you can then use path.join to join that directory with the filename you want to access. Something like this:
const path = require("path");

// Assuming that the current script file is inside the 'C:\\Users\\eeroj\\Desktop\\word-counter' folder
const file = path.join(__dirname, "TextFile2.txt");
// file = 'C:\\Users\\eeroj\\Desktop\\word-counter\\TextFile2.txt'
If you put "TextFile2.txt" in a relative position not in the current folder, you can also use path.join with the folder traversal syntax such as .. to go back a folder
const path = require("path");

// Assuming that the current script file is inside the 'C:\\Users\\eeroj\\Desktop\\word-counter' folder
const file = path.join(__dirname, "../../", "TextFile2.txt");
// file = "C:\\Users\\eeroj\\TextFile2.txt"
As a result, this approach should work regardless of where you put the project, even on a different machine, as long as the project folder structure stays the same and "TextFile2.txt" is in the right place relative to the script file.
If your txt file is in the same directory as the script, you can simply use a relative path starting with '.', as in:
let file = "./TextFile2.txt"
Just note that the text file will always have to have the same name.
In a React project I am trying to list all the files in blob storage using a SAS token created for the given directory. My understanding is that I need to create a DataLakeFileSystemClient, but I only have a URL for the directory and a DataLakeDirectoryClient, and somehow need to create the DataLakeFileSystemClient from that.
The URL passed is something along the lines of: https://myaccount.dfs.core.windows.net/mycontainer/mydirectory{sastoken}
I have found a way to do this, although I don't know if it's the best way.
To get from the directory client to a file system client, I wrote a helper method:
import { DataLakeDirectoryClient, DataLakeFileSystemClient } from "@azure/storage-file-datalake";

const getFileSystemClient = (directoryClient: DataLakeDirectoryClient) => {
  // Replace the directory path in the URL with just the file system (container) name;
  // the SAS token stays in the query string.
  const url = new URL(directoryClient.url);
  url.pathname = directoryClient.fileSystemName;
  return new DataLakeFileSystemClient(url.toString());
};
To list the paths under the directory, I use the following code:

const fsClient = getFileSystemClient(directoryClient);

// list everything under the original directory by passing its name as the path filter
for await (const path of fsClient.listPaths({ path: directoryClient.name })) {
  console.log(path.name);
}
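For completeness, a rough sketch of how the directory client itself can be created from the SAS URL in the question and fed into the helper (the sasUrl value is a placeholder; a URL that already contains a SAS token can be passed straight to the constructor):

// e.g. https://myaccount.dfs.core.windows.net/mycontainer/mydirectory?<sastoken>
const sasUrl = "<directory url including sas token>";

const directoryClient = new DataLakeDirectoryClient(sasUrl);
const fsClient = getFileSystemClient(directoryClient);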
I have a JavaScript file, menu.js, under the menu directory in the project root. I want to rename menu.js to menuOLD.js in the same directory onClick.
I googled and found a small sample:
function renamefile() {
  const myFile = new File(['hello-world'], 'my-file.txt');
  const myRenamedFile = new File([myFile], 'my-file-final-1-really.txt');
  console.log(myRenamedFile);
}
I have checked its output in the console, and it's working.
But I need to rename exactly the menu.js file under the menu directory.
How should I do this?
With Node.js
const fs = require('fs');

fs.rename('menu.js', 'menuOLD.js', (err) => {
  if (err) console.log(err);
});
See the fs.rename documentation in the Node.js docs.
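Note that the File constructor used in the question only creates an in-memory object in the browser; it cannot rename a file on disk, so this has to run in Node.js. If the script is not started from the directory containing the menu folder, it may be safer to build the path explicitly. A small sketch, assuming the menu directory sits next to the script file:

const fs = require('fs');
const path = require('path');

// assumes the 'menu' directory is a sibling of this script file
const oldPath = path.join(__dirname, 'menu', 'menu.js');
const newPath = path.join(__dirname, 'menu', 'menuOLD.js');

fs.rename(oldPath, newPath, (err) => {
  if (err) console.log(err);
});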
I've been able to write a file to a branch in a bare repository using the below code, but it only works for files in the root. I haven't been able to find a good example in the documentation of how to build a tree for a subfolder and use that as a commit.
async function writeFile(filename, buffer) {
  const signature = NodeGit.Signature.now('Jamie', 'jamie#diffblue.com');
  const repo = await NodeGit.Repository.openBare('java-demo.git');
  const commit = await repo.getBranchCommit('master');
  const rootTree = await commit.getTree();
  const builder = await NodeGit.Treebuilder.create(repo, rootTree);
  const oid = await NodeGit.Blob.createFromBuffer(repo, buffer, buffer.length);

  await builder.insert(filename, oid, NodeGit.TreeEntry.FILEMODE.BLOB);
  const finalOid = await builder.write();

  await repo.createCommit('refs/heads/master', signature, signature, 'Commit message', finalOid, [commit]);
}

const buffer = new Buffer('Hello\n', 'utf-8');
writeFile('test.txt', buffer).then(() => console.log('Done'));
What modifications would be needed to post in (for example) src/test.txt, instead of test.txt?
The typical workflow for writing trees goes through the index. For example, git_index_add_frombuffer followed by git_index_write_tree. Even if you don't want to write to the repository's index on disk, you can still use the index interface by creating an in-memory index.
In a bare repository without an index, you can use git_index_new followed by git_index_read_tree to get an index initialized to the contents of your tree. Then write the tree out to the repository with git_index_write_tree_to.
I'm less familiar with the treebuilder interface, but it looks like you would have to create new subtrees recursively. For example, get or create the src subtree and insert the test.txt blob into it. Then get or create the root tree and insert the src subtree into it.
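A rough, untested sketch of that recursive treebuilder approach, reusing repo, commit, and buffer from the question's code and assuming no src entry exists yet in the root tree:

// build a 'src' subtree containing test.txt
const rootTree = await commit.getTree();
const srcBuilder = await NodeGit.Treebuilder.create(repo, null);
const blobOid = await NodeGit.Blob.createFromBuffer(repo, buffer, buffer.length);
await srcBuilder.insert('test.txt', blobOid, NodeGit.TreeEntry.FILEMODE.BLOB);
const srcTreeOid = await srcBuilder.write();

// build a new root tree based on the old one, with 'src' pointing at the new subtree
const rootBuilder = await NodeGit.Treebuilder.create(repo, rootTree);
await rootBuilder.insert('src', srcTreeOid, NodeGit.TreeEntry.FILEMODE.TREE);
const newRootOid = await rootBuilder.write();

// newRootOid can then be passed to repo.createCommit exactly as finalOid is in the original code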
I am attempting to zip the contents of two directories and download the resulting .zip file. One directory contains .txt files and the other .jpg. I am using archiver to zip the files and running the express framework on node js. I am inclined to think that the problem exists in the download step, as the resulting zipped file that is created in the project root expands as expected, however when the file is downloaded, I get an "Error 2 - No such file or directory."
app.get('/download', function(req, res){
  zipFile = new Date() + "-Backup.zip";
  var output = fs.createWriteStream(__dirname + "/backups/" + zipFile);
  var archive = archiver('zip');

  output.on('close', function() {
    console.log(archive.pointer() + ' total bytes');
    console.log('archiver has been finalized and the output file descriptor has closed.');
  });

  archive.on('error', function(err) {
    throw err;
  });

  archive.pipe(output);

  var files1 = fs.readdirSync(__dirname + '/posts');
  var files2 = fs.readdirSync(__dirname + '/uploads');

  for (var i = 0; i < files1.length; i++) {
    archive.append(fs.createReadStream(__dirname + "/posts/" + files1[i]), { name: files1[i] });
  }
  for (var i = 0; i < files2.length; i++) {
    archive.append(fs.createReadStream(__dirname + "/uploads/" + files2[i]), { name: files2[i] });
  }

  archive.finalize();
  res.download(__dirname + "/backups/" + zipFile, zipFile);
});
zipFile is a global variable.
The on 'close' logs fire properly and no errors occur, but the file will not open after being downloaded. Is there an issue with response headers or something else I am unaware of?
Thanks for the help.
I solved my own problem using node-zip as the archive utility.
var zip = require('node-zip')(); // require the node-zip utility
var fs = require('fs'); // fs is used to read the directories' contents

var zipName = "someArbitraryName.zip"; // the name of the zip file you want to create

// read the directory that you would like to zip
var someDir = fs.readdirSync(__dirname + "/nameOfDirectoryToZip");

// declare a folder with the same name as the directory you would like to zip
// (we'll put the read contents into this folder)
var newZipFolder = zip.folder('nameOfDirectoryToZip');

// append each file in the directory to the declared folder
for (var i = 0; i < someDir.length; i++) {
  newZipFolder.file(someDir[i], fs.readFileSync(__dirname + "/nameOfDirectoryToZip/" + someDir[i]), {base64: true});
}

// generate the zip file data
var data = zip.generate({base64: false, compression: 'DEFLATE'});

// write the data to file
fs.writeFile(__dirname + "/" + zipName, data, 'binary', function(err) {
  if (err) {
    console.log(err);
  }
  // do something with the new zipped file
});
Essentially, what is happening can be broken down into 3 steps:
Use fs to read the contents of a directory that you would like to zip.
Use zip.folder to declare the folder, then use zip.file to append files to that directory. I just used a for loop to iteratively add each file in the directory that was read in step 1.
Use zip.generate to create the .zip file data, and write it to file using fs.
The resulting file can be downloaded or whatever you would like to do with it. I have seen no issues using this method.
If you want to zip more than one directory, just repeat steps 1 and 2 before you zip.generate, creating a new zip.folder for each directory, as in the sketch below.
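For example, a minimal sketch of zipping two directories this way (the directory names posts and uploads are placeholders borrowed from the question):

var zip = require('node-zip')();
var fs = require('fs');

// repeat steps 1 and 2 for each directory you want in the archive
['posts', 'uploads'].forEach(function(dirName) {
  var entries = fs.readdirSync(__dirname + "/" + dirName); // step 1: read the directory
  var folder = zip.folder(dirName);                        // step 2: declare a matching folder
  entries.forEach(function(fileName) {
    folder.file(fileName, fs.readFileSync(__dirname + "/" + dirName + "/" + fileName), {base64: true});
  });
});

// step 3: generate once, after all folders have been added, and write it out
var data = zip.generate({base64: false, compression: 'DEFLATE'});
fs.writeFileSync(__dirname + "/backup.zip", data, 'binary');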
Just use

archive.on("finish", function() {
  res.download(__dirname + "/backups/" + zipFile);
});

so that res.download only runs once the archive has finished being written. In the original code res.download is called right after archive.finalize(), while the zip is still being written, which is why the downloaded file will not open.