I am trying to parse JSON files and get some errors. In a folder called "recipes", in the same directory as my JS file that uses fs, I have 3 JSON files, each representing a separate object.
Here is the JSON; all 3 files are similar:
{
  "ingredients": [
    {"name":"Crab","unit":"Tsp","amount":3},
    {"name":"Peas","unit":"Cup","amount":12},
    {"name":"Basil","unit":"Tbsp","amount":10},
    {"name":"Cumin","unit":"Liter","amount":3},
    {"name":"Salt","unit":"Tbsp","amount":1}
  ],
  "name":"Boiled Crab with Peas",
  "preptime":"13",
  "cooktime":"78",
  "description":"A boring recipe using Crab and Peas",
  "id":"b0e347d5-9428-48e5-a277-2ec114fc05a0"
}
My code is below; it throws an "Unexpected token in JSON at position 1" error:
fs.readdirSync("./recipes").forEach(file => {
  //let rec = JSON.parse(file);
  console.log(JSON.parse(file))
})
readdirSync returns file names as strings (or Buffers, or Dirent objects, depending on the options) — none of which is the file contents. The custom readFiles below is what you need.
const fs = require('fs')
const path = require('path')

const ROOT_DIR = './recipes'

const readFiles = (dir, cb) => {
  try {
    fs.readdirSync(dir).forEach(file => {
      fs.readFile(path.join(dir, file), 'utf-8', cb)
    })
  } catch (e) {
    console.log(`Failed to open the directory ${e.path}`)
  }
}

readFiles(ROOT_DIR, (err, data) => {
  if (err) {
    console.log(`Failed to read file: ${err.path}`)
    return // don't try to parse when the read failed
  }
  console.log(JSON.parse(data))
})
You can try the solution below. Note that `file` from readdirSync is only the file name, so it has to be joined with the directory before being passed to readFile.
const path = require('path');

fs.readdirSync("./recipes").forEach(file => {
  // `file` is just the name; join it with the directory to get a usable path
  fs.readFile(path.join("./recipes", file), 'utf8', (err, data) => {
    if (err) {
      console.log(`Error reading file from disk: ${err}`);
    } else {
      // parse JSON string to JSON object
      console.log(JSON.parse(data));
    }
  });
});
I'm trying to run a script that creates a model file in JSON using fs.writeFile. I run the script with node file.js. It is supposed to create a new file face-expression-model.json in the directory /models, but it doesn't create anything and doesn't show any errors.
I tried another library, fs-extra, which didn't work either. I tried making the script create the models directory first with fs.mkdir, with no luck, and I prefixed the path with process.cwd() to rule out any permission issues when creating the file, but that didn't work. I also added a try/catch block to catch all errors, but it shows nothing; for a moment it seemed the file had been created, but unfortunately it had not.
Here is the code I'm using.
const axios = require("axios");
const faceapi = require("face-api.js");
const { FaceExpressions } = faceapi.nets;
const fs = require("fs");

async function trainModel(imageUrls) {
  try {
    await FaceExpressions.loadFromUri(process.cwd() + "/models");

    const imageTensors = [];
    for (let i = 0; i < imageUrls.length; i++) {
      const response = await axios.get(imageUrls[i], {
        responseType: "arraybuffer"
      });
      const image = new faceapi.Image();
      image.constructor.loadFromBytes(new Uint8Array(response.data));
      const imageTensor = faceapi.resizeImageToBase64Tensor(image);
      imageTensors.push(imageTensor);
    }

    const model = await faceapi.trainFaceExpressions(imageTensors);
    fs.writeFileSync("./models/face-expression-model.json", JSON.stringify(model), (err) => {
      if (err) throw err;
      console.log("The file has been saved!");
    });
  } catch (error) {
    console.error(error);
  }
}

const imageUrls = [
  /* array of image URLs here */
];

trainModel(imageUrls);
I don't know exactly why, but I had the same problem a while ago. Try the asynchronous fs.writeFile method instead; it worked for me. (Note that fs.writeFileSync does not accept a callback, so the error handler in your code is never called.)
fs.writeFile("models/face-expression-model.json", JSON.stringify(model), {}, (err) => {
  if (err) throw err;
  console.log("The file has been saved!");
});
Good luck with that!
I wrote code that reads all the files in a folder and stores them in array form. My code looks like this:
readAll.js
const fs = require("fs");
const path = require("path");

module.exports = readAllFile = () => {
  const arr = [];
  fs.readdir(path.join("./admin/requiredFiles"), (err, fileNames) => {
    if (err) throw console.log(err.message);
    // Loop over the fileNames array
    fileNames.forEach((filename) => {
      // Read each file's content
      fs.readFile(
        path.join("./admin/requiredFiles", `./${filename}`),
        (err, data) => {
          if (err) throw console.log(err.message);
          // Collect the parsed content and rewrite the output file
          arr.push(JSON.parse(data));
          fs.writeFileSync(
            path.join("./admin/execuetedFile", `config.json`),
            `${JSON.stringify(arr)}`
          );
        }
      );
    });
  });
};
This reads all the files present in admin/requiredFiles and saves them into executedFile. The problem is that it stores the data in array form, and I want to store the data in object form.
Suppose this is the data in a few of my files:
file1.json
{
"admin":{
"right":"yes",
"permission":"available"
},
"admin_power":{
"addUser":"available",
"deleteUser":"available"
}
}
file2.json
{
"directory":{
"right":"yes",
"permission":"modified"
},
"directory_power":{
"add_directory":"yes",
"assign_directory":"yes"
}
}
These are some of my sample files, and the code saves them in this format:
config.json
[
{
"admin":{
"right":"yes",
"permission":"available"
},
"admin_power":{
"addUser":"available",
"deleteUser":"available"
}
},
{
"directory":{
"right":"yes",
"permission":"modified"
},
"directory_power":{
"add_directory":"yes",
"assign_directory":"yes"
}
}
]
I don't want this array form; I want the copied file to look like this:
expected config.json
{
"admin":{
"right":"yes",
"permission":"available"
},
"admin_power":{
"addUser":"available",
"deleteUser":"available"
},
"directory":{
"right":"yes",
"permission":"modified"
},
"directory_power":{
"add_directory":"yes",
"assign_directory":"yes"
}
}
I just want to know what changes I should make to get the output in my expected format in config.json.
You can build up a single object with Object.keys() instead of Array.push().
The solution is, when you read each file's data, to use the Object.keys method, loop through all the available keys, and add each one to your output object.
const fs = require("fs");
const path = require("path");

module.exports = readAllFile = () => {
  const output = {};
  fs.readdir(path.join("./admin/requiredFiles"), (err, fileNames) => {
    if (err) throw console.log(err.message);
    // Loop over the fileNames array
    fileNames.forEach((filename) => {
      // Read each file's content
      fs.readFile(
        path.join("./admin/requiredFiles", `./${filename}`),
        (err, data) => {
          if (err) throw console.log(err.message);
          // Copy every top-level key onto the output object
          const parsedData = JSON.parse(data);
          for (const key of Object.keys(parsedData)) {
            output[key] = parsedData[key];
          }
          fs.writeFileSync(
            path.join("./admin/execuetedFile", `config.json`),
            `${JSON.stringify(output)}`
          );
        }
      );
    });
  });
};
Use an object to collect the objects in the files; Object.assign() will merge the parsed data...
// use an object (probably should rename)
let arr = {};
// rather than push, assign
Object.assign(arr, JSON.parse(data));
Again, it is probably better to rename arr to something like result. Remember to refer to the new variable name in the fs.writeFile call.
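For illustration, here is a standalone sketch of what Object.assign does with the two sample files above (their contents inlined as object literals):

```javascript
// Object.assign copies the enumerable own properties of each source
// onto the target, so the top-level keys of every file end up merged
// into one object -- exactly the expected config.json shape.
const file1 = {
  admin: { right: 'yes', permission: 'available' },
  admin_power: { addUser: 'available', deleteUser: 'available' }
};
const file2 = {
  directory: { right: 'yes', permission: 'modified' },
  directory_power: { add_directory: 'yes', assign_directory: 'yes' }
};

const result = {};
Object.assign(result, file1);
Object.assign(result, file2);
```

One caveat: if two files share a top-level key, the later file silently overwrites the earlier one.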
I am trying to create a zip archive and save it to a particular folder, zip_folder, created alongside my project folder. This happens when I call an API. I can't get it to work with a relative path, but if I use __dirname it works properly, although it creates the zip in the root folder itself. Can anyone help me out with a solution? Thank you.
const fs = require('fs');
const archiver = require('archiver');

var file1 = '../zip_folder/scorm.zip';
var onlyPath = require('path').dirname('C:\\Users\\is9115\\Desktop\\node_moodle');
const mysql = require('../shared/connection');

// create a file to stream archive data to.
const archive = archiver('zip', {
  zlib: { level: 9 } // Sets the compression level.
});

async function createzip() {
  const output = fs.createWriteStream(file1); // this is not working at the file location
  // const output = fs.createWriteStream(__dirname + '/scorm.zip'); // working, but creates the zip in the root folder itself

  fs.readFile('imsmanifest.xml', 'utf-8', async function (err, data) {
    if (err) throw err;
    var newValue = data.replace(/Var_Title/g, 'Golf');
    fs.writeFile('imsmanifest1.xml', newValue, 'utf-8', function (err, data) {
      if (err) throw err;
      console.log('Done!');
    });
  });

  archive.pipe(output);
  const file2 = __dirname + '/imsmanifest1.xml';
  archive.append(fs.createReadStream(file2), { name: 'imsmanifest.xml' });
  archive.append('string cheese!', { name: 'file2.txt' });
  archive.directory('scorm12schemadefinition/', false);
  archive.file('imsmainfest1.xml', { name: 'imsmanifest.xml' });
  archive.finalize();
}
I want to check whether a ZIP file is corrupted in Node.js, using as little CPU and memory as possible.
How I corrupt a ZIP file for testing:
Download a ZIP file
Open the ZIP file in a text editor that can handle binary content, such as Notepad++
Rewrite the header, putting in only random characters
I am trying to reach this goal using the npm library node-stream-zip:
private async assertZipFileIntegrity(path: string) {
  try {
    const zip = new StreamZip.async({ file: path });
    const stm = await zip.stream(path);
    stm.pipe(process.stdout);
    stm.on('end', () => zip.close());
  } catch (error) {
    throw new Error();
  }
}
However, when I run the unit tests I receive an error inside an array:
Rejected to value: [Error]
import zip from 'yauzl';
import path from 'path';

const invalidZipPath = path.resolve('invalid.zip');
const validZipPath = path.resolve('valid.zip');

// yauzl is callback-based, so wrap it in a Promise; a return statement
// inside the callback would not propagate to the caller.
const isValidZipFile = (filePath) =>
  new Promise((resolve) => {
    zip.open(filePath, { lazyEntries: true }, (err, zipfile) => {
      if (err) {
        console.log('fail to read ', filePath);
        return resolve(false);
      }
      console.log('success read ', filePath);
      zipfile.close();
      resolve(true);
    });
  });

isValidZipFile(validZipPath);
isValidZipFile(invalidZipPath);
Read streams for Firebase Storage:
I have files in my Google Firebase storage for which I want to create a read stream (using JavaScript/Node.js). (I then intend to pipe this read stream to some middleware and then to a write stream, but this is unimportant for the question.) The code snippet shows what I'm doing, but when I print the readStream to the console I get a DestroyableTransform object instead of a ReadableStream. I feel like my code is very similar to the documentation. Does anyone know what might be wrong?
const filePath = 'image.png';

const getReadStream = (filePath) => {
  let file;
  try {
    file = admin
      .storage()
      .bucket()
      .file(filePath);
  } catch (err) {
    console.log(err);
  }
  const readStream = file.createReadStream()
    .on('error', (err) => {
      throw err;
    });
  console.log(readStream);
  return readStream;
};
This is a possible answer.
const filePath = 'image.png';

const getReadStream = (filePath) => {
  let file;
  try {
    file = admin
      .storage()
      .bucket(filePath);
  } catch (err) {
    console.log(err);
  }
  const readStream = file.createReadStream()
    .on('error', (err) => {
      throw err;
    });
  console.log(readStream);
  return readStream;
};
That is, you should remove the inner .file(...) call.