Creating readStream for file in Firebase Storage - javascript

Read streams for Firebase Storage:
I have files in my Google Firebase Storage for which I want to create a read stream (using JavaScript/Node.js). (I then intend to pipe this read stream to some middleware and then a write stream, but that is unimportant for the question.) The code snippet shows what I'm doing, but when I print the readStream to the console I get a DestroyableTransform object instead of a ReadableStream. I feel like my code is very similar to the documentation. Does anyone know what might be wrong?
const filePath = 'image.png';

const getReadStream = (filePath) => {
  let file;
  try {
    file = admin
      .storage()
      .bucket()
      .file(filePath);
  } catch (err) {
    console.log(err);
  }

  const readStream = file.createReadStream()
    .on('error', (err) => {
      throw err;
    });

  console.log(readStream);
  return readStream;
};

This is a possible answer.
const filePath = 'image.png';

const getReadStream = (filePath) => {
  let file;
  try {
    file = admin
      .storage()
      .bucket(filePath);
  } catch (err) {
    console.log(err);
  }

  const readStream = file.createReadStream()
    .on('error', (err) => {
      throw err;
    });

  console.log(readStream);
  return readStream;
};
You should exclude the inner .file(filePath) call and pass the path to bucket() instead.
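For what it's worth, the DestroyableTransform you see is a Transform stream, and Transform streams are themselves readable, so the return value can be piped like any other read stream. A minimal sketch of consuming it, assuming the Admin SDK is already initialized and image.png exists in the default bucket (the local destination path is just illustrative):

const admin = require('firebase-admin');
const fs = require('fs');

// Grab a handle to the object in the default bucket.
const file = admin.storage().bucket().file('image.png');

file
  .createReadStream()
  .on('error', (err) => console.error('download failed:', err))
  // 'local-copy.png' is a placeholder destination.
  .pipe(fs.createWriteStream('local-copy.png'))
  .on('finish', () => console.log('download complete'));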

Related

Having problems with `fs.writeFile`: it doesn't create files

I'm trying to run a script that creates a model file in JSON using fs.writeFile. The problem is when I run the script using node file.js: it is supposed to create a new file face-expression-model.json in the /models directory, but it doesn't create anything and doesn't show any errors.
I tried another library, fs-extra, which didn't work either; tried to make the script create the models directory with fs.WriteDir, also without success; and tried adding process.cwd() to bypass any authorisation issue when creating the file, but that didn't work. I also added a try/catch block to catch all errors, but it doesn't show any errors, and for a moment it looked like the file had been created, but NOPE, unfortunately.
Here is the code I'm using.
const axios = require("axios");
const faceapi = require("face-api.js");
const { FaceExpressions } = faceapi.nets;
const fs = require("fs");

async function trainModel(imageUrls) {
  try {
    await FaceExpressions.loadFromUri(process.cwd() + "/models");

    const imageTensors = [];
    for (let i = 0; i < imageUrls.length; i++) {
      const response = await axios.get(imageUrls[i], {
        responseType: "arraybuffer"
      });
      const image = new faceapi.Image();
      image.constructor.loadFromBytes(new Uint8Array(response.data));
      const imageTensor = faceapi.resizeImageToBase64Tensor(image);
      imageTensors.push(imageTensor);
    }

    const model = await faceapi.trainFaceExpressions(imageTensors);

    fs.writeFileSync("./models/face-expression-model.json", JSON.stringify(model), (err) => {
      if (err) throw err;
      console.log("The file has been saved!");
    });
  } catch (error) {
    console.error(error);
  }
}

const imageUrls = [
  // array of image URLs here
];

trainModel(imageUrls);
I don't know exactly why, but I had the same problem a while ago. Try using the fs.writeFile method instead (note that your original code calls fs.writeFileSync, which does not take a callback). It worked for me.
fs.writeFile("models/face-expression-model.json", JSON.stringify(model), {}, (err) => {
  if (err) throw err;
  console.log("The file has been saved!");
});
Good luck with that!
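One more thing worth checking, as an assumption on my part rather than something shown in your output: fs.writeFile fails with ENOENT when the target directory does not exist, and that error only surfaces in the callback. A minimal sketch that creates the models directory first:

const fs = require("fs");
const path = require("path");

const model = { example: true }; // placeholder for the trained model object

// Hypothetical output location; adjust to your project layout.
const outDir = path.join(process.cwd(), "models");
const outFile = path.join(outDir, "face-expression-model.json");

// Create the directory if it is missing, then write the file.
fs.mkdirSync(outDir, { recursive: true });
fs.writeFile(outFile, JSON.stringify(model), (err) => {
  if (err) throw err;
  console.log("The file has been saved!");
});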

Downloading and sending pdf document in Node through API

I am new to Node. I want to download a PDF document from another URL when someone hits a POST request on the back end, change the name of the file, and send the file back to the original client, where the PDF will be downloaded.
NOTE: the file should not be saved on the server.
First there is a controller file which contains the following code:
try {
  const get_request: any = req.body;
  const result = await printLabels(get_request, res);
  res.contentType("application/pdf");
  res.status(200).send(result);
} catch (error) {
  const ret_data: errorResponse = await respondError(
    error, "Something Went Wrong.",
  );
  res.status(200).json(ret_data);
}
Then the function printLabels is defined as:
export const printLabels = async (request: any, response: any) => {
  try {
    const item_id = request.item_id;
    let doc = await fs.createReadStream(`some url with ${item_id}`);
    doc.pipe(fs.createWriteStream("Invoice_" + item_id + "_Labels.pdf"));
    return doc;
  } catch (error) {
    throw error;
  }
};
Using the above code, I am getting a "no such file found" error. Also, I don't have access to the front end, so is it possible to test the API for the PDF with Postman the way I am doing, or is my approach incorrect?
The following solution works for Express, but I'm not sure whether you're using an Express-like framework. If not, please specify which framework you're using.
First, you need to use sendFile instead of send:
try {
  const get_request: any = req.body;
  const result = await printLabels(get_request, res);
  res.contentType("application/pdf");
  res.status(200).sendFile(result);
} catch (error) {
  const ret_data: errorResponse = await respondError(
    error, "Something Went Wrong.",
  );
  res.status(200).json(ret_data);
}
Then, you are returning a read stream instead of a path to the file. Note that sendFile requires an absolute path:
const printLabels = async () => {
  try {
    let doc = await fs.createReadStream(path.join(__dirname, 'test.pdf'));
    doc.pipe(fs.createWriteStream("Invoice_test_Labels.pdf"));
    return path.join(__dirname, 'Invoice_test_Labels.pdf');
  } catch (error) {
    throw error;
  }
};
As for Postman, you can of course view the PDF in the response there or save it to a file.
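Since the question notes the file should not be saved on the server, another option is to stream the remote PDF straight through to the response and set the new filename via a Content-Disposition header. This is only a sketch: the route, upstream URL pattern, and filename are placeholders, and it assumes Express plus axios.

const express = require("express");
const axios = require("axios");

const app = express();
app.use(express.json());

// Hypothetical route; adjust names and the upstream URL to your setup.
app.post("/labels", async (req, res) => {
  try {
    const itemId = req.body.item_id;
    const upstream = await axios.get(`https://example.com/labels/${itemId}.pdf`, {
      responseType: "stream",
    });

    res.setHeader("Content-Type", "application/pdf");
    res.setHeader(
      "Content-Disposition",
      `attachment; filename="Invoice_${itemId}_Labels.pdf"`
    );

    // Pipe the upstream PDF directly to the client; nothing is written to disk.
    upstream.data.pipe(res);
  } catch (error) {
    res.status(500).json({ message: "Something Went Wrong." });
  }
});

app.listen(3000);

In Postman you would then send a POST request with a JSON body containing item_id and use "Send and Download" to save the returned PDF.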

How can I store data in a JSON file continuously for discord.js?

I want to take message information from the user and save it in a JSON file, with new data constantly added. With the following code, however, the data is constantly replaced.
I don't want to replace the data, I want to add to it.
This is my code:
const fs = require("fs");
const { Client, Intents } = require("discord.js");

const client = new Client({
  intents: [Intents.FLAGS.GUILDS, Intents.FLAGS.GUILD_MESSAGES],
});

const now = new Date();
const obj = {
  table: [],
};
let confirm = false;
const { badWords } = require("../badWordList.json");

client.on("message", async (message) => {
  if (message.author.bot) return;

  for (let i = 0; i < badWords.length; i++) {
    if (message.content.toLowerCase().includes(badWords[i].toLowerCase()))
      confirm = true;
  }

  if (confirm) {
    obj.table.push({
      name: message.author.username,
      date: `${now.getFullYear()}/${now.getMonth()}/${now.getDate()}`,
      message: message.channel.messages.channel.lastMessage.cleanContent,
    });

    fs.writeFile("myjsonfile.json", JSON.stringify(obj), function (err) {
      if (err) throw err;
      console.log("complete");
    });
  }
});
When using fs.writeFile(), it replaces the content of the file, as stated in the docs.
At first glance, you might want to use fs.write() instead; see the docs for usage.
But the Node.js docs say:
It is unsafe to use fs.write() multiple times on the same file without waiting for the callback. For this scenario, fs.createWriteStream() is recommended.
Since you are in asynchronous mode, you should probably define a write stream to the file and then write to it, which gives you something like this:
// -snip-

const whateverJsonFileStream = fs.createWriteStream("myjsonfile.json");

client.on("message", async (message) => {
  // -snip-

  if (confirm) {
    obj.table.push({
      name: message.author.username,
      date: `${now.getFullYear()}/${now.getMonth()}/${now.getDate()}`,
      message: message.channel.messages.channel.lastMessage.cleanContent,
    });

    whateverJsonFileStream.write(JSON.stringify(obj), function (err) {
      if (err) throw err;
      console.log("complete");
    });
  }
});
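If the goal is for the file to always hold one valid JSON document, a common alternative (a minimal sketch, assuming myjsonfile.json holds a single object with a table array) is to read the existing file, append the new entry, and write the whole object back:

const fs = require("fs");

// Hypothetical helper: appends one entry to the "table" array stored in the file.
function appendEntry(entry) {
  const filePath = "myjsonfile.json";

  // Read the current contents, falling back to an empty table if the file is missing.
  let data = { table: [] };
  if (fs.existsSync(filePath)) {
    data = JSON.parse(fs.readFileSync(filePath, "utf8"));
  }

  data.table.push(entry);

  // Rewrite the file with the updated object.
  fs.writeFileSync(filePath, JSON.stringify(data, null, 2));
}

appendEntry({ name: "someUser", date: "2023/1/1", message: "example message" });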

How could I check If a zip file is corrupted in NodeJS?

I would like to check whether a ZIP file is corrupted in Node.js, using as little CPU and memory as possible.
How to corrupt a ZIP file:
Download a ZIP file
Open the ZIP file in a text editor such as Notepad++
Rewrite the header, putting in only random characters.
I am trying to reach this goal using the npm library "node-stream-zip":
private async assertZipFileIntegrity(path: string) {
  try {
    const zip = new StreamZip.async({ file: path });
    const stm = await zip.stream(path);

    stm.pipe(process.stdout);
    stm.on('end', () => zip.close());
  } catch (error) {
    throw new Error();
  }
}
However, when I run the unit tests I receive an error inside an array:
Rejected to value: [Error]
One approach is the yauzl library: yauzl.open reports an error when it cannot parse the archive, so simply opening the file is enough to detect corruption. Wrapping the callback in a Promise makes the result usable by callers:

import zip from 'yauzl';
import path from 'path';

const invalidZipPath = path.resolve('invalid.zip');
const validZipPath = path.resolve('valid.zip');

const isValidZipFile = (filePath) => {
  return new Promise((resolve) => {
    zip.open(filePath, { lazyEntries: true }, (err, zipfile) => {
      if (err) {
        console.log('fail to read ', filePath);
        return resolve(false);
      }
      console.log('success read ', filePath);
      zipfile.close();
      resolve(true);
    });
  });
};

isValidZipFile(validZipPath);
isValidZipFile(invalidZipPath);
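If you would rather stay with node-stream-zip (the library from the question), note that zip.stream() expects the name of an entry inside the archive, not the path to the zip file itself. A similar check can be done by forcing the library to parse the archive's entry list; this is a sketch, assuming node-stream-zip's promise-based API:

const StreamZip = require('node-stream-zip');

// Hypothetical helper: resolves to true if the archive can be parsed, false otherwise.
async function isZipIntact(zipPath) {
  const zip = new StreamZip.async({ file: zipPath });
  try {
    // Reading the entry list forces the central directory to be parsed;
    // a corrupted archive makes this reject.
    await zip.entries();
    return true;
  } catch (err) {
    return false;
  } finally {
    await zip.close().catch(() => {});
  }
}

isZipIntact('valid.zip').then((ok) => console.log('valid.zip intact:', ok));
isZipIntact('invalid.zip').then((ok) => console.log('invalid.zip intact:', ok));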

JSON data not working using fs in javascript

I am trying to parse JSON files and I am getting errors. The files are in a folder called "recipes" under the directory containing my JS file that uses fs; there are 3 JSON files, each representing a separate object.
Here's the JSON; all 3 files are similar:
{
  "ingredients": [
    {"name":"Crab","unit":"Tsp","amount":3},
    {"name":"Peas","unit":"Cup","amount":12},
    {"name":"Basil","unit":"Tbsp","amount":10},
    {"name":"Cumin","unit":"Liter","amount":3},
    {"name":"Salt","unit":"Tbsp","amount":1}
  ],
  "name":"Boiled Crab with Peas",
  "preptime":"13",
  "cooktime":"78",
  "description":"A boring recipe using Crab and Peas",
  "id":"b0e347d5-9428-48e5-a277-2ec114fc05a0"
}
My code is this; it throws an unexpected token error at JSON position 1:
fs.readdirSync("./recipes").forEach(file => {
  // let rec = JSON.parse(file);
  console.log(JSON.parse(file))
})
readdirSync returns file names as strings, Buffers, or Dirent objects; none of these are the file contents. A custom readFiles helper like the one below is what you need.
const fs = require('fs')
const path = require('path')

const ROOT_DIR = './contents'

const readFiles = (dir, cb) => {
  try {
    fs.readdirSync(dir).forEach(file => {
      fs.readFile(path.join(dir, file), 'utf-8', cb)
    })
  } catch (e) {
    console.log(`Failed to open the directory ${e.path}`)
  }
}

readFiles(ROOT_DIR, (err, data) => {
  if (err) {
    console.log(`Failed to read file: ${err.path}`)
    return
  }
  console.log(JSON.parse(data))
})
You can try the solution below. I am not sure if I passed the file path to the readFile method correctly, but it should work if the path is correct.
See this post
fs.readdirSync("./recipes").forEach(file => {
  fs.readFile(file, 'utf8', (err, data) => { // not sure if file is the filePath
    if (err) {
      console.log(`Error reading file from disk: ${err}`);
    } else {
      // parse JSON string to JSON object
      console.log(JSON.parse(data));
    }
  });
});
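For completeness, a fully synchronous variant that joins the directory with each file name (a minimal sketch, assuming the recipes folder sits next to the script and contains only JSON files):

const fs = require("fs");
const path = require("path");

const recipesDir = path.join(__dirname, "recipes");

// readdirSync yields file names only, so join each one with the directory
// before reading and parsing the file.
const recipes = fs.readdirSync(recipesDir).map((name) =>
  JSON.parse(fs.readFileSync(path.join(recipesDir, name), "utf8"))
);

console.log(recipes.map((r) => r.name));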
