create blob from file path vue electron builder - javascript

I'm using the fs module in my Electron app to read file content from a path:
ipcMain.on('fileData', (event, data) => {
  data.forEach((file) => {
    const stream = fs.createReadStream(file)
    stream.on('data', (buffer) => {
      console.log(buffer)
    })
  })
})
I'm able to open the files, but I get a Buffer. What I want is to create a Blob from the files so I can do some processing on them. How can I achieve this in Electron?

If you're trying to create a Blob in the main process, i.e. the NodeJS environment, keep in mind that NodeJS has no built-in Blob support (only recent Node versions ship a Blob class in the buffer module).
If you're trying to create a Blob in the renderer process from a file, though, you can use a preload script or enable nodeIntegration. Then you can use something like the following:
const fs = require('fs');

const stream = fs.createReadStream(filepath);
var blob = new Blob([]); // start with an empty Blob

stream.on('data', (buffer) => {
  blob = new Blob([blob, buffer]); // append each chunk to the Blob
});

stream.on('close', () => {
  // blob is ready!
});
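If you don't need to process the data chunk by chunk, a minimal standalone sketch is to read the whole file first and wrap it in a single Blob. This assumes the code runs where both fs and Blob are available (the renderer with nodeIntegration, or a preload script), and the MIME type is just an example:
const fs = require('fs');

// Read the entire file into a Buffer, then wrap it in a Blob.
// 'application/octet-stream' is a placeholder; pass the real MIME type if known.
async function fileToBlob(filepath, mimeType = 'application/octet-stream') {
  const buffer = await fs.promises.readFile(filepath);
  return new Blob([buffer], { type: mimeType });
}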

Related

Use new File() on Nodejs

I have to send static images to another Node app.
To do this I need to get a Base64 string from a file.
This is the function I used in another project (a VueJS web app):
export async function getBase64FromURL (path, filename) {
  const fr = new FileReader()
  fr.readAsDataURL(new File(path, filename))
  return await new Promise((resolve, reject) => {
    fr.onloadend = () => {
      resolve(fr.result)
    }
  })
}
NodeJS lacks some functions, for example FileReader(), and I found this npm package for that.
But I haven't found a solution for new File(); what can I do?
// Import "fs" lib.
const fs = require('fs');
// Read file as buffer and convert to BASE64.
const fileBase64 = fs.readFileSync('./file.txt').toString('base64');
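If the receiving app expects a data URL (which is what FileReader.readAsDataURL produces) rather than a bare Base64 string, you can build the prefix yourself. Continuing the snippet above, and assuming the file is plain text:
// Reproduce FileReader.readAsDataURL output by prepending the data URL header.
// 'text/plain' is an assumption; use the real MIME type of your file.
const fileDataUrl = `data:text/plain;base64,${fileBase64}`;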

Promisify gzip stream

I am trying to create a gzip stream which I can re-use elsewhere. Gzip works well for compressing a stream of data, but I wanted to refactor my code so I could separate concerns. In doing so I ended up with the following. However, I can't use the returned stream.
const zlib = require('zlib');

function compress(stream) {
  return new Promise((resolve, reject) => {
    const gzipStream = zlib.createGzip();
    gzipStream.on('end', () => {
      resolve(gzipStream);
    });
    gzipStream.on('error', (e) => {
      reject(e);
    });
    stream.pipe(gzipStream);
  });
}
However, I get empty output when I use the returned stream. For example, I compress a 50 MB file filled with zeros, and after using my new function I get an empty file.
async function handler(req, res) {
  const fileStream = fs.createReadStream("./test_50m");
  const compressedStream = await compress(fileStream);
  compressedStream.pipe(fs.createWriteStream("./test_50m_output"));
}
Thanks
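One possible reading of the empty output: the 'end' event on the gzip Transform only fires after its readable side has been consumed, so awaiting it before the destination is attached cannot behave as intended. A minimal sketch of an alternative, keeping compress reusable (file names here are just the ones from the question, the rest is illustrative):
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

// Return the gzip Transform immediately; completion is awaited where the data
// is actually written out, via stream.pipeline.
function compress(stream) {
  const gzipStream = zlib.createGzip();
  stream.pipe(gzipStream);
  return gzipStream;
}

async function handler(req, res) {
  const compressedStream = compress(fs.createReadStream('./test_50m'));
  await new Promise((resolve, reject) => {
    pipeline(compressedStream, fs.createWriteStream('./test_50m_output'),
      (err) => (err ? reject(err) : resolve()));
  });
}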

Encode and decode a JavaScript File or Blob to base64

I am creating a web app with React as the frontend and Python Ariadne (GraphQL) as the backend. I want to allow the user to upload a file.
Essentially, I first want to convert the file to base64 in React, pass it to a GraphQL mutation, and then decode the base64 string back to a file in JavaScript.
Just like base64.b64encode and base64.b64decode in Python.
Is there a way to do this with a JavaScript File or Blob object?
You can convert a file but note that it's an async call:
const toBase64 = file => new Promise((resolve, reject) => {
  const r = new FileReader();
  r.readAsDataURL(file);
  r.onerror = error => reject(error);
  r.onload = () => resolve(r.result);
});
// use with await toBase64(YOUR_FILE);
To convert the base64 string back to a file, use this:
fetch(YOUR_DATA)
  .then(res => res.blob())
  .then(blob => {
    const file = new File([blob], "YOUR_FILE_NAME", { type: YOUR_MIME_TYPE })
  })
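For a quick round trip, here is a hypothetical usage sketch, assuming a file picked from an <input type="file"> element with id "picker" and code running inside an async function:
const file = document.getElementById('picker').files[0];
const dataUrl = await toBase64(file); // base64 data URL to send in the GraphQL mutation
const restored = await fetch(dataUrl)
  .then(res => res.blob())
  .then(blob => new File([blob], file.name, { type: file.type }));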

Event listener for when a file has finished streaming to the AWS S3 upload API?

I am creating a file backup between Google Drive and AWS S3, where I create a Readable stream promise by downloading the file using the Google Drive get API and piping the data to AWS S3.
As I have many files, each promise is added to a queue and only new promises enter when it resolves.
I'm struggling to resolve the promise only when the file has finished uploading to AWS S3, rather than when it has finished downloading.
I thought using .on('finish', () => {resolve()}) should do this, but it doesn't seem to be working.
Here is my code sample:
// download stream of NON gdocs files and pipe to destination
const getGFileContent = async (fileObj) => {
  let fileExt = fileObj.path.join('/').concat('/', fileObj.name)
  return drive.files.get({fileId: fileObj.id, mimeType: fileObj.mimeType, alt: 'media'}, {responseType: 'stream'})
    .then(res => {
      return new Promise((resolve, reject) => {
        res.data
          .pipe(uploadS3(fileExt))
          .on('end', () => {console.log(`Done downloading file: ${fileExt}`)})
          .on('finish', () => {resolve(console.log(`File Backup Complete: ${fileExt}`))})
          .on('error', err => {reject(console.error(`Error downloading file: ${err}`))})
      })
    })
}

// upload a file to AWS S3 by passing the file stream from getGFileContent into the 'body' parameter of the upload
const uploadS3 = (filePath) => {
  let pass = new stream.PassThrough()
  let params = {
    Bucket: awsBucketName, // bucket-name
    Key: filePath, // file will be saved as bucket-name/[uniquekey.csv]
    Body: pass // file data passed through stream
  }
  new aws.S3().upload(params).promise()
    .then(() => console.log(`Successfully uploaded to S3: ${filePath}`))
    .catch(err => console.log(`Error, unable to upload to S3: ${err}`))
  return pass
}
The first thing that comes to mind is to make the uploadS3 function async and await the upload to finish before returning the PassThrough stream. But this wouldn't work: uploadS3 would then return a Promise, and .pipe() accepts only a stream object.
Instead, you could refactor your code so that getGFileContent returns a readable stream promise.
Then, make uploadS3 accept a readable stream as a parameter and return an S3 upload promise.
To wrap it up, add an async backupFile function, which awaits both the GDrive stream and upload promises before continuing. This also keeps the functions tidy and clean, each having a single responsibility.
Example code:
const AWS = require('aws-sdk');
const fs = require('fs');

AWS.config.update({
  accessKeyId: '----',
  secretAccessKey: '----',
});
const s3 = new AWS.S3();

const backupFile = async (file) => {
  const fileStream = await getGFileStream(file);
  try {
    await uploadStreamToS3(fileStream);
    console.log(`S3 Backup of ${fileStream.path} completed`)
  } catch (err) {
    console.log(`error during file upload ${err}`);
  }
}

const getGFileStream = async (fileObj) => {
  // TODO: logic to find and get the file. Returns a readableStream promise
  const fileStream = fs.createReadStream('./largeFile.zip');
  console.log('File ${...} read from Google Drive');
  return fileStream;
}

const uploadStreamToS3 = (fileStream) => {
  const params = {Bucket: 'test-bucket', Key: 'key', Body: fileStream}
  console.log(`Starting to upload ${fileStream.path} to S3`);
  return s3.upload(params).promise();
}

backupFile({id: 'mockTestFile'});
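If you also want visibility into the upload itself, the managed upload object returned by s3.upload() emits progress events before you call .promise(). A small sketch, reusing the s3 client and placeholder bucket/key from above:
const uploadStreamToS3WithProgress = (fileStream) => {
  const managedUpload = s3.upload({ Bucket: 'test-bucket', Key: 'key', Body: fileStream });
  managedUpload.on('httpUploadProgress', (progress) => {
    // progress.total may be undefined when the body is a stream of unknown length
    console.log(`Uploaded ${progress.loaded} bytes of ${fileStream.path}`);
  });
  return managedUpload.promise(); // resolves once the object is fully stored in S3
};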

How to zip PDF files from storage in NodeJS

I need to create a zip file with the PDFs I receive from AWS storage, and I am trying to do this with ADM-zip in NodeJS, but I can't read the final file.zip.
Here is the code.
var zip = new AdmZip();

// add file directly
var content = data.Body.buffer;
zip.addFile("test.pdf", content, "entry comment goes here");
// console.log(content)

// add local file
zip.addLocalFile(`./tmp/boletos/doc.pdf`);

// get everything as a buffer
var willSendthis = zip.toBuffer();
console.log(willSendthis)

// or write everything to disk
zip.writeZip("test.zip", `../../../tmp/boletos/${datastring}.zip`);
As it is, this only creates a .zip for each file.
I was also facing this issue. I looked through a lot of SO posts. This is how I was able to create a zip with multiple files from download URLs. Please keep in mind, I'm unsure whether this is best practice or whether it will blow up memory.
Create a zip from a list of IDs of resources requested by the client.
const zip = new AdmZip();

await Promise.all(sheetIds.map(async (sheetId) => {
  const downloadUrl = await this.downloadSds({ sheetId, userId, memberId });
  if (downloadUrl) {
    await new Promise((resolve) => https.get(downloadUrl, (res) => {
      const data = [];
      res.on('data', (chunk) => {
        data.push(chunk);
      }).on('end', () => {
        const buffer = Buffer.concat(data);
        resolve(zip.addFile(`${sheetId}.pdf`, buffer));
      });
    }));
  } else {
    console.log('could not download');
  }
}));

const zipFile = zip.toBuffer();
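The piece connecting this to the client is returning that buffer over HTTP. A hypothetical Express handler (the route, the buildZipBuffer helper wrapping the code above, and the filename are all assumptions) might look like:
app.post('/bulk-download', async (req, res) => {
  const zipFile = await buildZipBuffer(req.body.sheetIds); // the zip.toBuffer() result from above
  res.set({
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename="sheets.zip"',
  });
  res.send(zipFile);
});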
I then used downloadjs in my React.js client to download.
const result = await sds.bulkDownloadSds(payload);
if (result.status > 399) return store.rejectWithValue({ errorMessage: result?.message || 'Error', redirect: result.redirect });
const filename = 'test1.zip';
const document = await result.blob();
download(document, filename, 'zip');
