Promisify gzip stream - javascript

I am trying to create a gzip stream which I can re-use elsewhere. Gzip works well to compress a stream of data, but I wanted to refactor my code so I could separate concerns. In doing so I ended up with the following. However, I can't use the returned stream.
const zlib = require('zlib');

function compress(stream) {
  return new Promise((resolve, reject) => {
    const gzipStream = zlib.createGzip();
    gzipStream.on('end', () => {
      resolve(gzipStream);
    });
    gzipStream.on('error', (e) => {
      reject(e);
    });
    stream.pipe(gzipStream);
  });
}
However, I get empty output when I use the returned stream. E.g. when I compress a 50 MB file filled with 0's and run it through my new function, I get an empty file.
async function handler(req, res) {
  const fileStream = fs.createReadStream("./test_50m");
  const compressedStream = await compress(fileStream);
  compressedStream.pipe(fs.createWriteStream("./test_50m_output"));
}
Thanks
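A note on the likely cause: a readable stream's 'end' event fires only once the stream has been fully consumed, so awaiting it before attaching a consumer either never resolves or hands back a stream whose data is already gone. A minimal sketch of an alternative that keeps the separation of concerns (my sketch, not the asker's code): nothing here is asynchronous, so the transform stream can simply be returned.

const zlib = require('zlib');

// The gzip Transform only starts flowing once a consumer reads from it,
// so it is safe to return it directly and pipe from it later.
function compress(stream) {
  const gzipStream = zlib.createGzip();
  stream.on('error', (e) => gzipStream.destroy(e)); // propagate source errors
  return stream.pipe(gzipStream);
}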

Related

Downloading an mp3 file from S3 and manipulating it results in a bad file

I made a script that downloads an MP3 file from my S3 bucket and then manipulates it before download (adding ID3 tags).
It works and the tags are injected properly, but the file seems to get corrupted and is unplayable.
I can still see my tags through MP3tag, so it has data in it, but no audio plays from the file.
Here's my code; I'm trying to figure out what went wrong:
const downloadFileWithID3 = async (filename, downloadName, injectedEmail) => {
  try {
    const data = await s3Client.send(
      new GetObjectCommand({
        Bucket: "BUCKETNAME",
        Key: filename,
      })
    );
    const fileStream = streamSaver.createWriteStream(downloadName);
    const writer = fileStream.getWriter();
    const reader = data.Body.getReader();
    const pump = () =>
      reader.read().then(({ value, done }) => {
        if (done) writer.close();
        else {
          const arrayBuffer = value;
          const writerID3 = new browserId3Writer(arrayBuffer);
          const titleAndArtist = downloadName.split("-");
          const [artist, title] = titleAndArtist;
          writerID3.setFrame("TIT2", title.slice(0, -4));
          writerID3.setFrame("TPE1", [artist]);
          writerID3.setFrame("TCOM", [injectedEmail]);
          writerID3.addTag();
          let taggedFile = new Uint8Array(writerID3.arrayBuffer);
          writer.write(taggedFile).then(pump);
        }
      });
    await pump()
      .then(() => console.log("Closed the stream, Done writing"))
      .catch((err) => console.log(err));
  } catch (err) {
    console.log(err);
  }
};
Hope you can help me solve this weird bug.
Thanks in advance!
OK, so I've figured it out: instead of using chunks of the stream itself, I used getSignedUrl from the S3 bucket, and it works.
Thanks everyone for trying to help out!
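That outcome makes sense: the original loop ran browserId3Writer over every chunk, splicing an ID3 header into the middle of the audio stream again and again, which would explain the unplayable file. A rough sketch of the signed-URL route (assuming AWS SDK v3's @aws-sdk/s3-request-presigner; the bucket name and expiry are placeholders) fetches the whole file so it can be tagged once:

import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3Client = new S3Client({});

// Fetch the complete file via a presigned URL, then tag the single buffer.
const downloadWholeFile = async (filename) => {
  const command = new GetObjectCommand({ Bucket: "BUCKETNAME", Key: filename });
  const url = await getSignedUrl(s3Client, command, { expiresIn: 3600 });
  const res = await fetch(url);
  return res.arrayBuffer(); // pass this one ArrayBuffer to browserId3Writer
};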

Saving a png file to disk in Node.js

I have the data from what I believe is a png file in a variable like this:
console.log(name, file.value);
How do I properly save it to the filesystem? I've tried all imaginable combinations to save it, but they all produce something wrong, so the image cannot be opened:
import fsp from 'node:fs/promises';
await fsp.write(name, file.value, "binary");
await fsp.write(name, file.value, "utf8");
await fsp.write(name, Buffer.from([file.value], name), "utf8");
await fsp.write(name, Buffer.from([file.value], name), "binary");
// ...
Many of them create the file, but it seems to be wrong and cannot be opened. To get the data here: the original file lives at /logo.png, so on the front-end I'm doing:
fetch("/logo.png")
.then((res) => res.blob())
.then((blob) => {
const form = new FormData();
form.append("hello", "world");
// ...
form.append("icon.png", new File([blob], "icon.png"));
Then, to read the data on the back-end, I'm doing:
const getBody = async (req) => {
  return await new Promise((done) => {
    const buffers = [];
    req.on("data", (chunk) => {
      buffers.push(chunk);
    });
    req.on("end", () => {
      done(Buffer.concat(buffers).toString("utf8"));
    });
  });
};
const rawData = await getBody(req);
// Parse the rawData to get the fields etc.
So, my two questions are:
What is this data/format called? "Binary"?
How can I convert this data format ("binary"? etc.) into something that can be saved as a valid file?
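For what it's worth, the body here is raw binary (a multipart/form-data payload), and the likely culprit is the .toString("utf8") call: PNG bytes are not valid UTF-8, so the conversion is lossy and can never be reversed into a valid image. A minimal sketch of a fix, keeping everything as a Buffer (note that fs/promises exposes writeFile, not write; fileBuffer below is a hypothetical slice produced by your multipart parsing):

import fsp from 'node:fs/promises';

// Collect the request body as a Buffer and never convert it to a string.
const getBody = (req) =>
  new Promise((resolve) => {
    const buffers = [];
    req.on('data', (chunk) => buffers.push(chunk));
    req.on('end', () => resolve(Buffer.concat(buffers)));
  });

// After locating the file's byte range inside the multipart body:
await fsp.writeFile(name, fileBuffer); // fileBuffer: a Buffer slice (hypothetical)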

How to decompress a gzipped AWS API Gateway Lambda response in React?

I followed this answer and successfully used gzip to compress the data and avoid the AWS Lambda 6 MB response limitation. But I can't figure out how to decompress the response and convert it to a string after it is received in my front-end React app. My file is a log file.
Here is what I tried:
// this is what my response.json() will look like
const baseData = {
  "data": "H4sIAAAAA....."
};
// decode the base64 encoded data
const gzipedData = Buffer.from(baseData.data, "base64");
const ungzip = async (input) => {
  return new Promise((resolve, reject) =>
    zlib.gzip(input, (err, data) => {
      if (err) {
        reject(err);
      }
      resolve(data);
    })
  );
};
// unzip and return a Buffer
const ungzipedData = await ungzip(gzipedData);
// convert Buffer to string
const buf = Buffer.from(ungzipedData, 'utf8');
console.log(buf.toString());
The result was something like this:
g#����r��.{�/)fx^�R�d�J%��y�c��P��...
I figured it out: just use zlib.unzip and util.promisify to return the final value as a promise.
If anyone knows a better solution (with pako maybe), please share. Thank you!
import { Buffer } from 'buffer';
import zlib from "react-zlib-js";
import util from "util";

const getLog = async (itemName) => {
  const response = await fetch(
    "url?" +
      new URLSearchParams({
        some_param: "some_value",
      })
  );
  if (!response.ok) {
    throw new Error("Fail ....!");
  }
  const responseJson = await response.json();
  const buffer = Buffer.from(responseJson.data, "base64");
  const responseData = (await util.promisify(zlib.unzip)(buffer)).toString();
  return responseData;
};
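Since the question mentions pako, here is a sketch of the same flow with pako instead of a zlib polyfill (assuming the identical base64-of-gzip payload; the URL and field names mirror the snippet above). pako.ungzip can return a string directly and needs no Buffer shim in the browser:

import pako from "pako";

const getLog = async () => {
  const response = await fetch(
    "url?" + new URLSearchParams({ some_param: "some_value" })
  );
  if (!response.ok) {
    throw new Error("Fail ....!");
  }
  const { data } = await response.json();
  // atob yields a binary string; map it to bytes for pako
  const bytes = Uint8Array.from(atob(data), (c) => c.charCodeAt(0));
  return pako.ungzip(bytes, { to: "string" });
};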

Why does fs.readFileSync return nothing inside a promise on the server side?

I found this "Rendering PDFs with React components" tutorial on themeteorchef about creating PDF files on the Meteor server side and then sending them back to the client. I had no real need for PDF files, but for docx files instead, and thought that maybe I could follow a similar approach to create docx files with officegen.
I created a very similar server-side module that generates a docx file from client-side inputs and then tries to transform it into a base64 string that is then supposed to be sent to the client. However, the base64 string is never created.
Here's the module:
let myModule;

const getBase64String = (loc) => {
  try {
    const file = fs.readFileSync(loc);
    return new Buffer(file).toString('base64');
  } catch (exception) {
    myModule.reject(exception);
  }
}

const generateBase64Docx = (path, fileName) => {
  try {
    myModule.resolve({fileName, base64: getBase64String(path+fileName)});
    fs.unlink(loc);
  } catch (exception) {
    myModule.reject(exception);
  }
}

const formatComponentAsDocx = (props, fileName) => {
  try {
    var docxFile = officegen({
      'type': 'docx',
      'orientation': 'portrait',
      'title': props.title,
    });
    var pObj = docxFile.createP();
    pObj.addText(props.body);
    var path = './';
    output = fs.createWriteStream(path+fileName);
    docxFile.generate(output);
    return path;
  } catch (exception) {
    myModule.reject(exception);
  }
}

const handler = ({props, fileName}, promise) => {
  myModule = promise;
  const path = formatComponentAsDocx(props, fileName);
  if (path) {
    generateBase64Docx(path, fileName);
  }
}

export const generateComponentAsDocx = (options) => {
  return new Promise((resolve, reject) => {
    return handler(options, { resolve, reject });
  });
};
The problem here is the fs.readFileSync part. It always returns an empty buffer, and that's why the file is never transformed into a base64 string and never sent back to the client. Why's that? The file itself is always created on the server and can always be found.
If I change the const file = fs.readFileSync(loc); part to, for example, this:
fs.readFile(loc, (err, data) => {
  if (err) myModule.reject(err);
  console.log(JSON.stringify(data));
});
I can see some data in data, but not enough for the whole file.
What am I doing wrong here? Am I missing something?
You need to wait until the file generated by officegen is complete before you try to get the base64 out of it. That's the minimal change you need to make. I don't recommend waiting on the finalize event generated by officegen as this event is buggy. I recommend waiting on the finish event of the output stream. However, there are additional issues with the code you show:
Since you have code to unlink the file immediately after you use it, then I infer you do not need a file. So you can just create the data in memory and get a base64 string from that.
The whole rigmarole with myModule is awful awful design. If one of my colleagues presented such code, strong words would be exchanged. Yes, it is that bad. It is much better to convert the entire code base to work with promises.
The whole module can be reduced to the following. I've done a modicum of testing on this code but I don't claim that it deals with every eventuality.
import * as stream from "stream";
import officegen from "officegen";

function formatComponentAsDocx(props) {
  return new Promise((resolve, reject) => {
    // There's no need to wrap this in try...catch only to call reject. If any
    // exception is raised in this function, the promise is automatically
    // rejected.
    const docxFile = officegen({
      'type': 'docx',
      'orientation': 'portrait',
      'title': props.title,
    });
    const pObj = docxFile.createP();
    pObj.addText(props.body);
    // We record the output in our own buffer instead of writing to disk,
    // and reading later.
    let buf = Buffer.alloc(0);
    const output = new stream.Writable({
      write(chunk, encoding, callback) {
        buf = Buffer.concat([buf, chunk]);
        callback();
      },
    });
    docxFile.generate(output, {
      // Do propagate errors from officegen.
      error: reject,
    });
    // We don't use the "finalize" event that docxFile.generate would emit
    // because it is buggy. Instead, we wait for the output stream to emit
    // the "finish" event.
    output.on('finish', () => {
      resolve(buf);
    });
  });
}
export function generateComponentAsDocx({ props }) {
  return formatComponentAsDocx(props).then((data) => {
    return { base64: data.toString("base64") };
  });
}
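A quick usage sketch of that promise-based module (the props values are made up for illustration):

generateComponentAsDocx({ props: { title: "Report", body: "Hello world" } })
  .then(({ base64 }) => {
    // base64 now holds the complete .docx, ready to send to the client
    console.log(base64.slice(0, 20) + "...");
  })
  .catch((err) => console.error(err));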
Your problem is that docxFile.generate(output); is not synchronous. Thus, while your local path exists (it was created by the fs.createWriteStream() call), the file is empty, and your synchronous fs.readFileSync is catching just that: an empty file.
You should subscribe to docxFile's finalize event to catch the end of file generation:
docxFile.on('finalize', function (writtenBytes) {
  // do your work with the generated file here
});
Thus, rewriting your code:
const handler = ({props, fileName}, promise) => {
  myModule = promise;
  formatComponentAsDocx(props, fileName);
}

const formatComponentAsDocx = (props, fileName) => {
  try {
    var docxFile = officegen({
      'type': 'docx',
      'orientation': 'portrait',
      'title': props.title,
    });
    var pObj = docxFile.createP();
    pObj.addText(props.body);
    var path = './';
    output = fs.createWriteStream(path+fileName);
    docxFile.on('error', function (err) {
      myModule.reject(err);
    });
    docxFile.on('finalize', function () {
      generateBase64Docx(path, fileName);
    });
    docxFile.generate(output);
  } catch (exception) {
    myModule.reject(exception);
  }
}
readFileSync is synchronous, so it doesn't deal in promises.
https://nodejs.org/api/fs.html#fs_fs_readfilesync_file_options
Synchronous version of fs.readFile. Returns the contents of the file.
You probably want to use fs.readFile.
https://nodejs.org/api/fs.html#fs_fs_readfile_file_options_callback
The callback is passed two arguments (err, data), where data is the contents of the file.
If no encoding is specified, then the raw buffer is returned.
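To make that concrete, a small sketch of the asynchronous read wrapped in a promise (a generic pattern, not code from the question):

const fs = require('fs');

function readFileAsBase64(loc) {
  return new Promise((resolve, reject) => {
    fs.readFile(loc, (err, data) => {
      if (err) return reject(err);
      resolve(data.toString('base64')); // no encoding given, so data is a raw Buffer
    });
  });
}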

How can I return an archived readable stream in node.js without writing to the filesystem?

I want to refactor my function to return a readable stream that I will pipe to the http request module.
Currently I'm returning the archived file's location and creating a read stream from it:
const filepath = yield archive.archiveFilesAsTargz('path', 'name.tar.gz');
fs.createReadStream(filepath).pipe(request(options)).then(body => {
  console.log(body);
});
The flow I'm seeking is:
get a directory location and archive it
get the archive as a stream and return it (resolve it)
invoke the function and pipe the read stream to request
My function is as follows:
exports.archiveFilesAsTargz = function (dest, archivedName) {
  return new Promise((resolve, reject) => {
    const name = slugify(archivedName);
    const filePath = path.join(dest, name + '.tar.gz');
    const output = fs.createWriteStream(filePath);
    const archive = archiver('tar', {
      gzip: true
    });
    archive.pipe(output);
    archive.directory(dest, name).finalize();
    output.on('close', () => resolve(filePath));
    archive.on('error', (err) => reject(err));
  });
};
OK, so after another reading session and some experimenting, I solved it...
function archiveFilesAsTargz(dest, name) {
  const archive = archiver('tar', {
    gzip: true
  });
  return archive.directory(dest, name).finalize();
}
The following will return a read stream:
archive.directory(dest, name).finalize();
So using it as follows worked great for me:
const pack = archiveFilesAsTargz(zippath, 'liron');
pack.pipe(request(options)).then(body => {
  console.log(body);
})
.catch(err => {
  console.log(err);
});
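One caveat worth adding (my note, not part of the original answer): this works because the archiver instance is itself a readable stream. In newer archiver releases finalize() returns a Promise rather than the archive, so returning the instance explicitly is more robust:

// Sketch assuming a recent archiver version, where finalize() is
// promise-returning; the names mirror the self-answer above.
function archiveFilesAsTargz(dest, name) {
  const archive = archiver('tar', { gzip: true });
  archive.on('error', (err) => console.error(err)); // surface archiver errors
  archive.directory(dest, name);
  archive.finalize(); // kicks off archiving; returns a Promise in archiver >= 3
  return archive;     // the readable stream to pipe from
}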
