How to assign a buffer to createReadStream - javascript

According to the official documentation, createReadStream accepts a Buffer as its path argument.
But most Q&As only show how to pass the argument as a string, not as a buffer.
How do I construct a proper Buffer argument for the path parameter of createReadStream?
This is my code:
fs.access(filePath, (err: NodeJS.ErrnoException) => {
  // Respond with 404
  if (Boolean(err)) { res.writeHead(404); res.end('Page not Found!'); return; }

  // Create a read stream from the cache or from the file path
  let hadCached = Boolean(cache[filePath]);
  if (hadCached) console.log(cache[filePath].content);
  let readStream = hadCached
    ? fs.createReadStream(cache[filePath].content, { encoding: 'utf8' })
    : fs.createReadStream(filePath);

  readStream.once('open', () => {
    let headers = { 'Content-type': mimeTypes[path.extname(lookup)] };
    res.writeHead(200, headers);
    readStream.pipe(res);
  }).once('error', (err) => {
    console.log(err);
    res.writeHead(500);
    res.end('Server Error!');
  });

  // If it wasn't cached, a `data` listener stores the buffer in the cache
  if (!hadCached) {
    fs.stat(filePath, (err, stats) => {
      let bufferOffset = 0;
      cache[filePath] = { content: Buffer.alloc(stats.size) }; // instead of the deprecated new Buffer()
      readStream.on('data', function(chunk: Buffer) {
        chunk.copy(cache[filePath].content, bufferOffset);
        bufferOffset += chunk.length;
      });
    });
  }
});

Use the PassThrough class from the built-in stream library:
const stream = require("stream");

let readStream = new stream.PassThrough();
readStream.end(Buffer.from('Test data.')); // Buffer.from() replaces the deprecated new Buffer()
// You now have the stream in readStream

// Note: 'open' is only emitted by fs streams; for a PassThrough, just pipe it,
// or wait for 'readable' before consuming it.
readStream.once('readable', () => {
  // etc
});
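Applied to the question's cache scenario, a minimal sketch (reusing the cache, filePath, and res variables from the question; the PassThrough wrapper replaces the attempt to pass the buffer as a path):
const { PassThrough } = require('stream');

let readStream;
if (cache[filePath]) {
  // Wrap the cached Buffer in a PassThrough stream instead of passing it as a path
  readStream = new PassThrough();
  readStream.end(cache[filePath].content);
} else {
  readStream = fs.createReadStream(filePath);
}
readStream.pipe(res);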

Related

Decode a Uint8Array into JSON

I am fetching data from an API in order to show sales and finance reports, but the response is a gzip file, which I managed to convert into a Uint8Array. I'd like to parse/decode this into JSON that I can use to access the data and build charts on my frontend.
I tried different libraries (pako and cborg seemed to be the closest to my use case), but I ultimately get the error Error: CBOR decode error: unexpected character at position 0
This is the code as I have it so far:
let req = https.request(options, function (res) {
  console.log("Header: " + JSON.stringify(res.headers));
  res.setEncoding("utf8");
  res.on("data", function (body) {
    const deflatedBody = pako.deflate(body);
    console.log("DEFLATED DATA -----> ", typeof deflatedBody, deflatedBody);
    console.log(decode(deflatedBody));
  });
  res.on("error", function (error) {
    console.log("connection could not be made " + error.message);
  });
});
req.end();
I hope anyone has stumbled upon this already and has some idea.
Thanks a lot!
Please visit this answer https://stackoverflow.com/a/12776856/16315663 to retrieve the gzip data from the response.
Assuming you have already retrieved the full data as a Uint8Array, you just need to decode it into a string:
const jsonString = Buffer.from(dataAsU8Array).toString('utf8')
const parsedData = JSON.parse(jsonString)
console.log(parsedData)
Edit
Here is what worked for me:
const { request } = require("https")
const zlib = require("zlib")

const parseGzip = (gzipBuffer) => new Promise((resolve, reject) => {
  zlib.gunzip(gzipBuffer, (err, buffer) => {
    if (err) {
      reject(err)
      return
    }
    resolve(buffer)
  })
})

const fetchJson = (url) => new Promise((resolve, reject) => {
  const r = request(url)
  r.on("response", (response) => {
    if (response.statusCode !== 200) {
      reject(new Error(`${response.statusCode} ${response.statusMessage}`))
      return
    }
    const responseBufferChunks = []
    response.on("data", (data) => {
      console.log(data.length);
      responseBufferChunks.push(data)
    })
    response.on("end", async () => {
      const responseBuffer = Buffer.concat(responseBufferChunks)
      const unzippedBuffer = await parseGzip(responseBuffer)
      resolve(JSON.parse(unzippedBuffer.toString()))
    })
  })
  r.end()
})

fetchJson("https://wiki.mozilla.org/images/f/ff/Example.json.gz")
  .then((result) => {
    console.log(result)
  })
  .catch((e) => {
    console.log(e)
  })
Thank you, I actually just tried this approach and I get the following error:
SyntaxError: JSON Parse error: Unexpected identifier "x"
But I managed to print the data in text format using the function below:
getFinancialReports = (options, callback) => {
  // buffer to store the streamed decompression
  var buffer = [];
  https
    .get(options, function (res) {
      // pipe the response into gunzip to decompress
      var gunzip = zlib.createGunzip();
      res.pipe(gunzip);
      gunzip
        .on("data", function (data) {
          // decompression chunk ready, add it to the buffer
          buffer.push(data.toString());
        })
        .on("end", function () {
          // response and decompression complete, join the buffer and return
          callback(null, buffer.join(""));
        })
        .on("error", function (e) {
          callback(e);
        });
    })
    .on("error", function (e) {
      callback(e);
    });
};
Now I just need to parse this into a JSON object.
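Since the callback already delivers the decompressed text, parsing should just be a JSON.parse call. A minimal sketch (assuming the getFinancialReports function above and a payload that is valid JSON):
getFinancialReports(options, function (err, text) {
  if (err) {
    console.log(err);
    return;
  }
  // the joined buffer is plain text; parse it into an object
  const report = JSON.parse(text);
  console.log(report);
});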

Azure Function won't create a file on the remote host

I want to download a file locally, create a stream from it, and then send it to an API.
On localhost the file gets created via blobClient.downloadToFile(defaultFile);
But when I deploy the function, it cannot find the file to stream, so I think the download either doesn't happen or lands in the wrong location.
I get this error
[Error: ENOENT: no such file or directory, open 'D:\home\site\wwwroot\importPbix\exampleName.pbix'
Here's my code
const blobServiceClient = BlobServiceClient.fromConnectionString(
  process.env.CONNEXION_STRING
);
const containerClient = blobServiceClient.getContainerClient(
  params.containerName
);
const blobClient = containerClient.getBlobClient(process.env.FILE_LOCATION); // get file from storage
let blobData;
var defaultFile = path.join(params.baseDir, `${params.reportName}.pbix`); // use path module
let stream;
try {
  blobData = await blobClient.downloadToFile(defaultFile);
  console.log(blobData);
  stream = fs.createReadStream(defaultFile);
} catch (error) {
  params.context.log(error);
  console.log(error);
}
var options = {
  method: "POST",
  url: `https://api.powerbi.com/v1.0/myorg/groups/${params.groupId}/imports?datasetDisplayName=${params.reportName}`,
  headers: {
    "Content-Type": "multipart/form-data",
    Authorization: `Bearer ${params.accessToken}`,
  },
  formData: {
    "": {
      value: stream,
      options: {
        filename: `${params.reportName}.pbix`,
        contentType: null,
      },
    },
  },
};
// check whether the file is kept in memory
return new Promise(function (resolve, reject) {
  request(options, function (error, response) {
    if (error) {
      params.context.log(error);
      reject(error);
    } else {
      params.context.log(response);
      resolve(response.body);
    }
    fs.unlinkSync(defaultFile);
  });
});
I found this post describing the same issue; that's why I used the path module and passed __dirname to the function as params.baseDir.
If you want to download a file from Azure Blob Storage and read it as a stream, try the code below. In this demo, I download a .txt file to a temp folder (you should create it first on the Azure Function host) and print its content from the stream as a quick test:
module.exports = async function (context, req) {
  const { BlockBlobClient } = require("@azure/storage-blob")
  const fs = require('fs')

  const connStr = '<connection string>'
  const container = 'files'
  const blobName = 'test.txt'
  const tempPath = 'd:/home/temp/'
  const tempFilePath = tempPath + blobName

  const blobClient = new BlockBlobClient(connStr, container, blobName);
  await blobClient.downloadToFile(tempFilePath).then(async function () {
    context.log("download successfully")
    let stream = fs.createReadStream(tempFilePath)
    // Print the text content, just to check that the stream can be read successfully
    context.log("text file content:")
    context.log(await streamToString(stream))
    // You can call your API here...
  })

  function streamToString(stream) {
    const chunks = [];
    return new Promise((resolve, reject) => {
      stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
      stream.on('error', (err) => reject(err));
      stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
    })
  }

  context.res = {
    body: 'done'
  }
}
Result: the file is downloaded and its content is read from the stream successfully.
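One note on the hard-coded d:/home/temp/ path: a more portable alternative is the OS temp directory, which should be writable on Azure Functions hosts. A minimal sketch (assuming Node's built-in os and path modules and the blobClient from the answer above):
const os = require('os')
const path = require('path')

// os.tmpdir() resolves to a writable temp directory on both Windows and Linux hosts
const tempFilePath = path.join(os.tmpdir(), blobName)
await blobClient.downloadToFile(tempFilePath)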

Send text data with FormData via an Axios POST

I am sending over a PDF file in formdata with an Axios post request. So that file will get uploaded/saved to a folder on the server. I'm using multer on the server to save the file. And that works great.
Now I also want to add some fields to the DB related to the file. One is the file name that gets generated right before the file is saved. I don't want to make a round trip back to the client and then another call to the server to update the DB, so I want to send a few text strings along with the formdata. But no matter what I try, I cannot read any text data from that formdata object in my Node code. FYI, I am using Express on my Node server.
Client side code that kicks off the upload process: (notice I am attempting to append additional fields to the formdata object)
const uploadFilesAsync = () => {
  const data = new FormData();
  const filenames = [];
  uploadedFiles.forEach((f) => {
    filenames.push(f.name);
    data.append('file', f);
  });
  const fileInfo = {
    customer: selectedCustomer,
    load: selectedLoad,
    filenames
  };
  data.append('customer', selectedCustomer);
  data.append('load', selectedLoad);
  data.append('filenames', filenames.toString());
  // I also tried the following and then passing fileInfo in with data and setLoaded:
  // const fileInfo = { customer: selectedCustomer, load: selectedLoad,
  //   filenames: filenames.toString() };
  uploadFiles(data, setLoaded)
    .then((res) => {
      console.log('uploadFiles res: ', res);
      if (res.status === 200) {
        // addFileInfoToDB(fileInfo)
        //   .then((r) => console.log('addFileInfoToDB: ', r))
        //   .catch((e) => console.log({ e }));
      }
    })
    .catch((e) => console.log(e));
};
And then the client side function uploadFiles:
export const uploadFiles = (data, setLoaded) => {
  console.log({ data });
  const config = {
    onUploadProgress: function (progressEvent) {
      const percentCompleted = Math.round(
        (progressEvent.loaded * 100) / progressEvent.total
      );
      setLoaded(percentCompleted);
    },
    headers: {
      'Content-Type': 'multipart/form-data'
    }
  };
  // and then here, if I passed in the fileInfo object, I tried sending `{data, fileInfo}`
  // instead of just data, but that broke the file upload portion too
  return axios
    .post(baseURL + '/SDS/upload', data, config)
    .then((response) => {
      console.log({ response });
      return response;
    })
    .catch((e) => {
      return Promise.reject(e);
    });
};
And finally the server side function that does all the work:
static async uploadSDS(req, res) {
  console.log(req.body);
  let uploadSuccess = false;
  upload(req, res, async function (err) {
    if (err instanceof multer.MulterError) {
      // return res.status(500).json({ Error1: err });
      // return { status: 500 };
    } else if (err) {
      // return res.status(500).json({ Error2: err });
      // return { status: 500 };
    } else {
      uploadSuccess = true;
    }
    console.log(uploadSuccess);
    // return res.status(200).send(req.file);
    // return { status: 200 };
    // if (uploadSuccess) {
    //   try {
    //     const result = await SDS.addFileInfoToDB(req.fileInfo);
    //     if (result) {
    //       return res.status(200).json({ result });
    //     }
    //   } catch (e) {
    //     console.log(e);
    //   }
    // }
  });
}
When I console.log req.body, it is always empty.
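A likely cause, given how multer works: for multipart/form-data requests, req.body is populated by multer itself, so it is empty before the upload() middleware has run. The appended text fields should be readable inside the upload callback. A minimal sketch (assuming the same upload middleware as above; the field names match those appended on the client):
upload(req, res, function (err) {
  if (err) {
    return res.status(500).json({ error: err });
  }
  // multer has now parsed the multipart body: text fields are on req.body,
  // the uploaded file(s) on req.file / req.files
  const { customer, load, filenames } = req.body;
  console.log(customer, load, filenames);
  return res.status(200).send(req.file);
});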

Read a PDF file with Node.js

I am trying to read a PDF file from a URL as follows:
const axios = require("axios");
const jsdom = require("jsdom");
const PdfReader = require('pdfreader').PdfReader;
const { JSDOM } = jsdom;

axios.get("https://url-to-pdf.pdf").then(function (result) {
  new PdfReader().parseBuffer(result.data, function (err, item) {
    if (err)
      console.log(err);
    else if (item.text)
      console.log(item.text);
  });
}).catch(function (err) {
});
It shows:
An error occurred while parsing the PDF: stream must have data
{
  parserError: 'An error occurred while parsing the PDF: stream must have data'
}
How can I solve this issue?
The key point is to request a responseType of 'arraybuffer', but then you have to transform the result into a Buffer:
try {
  var options = {
    method: 'get',
    url: url,
    headers: { 'User-Agent': 'PostmanRuntime/7.26.8' },
    timeout: 2000,
    responseEncoding: 'utf8',
    maxRedirects: 5,
    httpAgent: new http.Agent({ keepAlive: true }),
    responseType: 'arraybuffer'
  }
  let response = await axios(options);
  console.log(response.status);
  console.log(response.headers['content-type']);
  if (response.headers['content-type'].indexOf('pdf') != -1) {
    console.log("pdf");
    console.log(typeof response.data);
    var buff = Buffer.alloc(0); // not `new Buffer.alloc(0)`; Buffer.alloc is a static method
    buff = Buffer.concat([buff, response.data]);
    temp = await extract_pdf.readlines(buff).catch(function (o) { console.log(o); return; });
    console.log(temp);
  }
} catch (error) {
  if (error.response) {
  }
  else {
    console.log(error);
  }
}
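Applied to the original pdfreader snippet, a minimal sketch (assuming the same PdfReader import from the question; the URL is a placeholder):
const axios = require('axios');
const { PdfReader } = require('pdfreader');

axios.get('https://url-to-pdf.pdf', { responseType: 'arraybuffer' })
  .then(function (result) {
    // result.data is binary here; wrap it in a Buffer for parseBuffer
    new PdfReader().parseBuffer(Buffer.from(result.data), function (err, item) {
      if (err) console.log(err);
      else if (item && item.text) console.log(item.text); // item is null once parsing ends
    });
  })
  .catch(function (err) {
    console.log(err);
  });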

Download zip file being sent by server on client side?

I have an API that downloads multiple files from AWS S3, creates a zip which is saved to disk, and sends that zip back to the client. The API works, but I have no idea how to handle the response / download the zip to disk on the client side.
This is my API:
reports.get('/downloadMultipleReports/:fileKeys', async (req, res) => {
  var s3 = new AWS.S3();
  var archiver = require('archiver');
  const { promisify } = require('util');
  var str_array = req.params.fileKeys.split(',');
  console.log('str_array: ', str_array);

  for (var i = 0; i < str_array.length; i++) {
    var filename = str_array[i].trim();
    var localFileName = './temp/' + filename.substring(filename.indexOf("/") + 1);
    console.log('FILE KEY >>>>>> : ', filename);
    const params = { Bucket: config.reportBucket, Key: filename };
    const data = await (s3.getObject(params)).promise();
    const writeFile = promisify(fs.writeFile);
    await writeFile(localFileName, data.Body);
  }

  // create a file to stream archive data to
  var output = fs.createWriteStream('reportFiles.zip');
  var archive = archiver('zip', {
    zlib: { level: 9 } // Sets the compression level.
  });

  // listen for all archive data to be written
  // 'close' event is fired only when a file descriptor is involved
  output.on('close', function () {
    console.log(archive.pointer() + ' total bytes');
    console.log('archiver has been finalized and the output file descriptor has closed.');
  });

  // This event is fired when the data source is drained, no matter what the data source was.
  // It is not part of this library but rather of the Node.js Stream API.
  // @see: https://nodejs.org/api/stream.html#stream_event_end
  output.on('end', function () {
    console.log('Data has been drained');
  });

  // good practice to catch warnings (ie stat failures and other non-blocking errors)
  archive.on('warning', function (err) {
    if (err.code === 'ENOENT') {
      // log warning
    } else {
      // throw error
      throw err;
    }
  });

  // good practice to catch this error explicitly
  archive.on('error', function (err) {
    throw err;
  });

  // pipe archive data to the file
  archive.pipe(output);

  // append files from a sub-directory, putting its contents at the root of the archive
  archive.directory('./temp', false);

  // finalize the archive (ie we are done appending files but streams have to finish yet)
  // 'close', 'end' or 'finish' may be fired right after calling this method so register to them beforehand
  archive.finalize();

  output.on('finish', () => {
    console.log('Ding! - Zip is done!');
    const zipFilePath = "./reportFiles.zip" // or any file format
    // res.setHeader('Content-Type', 'application/zip');
    fs.exists(zipFilePath, function (exists) {
      if (exists) {
        res.writeHead(200, {
          "Content-Type": "application/octet-stream",
          "Content-Disposition": "attachment; filename=" + "reportFiles.zip"
        });
        fs.createReadStream(zipFilePath).pipe(res);
      } else {
        res.writeHead(400, { "Content-Type": "text/plain" });
        res.end("ERROR File does not exist");
      }
    });
  });
  return;
});
And this is how I am calling the API / expecting to download the response:
downloadMultipleReports() {
  var fileKeysString = this.state.awsFileKeys.toString();
  var newFileKeys = fileKeysString.replace(/ /g, '%20').replace(/\//g, '%2F');
  fetch(config.api.urlFor('downloadMultipleReports', { fileKeys: newFileKeys }))
    .then((response) => response.body())
  this.closeModal();
}
How can I handle the response / download the zip to disk?
This is what ended up working for me:
Server side:
const zipFilePath = "./reportFiles.zip";
fs.exists(zipFilePath, function (exists) {
  if (exists) {
    res.writeHead(200, {
      "Content-Type": "application/zip",
      "Content-Disposition": "attachment; filename=" + "reportFiles.zip"
    });
    fs.createReadStream(zipFilePath).pipe(res);
  } else {
    res.writeHead(400, { "Content-Type": "text/plain" });
    res.end("ERROR File does not exist");
  }
});
Client side:
downloadMultipleReports() {
  var fileKeysString = this.state.awsFileKeys.toString();
  var newFileKeys = fileKeysString.replace(/ /g, '%20').replace(/\//g, '%2F');
  fetch(config.api.urlFor('downloadMultipleReports', { fileKeys: newFileKeys }))
    .then((res) => { return res.blob(); })
    .then((blob) => {
      // download() here is presumably the downloadjs package, which saves a Blob client-side
      download(blob, 'reportFiles.zip', 'application/zip');
      this.setState({ isOpen: false });
    });
}
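If you would rather not pull in a library for the save step, a minimal sketch of the same client-side download using only standard browser APIs (an object URL plus a temporary anchor element):
fetch(config.api.urlFor('downloadMultipleReports', { fileKeys: newFileKeys }))
  .then((res) => res.blob())
  .then((blob) => {
    // create a temporary object URL for the blob and click a hidden link to save it
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = 'reportFiles.zip';
    document.body.appendChild(a);
    a.click();
    a.remove();
    URL.revokeObjectURL(url); // release the object URL once the download has started
  });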
