Send binary response from Uint8Array in Express.js - javascript

I am using Express.js with TypeScript and I would like to send a Uint8Array as binary data.
This is what I use so far and it works, but I would like to avoid saving the file to disk first, because I think it wastes performance:
const filePath = path.resolve(__dirname, 'template.docx');
const template = fs.readFileSync(filePath);
const buffer: Uint8Array = await createReport({
template,
data: {
productCode: data.productCode,
},
});
fs.writeFileSync(path.resolve(__dirname, 'output.docx'), buffer);
res.sendFile(path.resolve(__dirname, 'output.docx'));
I am using docx-templates to generate the file by the way.

You can use a PassThrough stream for this purpose; it will keep the file in memory, with no need to write to disk.
Something like this should do it:
const stream = require("stream");
const readStream = new stream.PassThrough();
// Pass your output.docx buffer to this
readStream.end(buffer);
res.set("Content-disposition", 'attachment; filename=' + "output.docx");
res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document");
readStream.pipe(res);
The complete Node.js code:
const fs = require("fs");
const express = require("express");
const port = 8000;
const app = express();
const stream = require("stream");
app.get('/download-file', (req, res) => {
const buffer = fs.readFileSync("./test.docx");
console.log("/download-file: Buffer length:", buffer.length);
const readStream = new stream.PassThrough();
readStream.end(buffer);
res.set("Content-disposition", 'attachment; filename=' + "test.docx");
res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document");
readStream.pipe(res);
});
app.listen(port);
console.log(`Serving at http://localhost:${port}`);
To test, add a 'test.docx' file to the same directory, then point your browser to http://localhost:8000/download-file

Terry,
Thanks for updating your answer and providing the full code. However, it still does not help much. I am trying to understand how to handle this on the front-end side, in my case in Vue. Here is the endpoint code:
router.post('/chart/word', async (req, res, next) => {
try {
if (!req.body.chartImage) throw new BadRequest('Missing the chart image from the request body')
const wordTemplate = await s3GetFile('folder', 'chart-templates-export/charts-template.docx')
const template = wordTemplate.Body
const buffer = await createReport({
cmdDelimiter: ["{", "}"],
template,
additionalJsContext: {
chart: () => {
const dataUrl = req.body.chartImage.src
const data = dataUrl.slice("data:image/jpeg;base64,".length);
return { width: 18 , height: 12, data, extension: '.jpeg' }
}
}
})
const stream = require('stream')
const readStream = new stream.PassThrough()
readStream.end(buffer)
res.set("Content-disposition", 'attachment; filename=' + "output.docx")
res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document")
readStream.pipe(res)
} catch (err) {
console.log(err)
next(err)
}
})
And here is my Vue code; I have tested various things, but nothing works:
async exportCharts() {
console.log('this.$refs.test: ', this.$refs.test)
let img = {
src: this.$refs.test.getDataURL({
type: 'jpeg',
pixelRatio: window.devicePixelRatio || 1,
backgroundColor: '#fff'
}),
width: this.$refs.test.getWidth(),
height: this.$refs.test.getHeight()
}
const answersReq = await this.axios({
method: 'post',
url: '/pollAnswers/chart/word',
data: {
chartImage: img
},
responseType: 'arraybuffer' // 'blob' // 'document'
})
console.log('answersReq: ', answersReq)
if (answersReq.data) {
downloadURL(answersReq.data, 'report.docx')
}
}
What I am basically doing is: sending an image to the API (taken from an HTML vue-echart element), then inserting it in a docx template using the docx-templates library, which returns a Uint8Array that I want to export as the new Word document with the populated chart. Then the user (on the UI) should be able to choose the destination.
Here is the code for the download URL:
export function downloadURL(data, fileName) {
const mimeType = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
const blob = new Blob([data], { type: mimeType })
const url = URL.createObjectURL(blob)
const element = document.createElement('a')
element.href = url
element.download = fileName
element.style.display = 'none'
document.body.appendChild(element)
element.click()
URL.revokeObjectURL(element.href)
document.body.removeChild(element)
}
P.S. Just to mention, if I directly save the buffer (the Uint8Array returned from createReport) on the API side, it works: the file is downloaded successfully and I can read it without any problems - it populates the correct chart in the file.
UPDATE:
I figured it out, but I am not sure why it is necessary and why it works that way and not the other. In the /chart/word endpoint I convert the Uint8Array buffer into a stream and then send it as the response (the same way you did). In Vue I fetch it with responseType: 'arraybuffer', which turns the streamed response back into an ArrayBuffer, and then the same download method works. Initially I tried to send the buffer directly (without converting it to a stream as you suggested), but then on the front end the response arrived as an object containing the Uint8Array's byte values, which was not what I expected, and I could not create a valid docx file from it. So, for some reason, the buffer has to be converted to a stream in the API before being sent as the response; afterwards, on the front end, I have to read it back as an ArrayBuffer and, finally, trigger the docx download.
If you can explain to me why it works like that, I will be very happy.
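For what it's worth, a plausible explanation (a hedged sketch, not verified against your exact setup): Express's res.send() only sends Node Buffers as raw binary; any other object, including a plain Uint8Array, falls through to JSON serialization, which is why the front end saw an object of byte values. Piping a stream (or sending a real Buffer) produces a binary body, and responseType: 'arraybuffer' tells axios to hand that body over untouched instead of trying to parse it as JSON. Assuming buffer is the Uint8Array returned by createReport in the endpoint above:
// Express treats only Node Buffers as binary; other objects are JSON-serialized,
// so the client ends up with an object of byte values instead of a .docx file:
res.send(buffer) // Uint8Array -> res.json() under the hood

// Sending real bytes also works without a PassThrough stream:
res.set('Content-Type', 'application/vnd.openxmlformats-officedocument.wordprocessingml.document')
res.send(Buffer.from(buffer)) // raw binary body

// On the axios side, responseType: 'arraybuffer' (or 'blob') skips JSON parsing,
// so new Blob([response.data]) yields a valid .docx.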

Related

express.js: pass an uploaded image to s3

I am trying to pass an image uploaded from a React app through Express to a managed S3 bucket. The platform/host I am using creates and manages the S3 bucket and generates upload and access URLs. This all works fine (I have tested a generated upload URL in Postman with an image in a binary body and it worked perfectly).
My problem is passing the image through Express. I am using multer to get the image from the form, but I am assuming multer is turning that image into some kind of file object while S3 is expecting some sort of blob or stream.
In the following code, the image in req.file exists, I get a 200 response from S3 with no errors, and when I visit the asset URL the URL works, but the image itself is missing.
const router = Router();
const upload = multer()
router.post('/', upload.single('file'), async (req, res) => {
console.log(req.file)
const asset = req.file
const assetPath = req.headers['asset-path']
let s3URLs = await getPresignedURLS(assetPath)
const sendAsset = await fetch(
s3URLs.urls[0].upload_url, // the s3 upload url
{
method: 'PUT',
headers: {
"Content-Type": asset.mimetype
},
body: asset,
redirect: 'follow'
}
)
console.log("s3 response", sendAsset)
res.status(200).json({"url": s3URLs.urls[0].access_url });
});
export default router;
I am not sure what to do to convert what multer gives me into something that AWS S3 will accept. I am also open to getting rid of multer if there is an easier way to upload binary files to Express.
Instead of multer, you can use multiparty to get the file data from the request object, and to upload the file to the S3 bucket you can use aws-sdk.
const AWS = require("aws-sdk");
const multiparty = require("multiparty");
const { readFileSync } = require("fs");
const { extname, join } = require("path");
/**
* Helper method which takes the request object and returns a promise with a data.
*/
const getDataFromRequest = (req) =>
new Promise((resolve, reject) => {
const form = new multiparty.Form();
form.parse(req, (err, fields, files) => {
if (err) return reject(err);
const bucketname = fields.bucketname[0];
const subfoldername = fields.subfoldername[0];
const file = files["file"][0]; // get the file from the returned files object
if (!file) reject("File was not found in form data.");
else resolve({
file,
bucketname,
subfoldername
});
});
});
/**
* Helper method which takes the request object and returns a promise with the AWS S3 object details.
*/
const uploadFileToS3Bucket = (
file,
bucketname,
subfoldername,
options = {}
) => {
const s3 = new AWS.S3();
// turn the file into a buffer for uploading
const buffer = readFileSync(file.path);
var originalname = file.originalFilename;
var attach_split = originalname.split(".");
var name = attach_split[0];
// generate a new random file name
const fileName = name;
// the extension of your file
const extension = extname(file.path);
console.log(`${fileName}${extension}`);
const params = {
Bucket: bucketname, //Bucketname
ACL: "private", //Permission
Key: join(`${subfoldername}/`, `${fileName}${extension}`), // File name you want to save as in S3
Body: buffer, // Content of file
};
// return a promise
return new Promise((resolve, reject) => {
return s3.upload(params, (err, result) => {
if (err) reject(err);
else resolve(result); // return the values of the successful AWS S3 request
});
});
};
router.post('/', async (req, res) => {
try {
// extract the file from the request object
const {
file,
bucketname,
subfoldername
} = await getDataFromRequest(req);
// Upload File to specified bucket
const {
Location,
ETag,
Bucket,
Key
} = await uploadFileToS3Bucket(
file,
bucketname,
subfoldername
);
let response = {};
res["Location"] = Location;
response["ETag"] = ETag;
response["Bucket"] = Bucket;
response["Key"] = Key;
res.status(200).json(response);
} catch (error) {
console.error(error);
res.status(500).json({ error: error.message || String(error) });
}
});
The request body will be form data with the following fields:
bucketname:
subfoldername:
file: FileData
For anyone who ever stumbles across this question: the solution was to create a custom multer storage engine. Inside the engine you get access to the file with a stream property that S3 accepted (with the correct headers).
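No code was posted for that engine, but here is a minimal, hedged sketch of the idea (the bucket name and engine name are hypothetical; aws-sdk v2 is assumed). Multer hands the custom engine the upload as file.stream, and S3's upload() accepts a readable stream as Body:
const AWS = require('aws-sdk');
const multer = require('multer');

const s3 = new AWS.S3();

// Hypothetical custom storage engine: stream each incoming file straight to S3.
class S3StreamStorage {
  _handleFile(req, file, cb) {
    s3.upload(
      {
        Bucket: 'my-bucket', // assumption: replace with your bucket
        Key: file.originalname,
        Body: file.stream, // the readable stream multer exposes
        ContentType: file.mimetype,
      },
      (err, result) => (err ? cb(err) : cb(null, { location: result.Location }))
    );
  }
  _removeFile(req, file, cb) {
    cb(null);
  }
}

const upload = multer({ storage: new S3StreamStorage() });
// router.post('/', upload.single('file'), (req, res) => res.json({ url: req.file.location }));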

Upload csv file to aws s3 bucket directly from a server

Happy weekend all,
I'm working on a task that fetches data from an API, stores it in a CSV file, and then uploads that file directly to an AWS S3 bucket. I've tried several ways but I'm currently stuck at the very last point. Any help would be much appreciated.
My code below would demonstrate most of the problems and also what I've been trying so far.
First, I will fetch the data from an API
async systems() {
const endpoint = sampleEndPoints.SYSTEMS
return this.aggregateEndpoint(endpoint)
}
Second, I take the fetched data and put it in a CSV file as a buffer (because I have to pass it to fs.createReadStream later on).
// generate JSON to Buffer
async generateCsvToBuffer(json){
const {aws} = this.config
var ws = xlsx.utils.json_to_sheet(json)
var wb = xlsx.utils.book_new();
await xlsx.utils.book_append_sheet(wb, ws, 'Systems')
const csvParsed = xlsx.write(wb, { type: 'buffer'})
return csvParsed;
}
Third, I take the buffer data from csvParsed in order to upload it to AWS S3. The problem is right here: Body: fileStream.path is supposed to contain the content of the file, but unfortunately it logs out like this, which comes from fs.createReadStream:
'{"type":"Buffer","data":[80,75,3,4,10,0,0,0,0,0,249,117,199,78,214,146,124
async uploadSample(file){
const {aws} = this.config
AWS.config.update({
secretAccessKey: aws.secretAccessKey,
accessKeyId: aws.accessKeyId,
region: 'us-east-2'
})
const bufferObject = new Buffer.from(JSON.stringify(file))
/*** WE NEED THE FILE SYSTEM IN ORDER TO STORE */
const fileStream = fs.createReadStream(bufferObject)
const uploadParams = {Bucket: aws.bucket, Key: aws.key, Body: fileStream.path}
const s3 = new AWS.S3()
await s3.upload(uploadParams,null,function(error, file){
if(error){
console.log(error)
} else {
console.log('Successfully uploaded')
}
})
}
All of my functions are executed in server.js, so if you have a look at this you can get the whole picture of the problem:
app.get('/systems/parsed', async(req, res) => {
const Sample = await Sample()
//Fetch the data from an API
const systems = await Cache.remember('systems', async() => {
return Sample.systems()
})
const integration = await IntegrationInstance()
/** GET THE RESPONSE DATA AND PUT THEM IN A CSV FILE*/
const result = await integration.generateCsvToBuffer(systems)
const aws = await AwsInstance()
/*** GET THE SYSTEMS FILE (CSV FILE) THEN UPLOAD THEM INTO THE AWS S3 BUCKET*/
const awsUpload = await aws.uploadWorkedWithBuffer(result)
return res.send(awsUpload);
})
My only concern here is that the file is successfully uploaded to AWS S3, but the content of the file is still the serialized buffer. Any help with the existing functions, or any shorter way, would be much appreciated.
Here's my summary again: fetch data from a server -> put it in a CSV file as a buffer -> upload it to the AWS S3 bucket -> problem: the file is uploaded, but its content is still the raw buffer.
It looks like you are making things more complicated than necessary here. According to the documentation for .upload, you can pass a buffer directly instead of creating a stream from it. I suspect your underlying issue is passing the stream's path instead of the stream itself, though.
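A hedged sketch of that suggestion, assuming the csvParsed buffer from generateCsvToBuffer and the same config shape as uploadSample:
async uploadSample(csvBuffer) {
  const { aws } = this.config
  AWS.config.update({
    secretAccessKey: aws.secretAccessKey,
    accessKeyId: aws.accessKeyId,
    region: 'us-east-2'
  })
  const s3 = new AWS.S3()
  // .upload accepts a Buffer directly as Body; no temp file or read stream needed
  return s3.upload({
    Bucket: aws.bucket,
    Key: aws.key,
    Body: csvBuffer,
    ContentType: 'text/csv'
  }).promise()
}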
I actually solved it.
First, in the generateCsvToBuffer function, remember to pass a bookType in the xlsx.write options so the output is recognized correctly. The function should be something like this:
async generateCsvToBuffer(json){
const {aws} = this.config
var ws = xlsx.utils.json_to_sheet(json)
var wb = xlsx.utils.book_new();
await xlsx.utils.book_append_sheet(wb, ws, 'Systems')
const csvParsed = xlsx.write(wb, { type: 'buffer', bookType: 'csv'})
return csvParsed;
}
Second, you have to add ContentDisposition: 'attachment' to the uploadParams in the AWS configuration:
async uploadSample(file){
const {aws} = this.config
AWS.config.update({
secretAccessKey: aws.secretAccessKey,
accessKeyId: aws.accessKeyId,
region: 'us-east-2'
})
const bufferObject = new Buffer.from(JSON.stringify(file))
/*** WE NEED THE FILE SYSTEM IN ORDER TO STORE */
const fileStream = fs.createReadStream(bufferObject)
const uploadParams = {Bucket: aws.bucket, Key: aws.key, Body: fileStream.path, ContentDisposition: 'attachment'}
const s3 = new AWS.S3()
await s3.upload(uploadParams,null,function(error, file){
if(error){
console.log(error)
} else {
console.log('Successfully uploaded')
}
})
}

Busboy Save Stream For Use Later

I'm trying to use busboy to allow clients to upload files to my Express web server.
I have the following middleware function I'm running for Express.
module.exports = (req, res, next) => {
req.files = {};
let busboy;
try {
busboy = new Busboy({
headers: req.headers
});
} catch (e) {
return next();
}
busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
req.files[fieldname] = {
file,
filename,
encoding,
mimetype
};
// Need to call `file.resume` to consume the stream somehow (https://stackoverflow.com/a/24588458/894067)
file.resume();
});
busboy.on("finish", next);
req.pipe(busboy);
};
As you can see, I had to add file.resume(); so that the "finish" event would be triggered, and call the next function for the middleware (https://stackoverflow.com/a/24588458/894067).
The problem is, later on, when I want to consume the stream, it says readable: false. So I'm assuming the file.resume(); discards the stream and doesn't allow it to be used in the future.
I basically want to get all the uploaded files and information associated with those files, store them on the req.files object, then consume the streams later, or not consume them if I don't want to use it. That way they remain streams and don't take up much memory, until I'm ready to consume the stream and actually do something with it (or choose to discard it).
What can I use in place of file.resume(); to ensure that the "finish" event gets triggered, while allowing me to use the stream later on in the lifecycle of the request (the actual app.post routes, instead of middleware)?
The client might also upload multiple files. So I need any solution to handle multiple files.
Would it make any sense to pipe the input stream into a PassThrough stream, like this?
const Busboy = require('busboy')
const { PassThrough } = require('stream')
const multipart = (req, res, next) => {
req.files = new Map()
req.fields = new Map()
const busboy = new Busboy({ headers: req.headers })
busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
const stream = new PassThrough()
file.pipe(stream)
req.files.set(fieldname, { stream, filename, encoding, mimetype })
})
busboy.on(
'field',
(fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) => {
req.fields.set(fieldname, { val, encoding, mimetype })
}
)
busboy.on('error', (error) => {
next(error)
})
busboy.on('finish', () => {
next()
})
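// Note: req.rawBody is not set by plain Express; it exists in environments such as
// Cloud Functions (or when your body parser captures it). With plain Express you
// could use req.pipe(busboy) instead.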
busboy.end(req.rawBody)
}
If you want to handle multiple files in a single request, the procedure is a bit tricky.
Busboy goes through a single stream and fires events as files arrive (in sequence). You cannot get separate streams for all files at the same time with Busboy; this is not a limitation of the library, it is how HTTP works.
Your best option would be to store all files in temporary storage and keep the information for the next middlewares in res.locals:
const Busboy = require('busboy');
const path = require('path');
const fs = require('fs');
module.exports = (req, res, next) => {
res.locals.files = {};
// You need to ensure the directory exists
res.locals.someTemporaryDirectory = '/some/temp/dir/with/randomString/in/it';
let busboy;
try {
busboy = new Busboy({
headers: req.headers
});
} catch (e) {
return next(e);
}
busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
res.locals.files[fieldname + '_' + filename] = {
filename,
encoding,
mimetype
};
// I skipped error handling for the sake of simplicity. Cleanup phase will be required as well
const tempFilePath = path.join(res.locals.someTemporaryDirectory, fieldname + '_' + filename);
file.pipe(fs.createWriteStream(tempFilePath));
});
busboy.on("finish", next);
req.pipe(busboy);
};
The next middleware can then use res.locals.someTemporaryDirectory and res.locals.files to do its work (a clean-up phase will be required as well); a sketch follows below.
This solution may seem sub-optimal, but HTTP is what it is. You may want to issue a separate HTTP request for each file instead, but I would not recommend it, as you would run into a bunch of other issues (such as synchronizing all the requests and managing memory).
Whatever the solution is, it requires getting your hands dirty.
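A minimal, hedged sketch of what that next middleware could look like (the res.locals names come from the middleware above; multipartMiddleware and the route are illustrative, and fs.rmSync needs Node 14.14+):
const fs = require('fs');
const path = require('path');

app.post('/upload', multipartMiddleware, (req, res, next) => {
  try {
    for (const [key, meta] of Object.entries(res.locals.files)) {
      const tempFilePath = path.join(res.locals.someTemporaryDirectory, key);
      // consume the file here, e.g. fs.createReadStream(tempFilePath).pipe(...)
      console.log(`received ${meta.filename} (${meta.mimetype}) at ${tempFilePath}`);
    }
    res.sendStatus(200);
  } catch (err) {
    next(err);
  } finally {
    // clean-up phase: remove the temporary directory and its contents
    fs.rmSync(res.locals.someTemporaryDirectory, { recursive: true, force: true });
  }
});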

Upload Image from Google Cloud Function to Cloud Storage

I'm attempting to handle file uploads using a Google Cloud Function. This function uses Busboy to parse the multipart form data and then upload to Google Cloud Storage.
I keep receiving the same error when triggering the function: ERROR: { Error: ENOENT: no such file or directory, open '/tmp/xxx.png'.
The error seems to occur within the finish callback function when storage.bucket.upload(file) attempts to open the file path /tmp/xxx.png.
Note that I can't generate a signed upload URL as suggested in this question, since the application invoking this is an external, non-user application. I also can't upload directly to GCS, since I'll need to build custom filenames based on some request metadata. Should I just be using Google App Engine instead?
Function code:
const path = require('path');
const os = require('os');
const fs = require('fs');
const Busboy = require('busboy');
const Storage = require('@google-cloud/storage');
const _ = require('lodash');
const projectId = 'xxx';
const bucketName = 'xxx';
const storage = new Storage({
projectId: projectId,
});
exports.uploadFile = (req, res) => {
if (req.method === 'POST') {
const busboy = new Busboy({ headers: req.headers });
const uploads = []
const tmpdir = os.tmpdir();
busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
const filepath = path.join(tmpdir, filename)
var obj = {
path: filepath,
name: filename
}
uploads.push(obj);
var writeStream = fs.createWriteStream(obj.path);
file.pipe(writeStream);
});
busboy.on('finish', () => {
_.forEach(uploads, function(file) {
storage
.bucket(bucketName)
.upload(file.path, {name: file.name})
.then(() => {
console.log(`${file.name} uploaded to ${bucketName}.`);
})
.catch(err => {
console.error('ERROR:', err);
});
fs.unlinkSync(file.path);
})
res.end()
});
busboy.end(req.rawBody);
} else {
res.status(405).end();
}
}
I eventually gave up on using Busboy. The latest versions of Google Cloud Functions support both Python and Node 8. In Node 8, I just put everything into async/await functions and it works fine.
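For reference, a likely contributor to the ENOENT in the question's code is that fs.unlinkSync runs right after the upload is started (the forEach does not wait for the promise), so the temp file is deleted while it is still being read. A hedged sketch of an async/await version of the finish handler that waits for each upload before deleting (note this keeps Busboy, unlike the answer above):
// Not the answer's actual code: await each upload before removing its temp file,
// so nothing is deleted mid-upload.
busboy.on('finish', async () => {
  try {
    for (const file of uploads) {
      await storage.bucket(bucketName).upload(file.path, { name: file.name });
      console.log(`${file.name} uploaded to ${bucketName}.`);
      fs.unlinkSync(file.path);
    }
    res.end();
  } catch (err) {
    console.error('ERROR:', err);
    res.status(500).end();
  }
});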

Create and Send Zip file -NODE JS

I'm trying to create and then send a zip file to the client. I know how to create it, but I've got a problem with sending it to the client. I've tried many ways.
I'm sending a POST request from the client and as the response I want to send a file.
This is my server-side example code:
var fs = require('fs');
var Zip = require('node-zip');
router.post('/generator', function(req, res, next) {
var zip = new Zip;
zip.file('hello.txt', 'Hello, World!');
var options = {base64: false, compression:'DEFLATE'};
fs.writeFile('test1.zip', zip.generate(options), 'binary', function (error) {
console.log('wrote test1.zip', error);
});
res.setHeader('Content-disposition', 'attachment; filename=test1.zip');
res.download('test1.zip');
});
I also tried something like this:
res.setHeader('Content-disposition', 'attachment; filename=' + filename);
res.setHeader('Content-type', mimetype);
var filestream = fs.createReadStream(file);
filestream.pipe(res);
I tried to use such libraries as:
node-zip
archiver
Can anyone explain to me how to do that?
This module works fine too: https://www.npmjs.com/package/adm-zip
Example without creating a temporary zip file on the server:
var AdmZip = require('adm-zip');
router.get('/zipFilesAndSend', function(req, res) {
var zip = new AdmZip();
// add local file
zip.addLocalFile("./uploads/29/0046.xml");
// get everything as a buffer
var zipFileContents = zip.toBuffer();
const fileName = 'uploads.zip';
const fileType = 'application/zip';
res.writeHead(200, {
'Content-Disposition': `attachment; filename="${fileName}"`,
'Content-Type': fileType,
})
return res.end(zipFileContents);
});
Try this express-easy-zip npm package to generate a zip file from a local folder path and send it as a download to the client.
var zip = require('express-easy-zip');
var app = require('express')();
app.use(zip());
app.get('my-route/zip', async function(req, res) {
var dirPath = __dirname + "/uploads";
await res.zip({
files: [{
path: dirPath,
name: 'Package'
}],
filename: 'Package.zip'
});
});
I haven't worked with node-zip or archiver before (I usually just use the built-in zlib module), but one thing I noticed right away is that you should place res.download inside the callback of writeFile. That way it will only send the file once it has been fully written to disk.
fs.writeFile('test1.zip', zip.generate(options), 'binary', function (error) {
res.download('test1.zip');
});
I hope this solution works for you; if it doesn't, feel free to comment.
Also, I think res.download sets the Content-disposition header for you, you don't need to set it manually. Not 100% sure on that one though.
The above solutions work (they generate a zip and send it to the front end as data in the response; to make it an actual download, the following code will work). I was using express-zip. It compresses files and sends the data from the back end (Node) to the front end, but on the front end I was only getting data in the response. In my case I wanted the user to be able to download the zip sent by the server. To solve this I followed the approach below. To open the download window in the browser I used downloadjs (another approach is possible, but I find this easy).
Front-End
const download = require('downloadjs')
return axios({
url:process.env.API_HOST+'/getuploadedfiles',
method:'get',
headers:{
'Content-Type': 'multipart/form-data',
},
withCredentials: true, // axios option, not a header
responseType:'arraybuffer' // without this we can't get the data in the desired format
})
.then(async response => {
console.log("got all files from api");
let blob = new Blob([response.data], { type: 'application/zip' }) // optional
download(response.data, "attachment.zip", "application/zip") // third-party helper; prompts the download dialog in the browser
return response.data;
})
Back-End
const zip = require('express-zip');
app.use('/getuploadedfiles',function(req,res){
res.zip([
{path:'/path/to/file/file2.PNG',name:'bond.png'},
{path:'/path/to/file/file1.PNG',name:'james.png'}
])
})
