How to receive a file from Node.js with axios - javascript

I need axios to get a PDF buffer from Node.js, along with some additional data.
If I set the axios post with responseType: "arraybuffer" and make Node send only the PDF buffer, it works.
But I need to return more data from Node, and if I get rid of responseType: "arraybuffer" in order to receive JSON, I am unable to convert pdfBuffer to a PDF with the Blob constructor.
I get an invalid blob.
What am I doing wrong?
This is not working:
//front end:
const resp = await axios.post("atestadosClientes/generatePDF", data);
const file = new Blob([resp.data.pdfBuffer], {
  type: "application/pdf",
});
//Build a URL from the file
const fileURL = URL.createObjectURL(file);

//Node response: generate pdf buffer
return res.send({
  message: "Success",
  id: resp[0].insertId,
  pdfBuffer: pdfSignedBuffer, //pdf buffer
});
This is working:
//front end:
const resp = await axios.post("atestadosClientes/generatePDF", data, {
  responseType: "arraybuffer",
});
const file = new Blob([resp.data], {
  type: "application/pdf",
});
//Build a URL from the file
const fileURL = URL.createObjectURL(file);

//Node response: generate pdf buffer
return res.send(pdfBuffer);
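A common way to return both the JSON metadata and the PDF in one response is to base64-encode the buffer on the server and decode it on the client, since a raw Buffer does not survive JSON serialization (it arrives as { type: 'Buffer', data: [...] }, which is not valid Blob input). A minimal sketch of that approach (the pdfBase64 field name is illustrative, not from the original code):

//Node: encode the buffer so it survives JSON
return res.send({
  message: "Success",
  id: resp[0].insertId,
  pdfBase64: pdfSignedBuffer.toString("base64"),
});

//front end: decode base64 back into bytes before building the Blob
const resp = await axios.post("atestadosClientes/generatePDF", data);
const bytes = Uint8Array.from(atob(resp.data.pdfBase64), (c) => c.charCodeAt(0));
const file = new Blob([bytes], { type: "application/pdf" });
const fileURL = URL.createObjectURL(file);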

Related

Send binary response from Uint8Array in Express.js

I am using Express.js with TypeScript and I would like to send a Uint8Array as binary data.
This is what I use so far, and it works, but I would like to avoid saving the file first, because I think it wastes performance:
const filePath = path.resolve(__dirname, 'template.docx');
const template = fs.readFileSync(filePath);
const buffer: Uint8Array = await createReport({
  template,
  data: {
    productCode: data.productCode,
  },
});
fs.writeFileSync(path.resolve(__dirname, 'output.docx'), buffer);
res.sendFile(path.resolve(__dirname, 'output.docx'));
I am using docx-templates to generate the file by the way.
You can use a PassThrough stream for this purpose; it'll keep the file in memory with no need to write to disk.
Something like this should do it:
const stream = require("stream");
const readStream = new stream.PassThrough();
// Pass your output.docx buffer to this
readStream.end(buffer);

res.set("Content-disposition", "attachment; filename=output.docx");
res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document");
readStream.pipe(res);
The complete node.js code:
const fs = require("fs");
const express = require("express");
const stream = require("stream");

const port = 8000;
const app = express();

app.get('/download-file', (req, res) => {
  const buffer = fs.readFileSync("./test.docx");
  console.log("/download-file: Buffer length:", buffer.length);
  const readStream = new stream.PassThrough();
  readStream.end(buffer);
  res.set("Content-disposition", "attachment; filename=test.docx");
  res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document");
  readStream.pipe(res);
});

app.listen(port);
console.log(`Serving at http://localhost:${port}`);
To test, add a 'test.docx' file to the same directory, then point your browser to http://localhost:8000/download-file
Terry,
Thanks for updating your answer and providing the full code. However, it still does not help much. I am trying to understand how to handle this on the front-end side, in my case in Vue. Here is the code:
router.post('/chart/word', async (req, res, next) => {
  try {
    if (!req.body.chartImage) throw new BadRequest('Missing the chart image from the request body')

    const wordTemplate = await s3GetFile('folder', 'chart-templates-export/charts-template.docx')
    const template = wordTemplate.Body
    const buffer = await createReport({
      cmdDelimiter: ["{", "}"],
      template,
      additionalJsContext: {
        chart: () => {
          const dataUrl = req.body.chartImage.src
          const data = dataUrl.slice("data:image/jpeg;base64,".length);
          return { width: 18, height: 12, data, extension: '.jpeg' }
        }
      }
    })

    const stream = require('stream')
    const readStream = new stream.PassThrough()
    readStream.end(buffer)
    res.set("Content-disposition", 'attachment; filename=' + "output.docx")
    res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document")
    readStream.pipe(res)
  } catch (err) {
    console.log(err)
    next(err)
  }
})
And here is my Vue code. I tested various things, but nothing worked:
async exportCharts() {
  console.log('this.$refs.test: ', this.$refs.test)
  let img = {
    src: this.$refs.test.getDataURL({
      type: 'jpeg',
      pixelRatio: window.devicePixelRatio || 1,
      backgroundColor: '#fff'
    }),
    width: this.$refs.test.getWidth(),
    height: this.$refs.test.getHeight()
  }
  const answersReq = await this.axios({
    method: 'post',
    url: '/pollAnswers/chart/word',
    data: {
      chartImage: img
    },
    responseType: 'arraybuffer' // 'blob' // 'document'
  })
  console.log('answersReq: ', answersReq)
  if (answersReq.data) {
    downloadURL(answersReq.data, 'report.docx')
  }
}
What I am basically doing is sending an image to the API (taken from an HTML vue-echarts element), then inserting it into a docx template using the docx-templates library, which returns a Uint8Array that I want to export as a new Word document with the populated chart. The user (on the UI) should then be able to choose the destination.
Here is the code for the download URL:
export function downloadURL(data, fileName) {
  const mimeType = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
  const blob = new Blob([data], { type: mimeType })
  const url = URL.createObjectURL(blob)
  const element = document.createElement('a')
  element.href = url
  element.download = fileName
  element.style.display = 'none'
  document.body.appendChild(element)
  element.click()
  URL.revokeObjectURL(element.href)
  document.body.removeChild(element)
}
P.S. Just to mention: if I directly save the buffer (the Uint8Array returned from createReport) in the API, it works; the file is downloaded successfully and I can read it without any problems - it contains the correct chart.
UPDATE:
I figured it out, but I am not sure why it is necessary and why it works this way and not the other. In the /chart/word endpoint, I convert the Uint8Array buffer into a stream, then pass it as the response (the same way you did). On the Vue side, I fetch it with responseType: 'arraybuffer', which turns the streamed response back into a buffer, and then the same download method works. Initially, I tried to send the buffer directly (without converting it to a stream as you suggested), but then on the front end the response arrived as an object containing the Uint8Array's byte values, which was not what I expected, and I could not create a valid docx file from it. So, for some reason, the buffer has to be converted to a stream in the API before being sent as the response; on the front end, I then have to read it back as an arraybuffer to make the docx download work.
If you can explain why it works like that, I will be very happy.
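A likely explanation (an assumption based on how Express and axios behave, not confirmed in the thread): createReport returns a Uint8Array, and Express's res.send() JSON-serializes any body that is not a string or a Node Buffer, which is why the front end received a plain object full of byte values. Piping through a stream sends raw bytes instead, and responseType: 'arraybuffer' tells axios to hand those bytes back untouched rather than trying to parse them as JSON. Wrapping the Uint8Array in a Buffer should work as well, without the stream:

// Sketch: a Node Buffer is sent as raw bytes, so no PassThrough is needed
res.set("Content-disposition", "attachment; filename=output.docx")
res.set("Content-Type", "application/vnd.openxmlformats-officedocument.wordprocessingml.document")
res.send(Buffer.from(buffer))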

"Unsupported body payload object" when trying to upload to Amazon S3

I want to upload a file from my frontend to my Amazon S3 bucket (AWS).
I'm using dropzone, so I convert my file and send it to my backend.
In my backend the file looks like this:
{ fieldname: 'file',
  originalname: 'test.torrent',
  encoding: '7bit',
  mimetype: 'application/octet-stream',
  buffer: { type: 'Buffer', data: [Array] },
  size: 7449 }
and when I try to upload my file with my function:
var file = data.patientfile.file.buffer;
var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: file };
s3.upload(params, function (err, data) {
  if (err) {
    console.log("******************", err)
  } else {
    console.log("Successfully uploaded data to myBucket/myKey")
  }
});
I get this error:
Unsupported body payload object
Do you know how I can send my file?
I have tried to send it with putObject and get a similar error.
I think you might need to convert the file content (which in this case is probably data.patientfile.file.buffer) to binary:
var base64data = Buffer.from(data, 'binary'); // Buffer.from replaces the deprecated new Buffer()
so the params would be like:
var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: base64data };
Or if I'm mistaken and the buffer is already in binary, then you can try:
var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: data.patientfile.file.buffer};
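If the file reaches the backend through multer (the request shape in the question matches multer's file object), keeping it in memory and passing the buffer straight to S3 usually avoids this error, since req.file.buffer is then a real Node Buffer. A minimal sketch, assuming multer with memory storage and an existing s3 client, app, and myBucket:

const multer = require("multer");
const upload = multer({ storage: multer.memoryStorage() });

app.post("/upload", upload.single("file"), (req, res) => {
  const params = {
    Bucket: myBucket, // assumed to be defined elsewhere
    Key: req.file.originalname,
    Body: req.file.buffer, // a real Node Buffer, which s3.upload accepts
  };
  s3.upload(params, (err, data) => {
    if (err) return res.status(500).send(err);
    res.send({ location: data.Location });
  });
});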
This is my production code, which is working.
Please note the issue can happen at data1111.
But to give the full picture, I've added all the key parts of the working code below.
client:
// html
<input
  type="file"
  onChange={this.onFileChange}
  multiple
/>

// javascript
onFileChange = event => {
  const files = event.target.files;
  var file = files[0];
  var reader = new FileReader();
  reader.onloadend = function (e) {
    // save this data1111 and send to server
    let data1111 = e.target.result // reader.result // ----------------- data1111
  };
  reader.readAsBinaryString(file);
}
server:
// node.js / javascript
const response = await s3
  .upload({
    Bucket: s3Bucket, // bucket
    Key: s3Path, // folder/file
    // receiving at the server - data1111 - via request body (or other)
    Body: Buffer.from(req.body.data1111, "binary") // ----------------- data1111
  })
  .promise();
return response;
Getting the above code working took a full two days.
Hope this helps someone in the future.
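A side note on the client part above: readAsBinaryString is deprecated, and readAsArrayBuffer avoids the lossy binary-string round trip. A sketch of the equivalent reader (same flow, different encoding):

var reader = new FileReader();
reader.onloadend = function (e) {
  // e.target.result is an ArrayBuffer; send it as the request body,
  // then on the server use Buffer.from(arrayBuffer) instead of the 'binary' encoding
  let data1111 = e.target.result;
};
reader.readAsArrayBuffer(file);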
Implemented Glen k's answer with Node.js; it worked for me:
const AWS = require('aws-sdk');
const s3 = new AWS.S3({
  accessKeyId: process.env.AWSAccessKeyID,
  secretAccessKey: process.env.AWSSecretAccessKey,
});

let base64data = Buffer.from(file.productImg.data, 'binary')
const params = {
  Bucket: BUCKET_NAME,
  Key: KEY,
  Body: base64data
}
s3.upload(params, function (err, data) {
  if (err) {
    console.log(err)
    throw err;
  }
  console.log(data)
  console.log(`File uploaded successfully. ${data.Location}`);
})

S3 isn't uploading the file and I'm getting a SignatureDoesNotMatch error

I'm trying to add images to my S3 bucket in AWS, but it doesn't seem to work. I get a SignatureDoesNotMatch error.
Here's how I'm uploading the file/image:
FrontEnd
const file = e.target.files[0];
const fileParts = file.name.split('.');
const fileName = fileParts[0];
const fileType = fileParts[1];
const response = axios.post('api/aws/sign_s3', { fileName, fileType });
Backend
router.post('/sign_s3', async (req, res) => {
  aws.config.update({
    accessKeyId: config.aws.accessKey,
    secretAccessKey: config.aws.secretKey,
    region: 'us-west-1'
  });
  const s3 = new aws.S3(); // Create a new instance of S3
  const fileName = req.body.fileName;
  const fileType = req.body.fileType;
  const s3Params = {
    Bucket: config.aws.bucketName,
    Key: fileName,
    Expires: 500,
    ContentType: fileType,
    ACL: 'public-read'
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) return res.send(err);
    const returnData = {
      signedRequest: data,
      url: `https://sim-to-do.s3.amazonaws.com/${fileName}`
    };
    res.json({ success: true, responseData: returnData });
  });
});
I get two URLs. When I go to the first one, I get the following error code:
SignatureDoesNotMatch
Error Message:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
What am I doing wrong? What's the correct way of uploading a file to AWS S3?
I was able to fix this issue after removing the Content-Type from the headers.
If you get "Signature does not match", it's highly likely you used a wrong secret access key. Can you double-check access key and secret access key to make sure they're correct?
(from awendt's answer)
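One more cause worth checking (an assumption, not confirmed in this thread): when the URL is signed with ContentType in s3Params, the eventual PUT must send exactly that Content-Type header, otherwise S3 computes a different signature and rejects the request. A sketch of a matching client-side upload, where signedRequest is the value the backend returns as responseData.signedRequest:

// The Content-Type here must match the fileType sent to /sign_s3
await axios.put(signedRequest, file, {
  headers: { 'Content-Type': fileType },
});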

Display a PDF sent from an Express server on the React side

I am sending a PDF file from an Express server like this:
app.get("/pdf", (req, res) => {
var file = fs.createReadStream('./public/file.pdf');
res.contentType("application/pdf");
res.send(file);
})
I want to display it on the React side (note: display the PDF, not download it).
I tried converting it into a blob, but it shows a blank PDF.
I am calling the function below on click on the React end:
viewHandler = async () => {
  axios.get('http://localhost:4000/pdf')
    .then(response => {
      console.log(response.data);
      //Create a Blob from the PDF Stream
      const file = new Blob(
        [response.data],
        { type: 'application/pdf' });
      //Build a URL from the file
      const fileURL = URL.createObjectURL(file);
      //Open the URL in a new window
      console.log("Asdasd");
      window.open(fileURL);
    })
    .catch(error => {
      console.log(error);
    });
};
I want to display the PDF without using a static URL to the PDF file on the server, like:
<object data='http://localhost:4000/file.pdf'
  type='application/pdf'
  width='100%'
  height='700px' />
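Two changes will likely fix the blank PDF (assumptions based on Express and axios behavior, not confirmed in the thread): res.send() does not understand streams, so the read stream should be piped into the response, and axios needs a binary responseType so the PDF bytes are not decoded as text before reaching the Blob:

//Server: pipe the stream; res.send() would serialize the stream object instead
app.get("/pdf", (req, res) => {
  res.contentType("application/pdf");
  fs.createReadStream('./public/file.pdf').pipe(res);
});

//Client: request the body as a blob so the bytes arrive intact
const response = await axios.get('http://localhost:4000/pdf', {
  responseType: 'blob',
});
const fileURL = URL.createObjectURL(
  new Blob([response.data], { type: 'application/pdf' })
);
window.open(fileURL);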

Using signed requests with AWS S3 and uploading photos?

So I have a React Native application that's kind of like Slack, and I'm trying to do image uploads to S3.
I went with the getSignedUrl route.
So the client picks a photo,
fetches a signed URL for the bucket,
then updates that user's URL on the server,
then makes a PUT request to the signed URL that was fetched.
It mostly works: the files get into the right bucket and they are photos, but:
A) the link makes me download the file instead of displaying it in the browser.
B) the file isn't an image... it's an XML file and can only be opened in Photoshop.
I've tried changing the type in data.append,
adding a header to the signed request,
adding x-amz- headers to the signed request,
hard-coding the file type on the server,
and converting the image to a base64 string with a native module, but it still comes out wrong.
Client-side call to the server:
uploadToServer() {
  // alert('coming soon!');
  //Go back to profile page
  this.props.navigation.goBack();
  //grab user from navigator params
  let user = this.props.navigation.state.params.user
  let pic = this.state.selected;
  // turn uri into base64
  NativeModules.ReadImageData.readImage(pic.uri, (image) => {
    console.log(image);
    var data = new FormData();
    data.append('picture', {
      uri: image,
      name: pic.filename,
      type: 'image/jpeg'
    });
    //get the signed Url for uploading
    axios.post(api.getPhotoUrl, {fileName: `${pic.filename}`}).then((res) => {
      console.log("get Photo URL response", res);
      //update the user with the new url
      axios.patch(api.fetchUserByID(user.id), {profileUrl: res.data.url}).then((resp) => {
        console.log("Update User response", resp.data);
      }).catch(err => errorHandler(err));
      //upload the photo using the signed request url given to me.
      //DO I NEED TO TURN DATA INTO A BLOB?
      fetch(res.data.signedRequest, {
        method: 'PUT',
        body: data
      }).then((response) => {
        console.log("UPLOAD PHOTO RESPONSE: ", response);
      }).catch(err => errorHandler(err))
    }).catch((err) => errorHandler(err))
  })
}
GET SIGNED URL logic from here on out:
router.post('/users/sign-s3', (req, res) => {
  const s3 = new aws.S3({signatureVersion: 'v4', region: 'us-east-2'});
  const fileName = `${req.user.id}-${req.body.fileName}`;
  const fileType = req.body.fileType;
  const s3Params = {
    Bucket: AWS_S3_BUCKET,
    Key: `images/${fileName}`,
    Expires: 60,
    ContentType: 'image/jpeg',
    ACL: 'public-read'
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) {
      console.log(err);
      return res.end();
    }
    const returnData = {
      signedRequest: data,
      url: `https://${AWS_S3_BUCKET}.s3.amazonaws.com/${s3Params.Key}`
    };
    res.write(JSON.stringify(returnData));
    res.end();
    return null;
  });
});
You need to change the content type so the browser can display the file inline instead of downloading it.
Refer to this and set the content type accordingly.
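For completeness, a common cause of this exact symptom (an assumption; the thread does not confirm it): PUT-ing a FormData body to a signed URL stores the multipart envelope rather than the image bytes, which is why the object is not a valid image. Sending the file contents directly as the body, with a Content-Type matching the ContentType used when signing, usually fixes both the download behavior and the file contents:

// Read the local file as a blob, then PUT the raw bytes to the signed URL
const blob = await (await fetch(pic.uri)).blob();
await fetch(res.data.signedRequest, {
  method: 'PUT',
  headers: { 'Content-Type': 'image/jpeg' }, // must match the signed ContentType
  body: blob, // raw image bytes, not FormData
});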
