Piping zip file from SailsJS backend to React Redux Frontend - javascript

I have a SailsJS backend where I generate a zip file that my frontend, a React app with Redux, requests. I'm using sagas for the async calls and fetch for the request. In the backend, I tried things like:
//zipFilename is the absolute path
res.attachment(zipFilename).send();
or
res.sendfile(zipFilename).send();
or
res.download(zipFilename).send();
or pipe the stream with:
const filestream = fs.createReadStream(zipFilename);
filestream.pipe(res);
on my Frontend i try to parse it with:
const parseJSON = (response) => {
  return response.clone().json().catch(() => response.text());
};
Everything I tried ends up with an empty zip file. Any suggestions?

There are various issues with the options that you tried out:
res.attachment will just set the Content-Type and Content-Disposition headers, but it will not actually send anything.
You can use this to set the headers properly, but you need to pipe the ZIP file into the response as well.
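A minimal sketch of that combination, assuming zipFilename from the question points to the finished archive on disk:
const fs = require('fs');

// Set Content-Type and Content-Disposition from the file name,
// then stream the archive into the response.
res.attachment(zipFilename);
fs.createReadStream(zipFilename).pipe(res);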
res.sendfile: You should not call .send() after this. From the official docs' examples:
app.get('/file/:name', function (req, res, next) {
  var options = { ... };
  var fileName = req.params.name;
  res.sendFile(fileName, options, function (err) {
    if (err) {
      next(err);
    } else {
      console.log('Sent:', fileName);
    }
  });
});
If the ZIP is properly built, this should work fine and set the proper Content-Type header as long as the file has the proper extension.
res.download: Same thing, you should not call .send() after this. From the official docs' examples:
res.download('/report-12345.pdf', 'report.pdf', function(err) { ... });
res.download will use res.sendfile to send the file as an attachment, thus setting both Content-Type and Content-Disposition headers.
However, you mention that the ZIP file is being sent but it is empty, so you should probably check if you are creating the ZIP file properly. As long as they are built properly and the extension is .zip, res.download should work fine.
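For example, a hedged sketch with the question's zipFilename (the second argument is only a suggested download name):
res.download(zipFilename, 'archive.zip', (err) => {
  if (err) {
    // Headers may already have been sent at this point, so just log it.
    console.error(err);
  }
});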
If you are building them on the fly, check this out:
This middleware will create a ZIP file with multiple files on the fly and send it as an attachment. It uses lazystream and archiver:
const fs = require('fs');
const lazystream = require('lazystream');
const archiver = require('archiver');
function middleware(req, res) {
  // Set the response's headers:
  // You can also use res.attachment(...) here.
  res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename=DOWNLOAD_NAME.zip',
  });

  // Files to add in the ZIP:
  const filesToZip = [
    'assets/file1',
    'assets/file2',
  ];

  // Create a new ZIP file:
  const zip = archiver('zip');

  // Set up some callbacks:
  zip.on('error', (err) => {
    // Replace with your own error handling.
    console.error(err);
    res.end();
  });
  zip.on('finish', function () {
    res.end(); // Send the response once the ZIP is finished.
  });

  // Pipe the ZIP output to res:
  zip.pipe(res);

  // Add files to the ZIP (lazystream defers opening each file until it is read):
  filesToZip.forEach((filename) => {
    zip.append(new lazystream.Readable(() => fs.createReadStream(filename)), {
      name: filename,
    });
  });

  // Finalize the ZIP. Compression will start and output will
  // be piped to res. Once the ZIP is finished, res.end() will be
  // called.
  zip.finalize();
}
You can build on this to cache the built ZIPs instead of building them on the fly every time, which is time- and resource-consuming and inadvisable for most use cases.
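On the React side, note that the parseJSON helper from the question will corrupt the binary body: a ZIP response should be read as a blob, not parsed as JSON or text. A minimal, hedged sketch of how the fetch call could handle it (the endpoint path and download name are placeholders):
fetch('/api/download-zip')
  .then((response) => {
    if (!response.ok) throw new Error('Download failed');
    return response.blob(); // read the raw bytes, do not call .json() or .text()
  })
  .then((blob) => {
    // Trigger a client-side download of the received ZIP.
    const url = window.URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = 'download.zip';
    document.body.appendChild(a);
    a.click();
    a.remove();
    window.URL.revokeObjectURL(url);
  });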

Related

How do I set/update AWS s3 object metadata during image upload PUT request to signed url?

I'm trying to include the name of the file that is uploaded to AWS S3 and given a random/unique name in a NextJS app.
I can set metadata from the backend, but I would like to update it from my put request (where the image is actually uploaded) to the signed URL. How would I do this?
To be clear: I would like to set metadata when I do a PUT request to the signed URL. I currently have it set to "none" on the backend to avoid forbidden errors (and this shows up as metadata in S3). Is it possible to update that metadata from my PUT request, or is there another approach I should take? Thanks!
// Backend code to get signed URL
async function handler(req, res) {
  if (req.method === 'GET') {
    const key = `content/${uuidv4()}.jpeg`;
    s3.getSignedUrl(
      'putObject',
      {
        Bucket: 'assets',
        ContentType: 'image/jpeg',
        Key: key,
        Expires: 5 * 60,
        Metadata: {
          'file-name': "none",
        }
      },
      (err, url) => res.send({ key, url })
    );
  }
}
// Frontend code
const [file, setFile] = useState(null);

const onFileUpload = async (e) => {
  e.preventDefault();
  const uploadConfig = await fetch('/api/upload');
  const uploadURL = await uploadConfig.json();
  await fetch(uploadURL.url, {
    body: file,
    method: 'PUT',
    headers: {
      'Content-Type': file.type,
      'x-amz-meta-file-name': 'updated test name',
    },
  });
};
It isn't possible to do that with presigned URLs. When you create the presigned URL, all properties are pre-filled; you can't change them when uploading the file. The only thing you can control is the object's data. The bucket, the key and the metadata (and all other parameters of put_object) are predefined. This is also the case for generate_presigned_post: all fields are prefilled.
This makes sense, as the backend grants the permissions and needs to decide on these. The implementation would also be much more complicated, as presigned URLs support all client methods, which have different parameters.
The only way you could do it is to generate URLs on demand: first generate the presigned URL based on the name selected by the client, and then do the upload. You will need two round-trips for every file, one to your server to generate the URL and one to S3 for the upload.
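A hedged sketch of that flow using the AWS SDK for JavaScript v2, as in the question (s3 and uuidv4 are assumed to be in scope as in the original code; the query parameter and route names are illustrative): the client sends the desired name, and the backend bakes it into the presigned URL's metadata.
// Backend: generate the URL on demand, using the name sent by the client.
async function handler(req, res) {
  if (req.method === 'GET') {
    const fileName = req.query.fileName || 'none'; // provided by the client
    const key = `content/${uuidv4()}.jpeg`;
    s3.getSignedUrl(
      'putObject',
      {
        Bucket: 'assets',
        ContentType: 'image/jpeg',
        Key: key,
        Expires: 5 * 60,
        Metadata: {
          'file-name': fileName, // now part of the signed request
        },
      },
      (err, url) => {
        if (err) return res.status(500).send({ error: err.message });
        res.send({ key, url });
      }
    );
  }
}

// Frontend (inside an async handler such as onFileUpload from the question):
// first round-trip fetches the URL, second uploads to S3. The metadata
// header must match what was signed, or S3 will reject the request.
const uploadConfig = await fetch(`/api/upload?fileName=${encodeURIComponent(file.name)}`);
const { url } = await uploadConfig.json();
await fetch(url, {
  method: 'PUT',
  body: file,
  headers: {
    'Content-Type': file.type,
    'x-amz-meta-file-name': file.name,
  },
});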

Downloading a zip file from a given path in express api + react

So I'm completely lost at this point. I have had a mixture of success and failure, but I can't for the life of me get this working. I'm building up a zip file and storing it in a folder structure based on uploadRequestIds, and that all works fine. I'm fairly new to Node, but all I want is to take the file that was built (which is completely valid and opens fine once it's been constructed on the backend) and send it on to the client.
const prepareRequestForDownload = (dirToStoreRequestData, requestId) => {
  const output = fs.createWriteStream(dirToStoreRequestData + `/Content-${requestId}.zip`);
  const zip = archiver('zip', { zlib: { level: 9 } });

  output.on('close', () => { console.log('archiver has been finalized.'); });
  zip.on('error', (err) => { throw err; });

  zip.pipe(output);
  zip.directory(dirToStoreRequestData, false);
  zip.finalize();
};
This is my function that builds up a zip file from all the files in a given directory and then stores it in that directory.
All I thought I would need to do is set some headers with an attachment disposition type, create a read stream of the zip file into the response, and then React would be able to save the content. But that just doesn't seem to be the case. How should this be handled on both sides: the API side, reading the zip and sending it, and the React side, receiving the response and auto-downloading or prompting the user to save the file?
There are a few strategies to resolve this. When a browser is redirected to a URL ending in .zip, it will normally start downloading. What you can do is return the download path to your client, something like:
http://api.site.com.br/my-file.zip
and then you can use:
window.open('URL here','_blank')
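If you would rather serve the file through the Express API instead of a public URL, a hedged sketch (reusing dirToStoreRequestData and the Content-<requestId>.zip naming from the question; the route path and app variable are assumptions) could be:
const path = require('path');

app.get('/api/requests/:requestId/download', (req, res, next) => {
  const zipPath = path.join(dirToStoreRequestData, `Content-${req.params.requestId}.zip`);
  // res.download sets Content-Disposition so the browser saves the file.
  res.download(zipPath, `Content-${req.params.requestId}.zip`, (err) => {
    if (err) next(err);
  });
});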

Electron upload with progress

I have an Electron app which is able to upload very big files to the server via HTTP in the renderer process, without user input. I decided to use axios as my HTTP client and it was able to report upload progress, but with this I met a few problems.
Browser JavaScript and Node.js aren't always "friendly" with each other. I used the fs.createReadStream function to get the file, but axios does not understand what a ReadStream object is and I can't pipe this stream into FormData, which I should place my file in (there are several topics on their GitHub issues tab, but nothing has been done about it so far).
I ended up using fs.readFileSync and then the form-data module with its getBuffer() method, but now my file is loaded entirely into memory before the upload, and with how big my files are it kills the Electron process.
Googling, I found out about the request library, which in fact is able to pipe a stream into a request, but it's deprecated, no longer supported, and apparently I can't get upload progress from it.
I'm running out of options. How do you upload files with Electron without user input (so without a file input) and without loading them into memory upfront?
P.S. On the form-data GitHub page there is a piece of code explaining how to upload a file stream with axios, but it doesn't work: nothing is sent, and downgrading the library as one issue suggested didn't help either...
const form = new FormData();
const stream = fs.createReadStream(PATH_TO_FILE);
form.append('image', stream);

// In a Node.js environment you need to set the boundary in the header field
// 'Content-Type' by calling the `getHeaders` method.
const formHeaders = form.getHeaders();

axios.post('http://example.com', form, {
  headers: {
    ...formHeaders,
  },
})
  .then(response => response)
  .catch(error => error);
I was able to solve this and I hope it will help anyone facing the same problem.
Since request is deprecated I looked for alternatives and found got for Node.js HTTP requests. It supports Stream, fs.ReadStream, etc.
You will need form-data as well; it allows you to put streams inside FormData and assign them to a key.
The following code solved my problem:
import fs from 'fs'
import got from 'got'
import FormData from 'form-data'

const stream = fs.createReadStream('some_path')

// NOT the native browser FormData
const formData = new FormData()
formData.append('file', stream, 'filename');

try {
  const res = await got.post('https://my_link.com/upload', {
    body: formData,
    headers: {
      ...formData.getHeaders() // sets the boundary and Content-Type header
    }
  }).on('uploadProgress', progress => {
    // here we get our upload progress; progress.percent is a float from 0 to 1
    console.log(Math.round(progress.percent * 100))
  });

  if (res.statusCode === 200) {
    // upload success
  } else {
    // error handler
  }
} catch (e) {
  console.log(e);
}
Works perfectly in Electron renderer process!

POST cutting off PDF data

I posted a question yesterday (linked here) where I had been trying to send a PDF to a database and then retrieve it at a later date. Since then I have been advised that it is best (in my case, as I cannot use cloud computing services) to upload the PDF files to local storage and save the URL of the file to the database instead. I have now begun implementing this, but I have come across some trouble.
I am currently using FileReader() as documented below to process the input file and send it to the server:
var input_file = "";
let reader = new FileReader();
reader.readAsText(document.getElementById("input_attachment").files[0]);
reader.onloadend = function () {
  input_file = "&file=" + reader.result;
  const body = /*all the rest of my data*/ + input_file;
  const method = {
    method: "POST",
    body: body,
    headers: {
      "Content-type": "application/x-www-form-urlencoded"
    }
  };
  // ...the fetch(url, method) call described below goes here
};
After this block of code I do the stock-standard fetch(), and a route on my server receives it. Almost all the data comes in 100% as expected, but the file gets cut off somewhere around 1300 characters in (making it quite an incomplete PDF). What does come in seems to match the first 1300 characters of the original PDF I uploaded.
I have seen suggestions that you are meant to use the "multipart/form-data" content type to upload files, but when I do this I seem to only receive the first 700 characters or so of my PDF. I have tried using the Multer middleware to handle "multipart/form-data", but it just doesn't seem to upload anything (though I can't guarantee that I am using it correctly).
I also initially had trouble with a "payload too large" fetch error, but have currently resolved it with this:
app.use(bodyParser.urlencoded({ limit: "50mb", extended: false, parameterLimit: 50000 }));
Though I have suspicions that this may not be correctly implemented, as I have seen some discussion that the urlencoded limit is set before the file loads and cannot be changed in the middle of the program.
Any and all help is greatly appreciated, and I will likely use any information here to construct an answer on my original question from yesterday so that anybody else facing these sorts of issues has a resource to go to.
I personally found the solution to this problem as follows. On the client side of my application, this is an example of what was implemented.
const formData = new FormData();
formData.append("username", "John Smith");
formData.append("fileToUpload", document.getElementById("input_attachment").files[0]);

const method = {
  method: "POST",
  body: formData
};

fetch(url, method)
  .then(res => res.json())
  .then(res => alert("File uploaded!"))
  .catch(err => alert(err.message));
As can be noted, I have changed from the "application/x-www-form-urlencoded" encoding to "multipart/form-data" to upload files. Node.js and Express, however, do not natively support this encoding type. I chose to use the library Formidable (I found it to be the easiest to use without too much overhead), which can be investigated here. Below is an example of my server-side implementation of this middleware (Formidable).
const express = require('express');
const formidable = require('formidable');

const app = express();

app.post('/upload', (req, res) => {
  const form = formidable({ uploadDir: `${__dirname}/file/`, keepExtensions: true });
  form.parse(req, (err, fields, files) => {
    if (err) {
      console.log(err.stack);
      return res.status(500).json({ error: err.message });
    }
    console.log(fields.username);
    // Respond so the client's res.json() call has something to parse.
    res.json({ message: "File uploaded!" });
  });
});
The file(s) are automatically uploaded to the directory specified in uploadDir, and keepExtensions ensures that the file extension is saved as well. The non-file inputs are accessible through the fields object, as seen in the fields.username example above.
From what I have found, this is the easiest way to set up a simple file upload system.
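If you also need the stored file's path or original name (for example, to save a URL to the database as described above), they are available on the files object mentioned earlier. A hedged sketch, assuming the fileToUpload field name from the client code; property names and shapes differ between Formidable versions, so treat this as illustrative:
// Inside the form.parse callback:
const uploaded = Array.isArray(files.fileToUpload) ? files.fileToUpload[0] : files.fileToUpload;
// Older Formidable versions expose .path/.name, newer ones .filepath/.originalFilename.
const storedPath = uploaded.filepath || uploaded.path;
const originalName = uploaded.originalFilename || uploaded.name;
console.log(`Stored ${originalName} at ${storedPath}`);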

How to return files saved on a file system with Node js and Multer to angular front-end?

I'm new to programming with Angular and Node.js, and I need to return files that are saved in a file system (handled by the backend in Node.js) to the frontend, to give the user the option to view or download them. To save them I used the Multer middleware, but to bring them back to the frontend I have not found an effective solution.
I tried using fs to create a buffer array, but it didn't work.
Does anyone know an effective solution?
In the request, parameters will be passed to identify which file to return, but for now I'm testing with a static file.
My request :
let headers: Headers = new Headers();
headers.append('Content-type', 'application/json');
headers.append('Authorization', token);

let link = `${URL_AuthAPI}/systemUsers/list`;
let body = JSON.stringify({ obj });
let option = new RequestOptions({ headers: headers });

return this.http.post(link, body, option).map((resposta: Response) => resposta);
Nodejs Server:
var filePath = path.join("/home", 'rg.png');
var stat = fileSystem.statSync(filePath);

res.writeHead(200, {
  'Content-Type': 'image/png',
  'Content-Length': stat.size,
  // 'Content-Disposition': 'attachment; filename=teste.png'
});

var readStream = fileSystem.createReadStream(filePath);
readStream.on('data', function (data) {
  res.write(data);
});
readStream.on('end', function () {
  res.end();
});
Component Code:
this.systemUsersService.listUsers(this.token, null).subscribe((apiResponse) => {
  var data = apiResponse['_body'];
  console.log(data);
}, (error: any) => {
});
If the files you want to allow the user to download are public, then the best option is to send (from your backend) an array of file URLs to the Angular application (in the case of images, to create the proper <img> elements on the frontend).
If you want to send the file using Node, you can read the file (fs.createReadStream) and send the proper headers before performing the "send". Take a look at Nodejs send file in response; it is a really good answer.
In the end, my personal recommendation is "don't send files using Node": you can use nginx to serve static content.
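For completeness, a hedged sketch of the streaming approach described above: set the headers, then pipe a read stream into the response. The path and content type mirror the static test file from the question; the route path and app variable are assumptions.
const fs = require('fs');
const path = require('path');

app.get('/systemUsers/file', (req, res) => {
  const filePath = path.join('/home', 'rg.png');
  const stat = fs.statSync(filePath);

  res.writeHead(200, {
    'Content-Type': 'image/png',
    'Content-Length': stat.size,
    // 'Content-Disposition': 'attachment; filename=teste.png' // uncomment to force a download
  });

  // Stream the file instead of loading it into memory.
  fs.createReadStream(filePath).pipe(res);
});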
