I'm completely lost at this point. I've had a mixture of success and failure, but I can't for the life of me get this working. I'm building up a zip file and storing it in a folder structure based on uploadRequestIds, and that all works fine. I'm fairly new to Node; all I want is to take the file that was built (which is completely valid and opens fine once it's been constructed in the backend) and send it on to the client.
const prepareRequestForDownload = (dirToStoreRequestData, requestId) => {
  // write the archive into the same directory that is being zipped
  const output = fs.createWriteStream(dirToStoreRequestData + `/Content-${requestId}.zip`);
  const zip = archiver('zip', { zlib: { level: 9 } });

  output.on('close', () => { console.log('archiver has been finalized.'); });
  zip.on('error', (err) => { throw err; });

  zip.pipe(output);
  zip.directory(dirToStoreRequestData, false); // add everything in the directory, without a nested root folder
  zip.finalize();
};
This is my function that builds a zip file from all the files in a given directory and then stores it in that same directory.
All I thought I'd need to do is set a Content-Disposition: attachment header, create a read stream of the zip file, pipe it into the response, and then React would be able to save the content. But that just doesn't seem to be the case. How should this be handled on both sides: the API reading the zip and sending it, and the React side receiving the response and auto-downloading the file (or prompting the user to save it)?
There are a few strategies to resolve this. When you redirect the browser to a URL ending in .zip, it will normally start downloading. What you can do is return the download path to your client, something like:
http://api.site.com.br/my-file.zip
and then you can use:
window.open('URL here','_blank')
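For the API side of the original question, here is a minimal sketch of a route that serves the already-built archive (assuming an Express app; the route path and the REQUESTS_DIR base folder are placeholders, not taken from the question):

const path = require('path');

// hypothetical route: GET /api/requests/:requestId/download
app.get('/api/requests/:requestId/download', (req, res) => {
  const { requestId } = req.params;
  // assumes prepareRequestForDownload already wrote the zip for this request
  const zipPath = path.join(REQUESTS_DIR, requestId, `Content-${requestId}.zip`);

  // res.download sets Content-Disposition: attachment and streams the file
  res.download(zipPath, `Content-${requestId}.zip`, (err) => {
    if (err) console.error('download failed:', err);
  });
});

Pointing window.open (or a plain link) at that URL then triggers the browser's save prompt.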
The Next.js/React application I'm working on uses Firebase's Cloud Storage to store .doc/.docx/.pdf files. I want to dynamically change the suggested file name in the browser document viewer on download; however, I can only get it to work sometimes. Since I want to keep the original file name unchanged in storage, I can't permanently change the metadata in Cloud Storage either.
I have found that requesting a signed URL from Cloud Storage and adding a responseDisposition property only works if the original file name doesn't include '.pdf' or '.docx' in the title.
Here is my server handler code that requests the signed URL and sends it back to the client:
const { firebaseInit } = require('../../firebase-admin-init');

const fetchResumeLink = async (req, res) => {
  const { documentPath, dynamicName } = req.body;
  const bucket = firebaseInit.storage().bucket();
  const file = bucket.file(documentPath);

  const today = new Date();
  const tomorrow = new Date();
  tomorrow.setDate(today.getDate() + 1);

  const config = {
    action: 'read',
    responseDisposition: `attachment; filename=Resume_for_${dynamicName}.pdf`,
    expires: tomorrow
  };

  file.getSignedUrl(config, (err, url) => {
    if (err) {
      console.error(err);
      res.status(500).send(err);
    } else {
      res.setHeader('Content-Type', 'application/pdf');
      res.status(200).send(url);
    }
  });
};
This method only works in Chrome if the original file is stored at a path like /bucket/folder/obj; if it is at /bucket/folder/obj.pdf, it doesn't seem to work anymore. In Firefox I ran across an instance where the tab displayed the correct file name, but when prompted to download the file, the suggested name reverted to the original one.
Does anyone know why this happens? Is there any way to get the browser document viewers not to ignore the Content-Disposition header?
I'm also open to any other methods of dynamically setting a file's saved name.
If a ContentDisposition header is set on the object, it overrides the response-content-disposition query parameter.
So my guess is you're using a library or tool that sets the ContentDisposition on the metadata when you upload objects with certain extensions (like PDF) known to need non-inline display.
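One way to confirm that guess is to inspect what is stored on the object itself with the Admin SDK. A rough sketch, placed inside the async fetchResumeLink handler from the question (clearing the metadata is shown commented out, since the asker wants to keep it untouched):

// inspect the object's own metadata
const [metadata] = await file.getMetadata();
console.log('stored contentDisposition:', metadata.contentDisposition);

// if contentDisposition is set here, it wins over responseDisposition on the signed URL;
// clearing it would let the query parameter apply, but that is a permanent change:
// await file.setMetadata({ contentDisposition: null });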
How can I send all the image files from a folder on my Node.js backend to my React client? I've created an upload site where each user can sign in and upload files. Whenever a user signs in, I want all of the files they have uploaded to be visible on the website.
res.sendFile didn't help; I found out that it cannot send multiple files in one response.
So far only a single file is sent and visible on the client side (the console log shows all the files).
Node.js:
function getFiles(dir, files_) {
  files_ = files_ || [];
  var files = fs.readdirSync(dir);
  for (var i in files) {
    var name = dir + '/' + files[i];
    if (fs.statSync(name).isDirectory()) {
      getFiles(name, files_);
    } else {
      files_.push(name);
    }
  }
  return files_;
}
app.get('/loadit', verifyToken, (req, res) => {
  var loadFiles = getFiles(__dirname + '/data/');
  jwt.verify(req.token, 'secretkey', (err, decoded) => {
    if (err) {
      res.sendStatus(403);
    } else {
      loadFiles.map((data1) => {
        console.log(data1);
        return res.sendFile(data1);
      });
    }
  });
});
Is there a different approach for doing this? I also thought about sending all the image links as a JSON list to the frontend (React) and then requesting each image from my Node.js server, but I don't know if that is a good idea at all.
To download all images at once, you can generate an archive file (e.g. a zip) on the fly at the server, for example with https://github.com/archiverjs/node-zip-stream.
If you want to show all the images, add one API endpoint that returns the list of filenames and another that returns a specific image file.
You can never send more than one file per response. If you want to send all of a user's images, return a JSON array of their image names and have the frontend fetch them one by one, as sketched below.
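A minimal sketch of that list-then-fetch approach, reusing the question's getFiles helper and /data folder (the /images/:name route is hypothetical, and this assumes /data has no subfolders):

const path = require('path'); // needed for basename/join below

// 1) return the list of the user's files as JSON
app.get('/loadit', verifyToken, (req, res) => {
  jwt.verify(req.token, 'secretkey', (err) => {
    if (err) return res.sendStatus(403);
    const names = getFiles(__dirname + '/data/').map((p) => path.basename(p));
    res.json(names);
  });
});

// 2) return one image by name
app.get('/images/:name', (req, res) => {
  res.sendFile(path.join(__dirname, 'data', req.params.name));
});

On the React side, render each name as an <img> whose src points at /images/<name>; the browser then fetches the images one by one.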
I use the following code to get files from the file system.
The code is from a blog post called Building a File Uploader with NodeJs.
I was able to see the UI, etc., when I ran my project.
I cannot use the following code as-is, since I don't have an uploads folder in my project (form.uploadDir):
app.post('/upload', function(req, res) {
  // create an incoming form object
  var form = new formidable.IncomingForm();

  // allow the user to upload multiple files in a single request
  form.multiples = true;

  // store all uploads in the /uploads directory - cannot use it
  //form.uploadDir = path.join(__dirname, '/uploads');

  // every time a file has been uploaded successfully,
  // rename it to its original name
  form.on('file', function(field, file) {
    fs.rename(file.path, path.join(form.uploadDir, file.name));
  });

  // log any errors that occur
  form.on('error', function(err) {
    console.log('An error has occurred: \n' + err);
  });

  // once all the files have been uploaded, send a response to the client
  form.on('end', function() {
    res.end('success');
  });

  // parse the incoming request containing the form data
  form.parse(req);
});
My question is: how should I get the file from the UI with the code above? I need to get the file content from the form when I choose a file.
The application is deployed to the cloud; when I run it on localhost I use the following code, which works:
const readStream = fs.createReadStream("./file.html");
...
const writeStream = sftp.createWriteStream("/index.html");
...
readStream.pipe(writeStream);
This reads a file from the file system at the correct path and overwrites the remote file (here index.html) with it.
As @aedneo said, when the user chooses a file, the file is created in the upload folder. I just needed to rename it and give the right path to the write-stream method, and it works! A rough sketch of what that looks like is below.
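A minimal sketch of that fix, assuming formidable's v1-style file object (file.path, file.name) and the same sftp.createWriteStream helper shown in the question:

form.on('file', function(field, file) {
  // file.path is the temporary location formidable wrote the upload to
  var readStream = fs.createReadStream(file.path);
  // stream it to the remote destination instead of renaming it on local disk
  var writeStream = sftp.createWriteStream('/' + file.name);
  readStream.pipe(writeStream);
});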
Hello, I am trying to send a file from my Express server to my front end so the user can download it when they click a button. I've created a route on the backend that sends over the byte array, but I don't have the slightest idea how to let the user download it.
What I'm trying to do is allow one user to upload a file directly to my server, and then another user comes along later and downloads the file that was uploaded on a previous date.
I've gotten the upload part to work fine; now I'm just missing direction for the download part.
Here is my Express route being hit:
const path = require('path');

const getFile = (req, res) => {
  const filePath = path.join(__dirname, '../../../', req.query.file);
  console.log(filePath);
  res.download(filePath);
};
Here is the onClick handler in my front-end code:
download(file) {
  axios.get('/api/download/getFile', {
    params: {
      file,
    },
  })
  .then(data => {
    console.log(data);
    window.open(data.data);
  });
}
And this is the error I keep getting when I click the button:
Unable to open a window with invalid URL
That makes sense to me, because I'm not getting a URL back; I'm getting a byte array.
Try this -
window.open('/api/download/getFile?file=yourfile');
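If you'd rather keep the axios call instead of navigating to the URL, another common approach (a sketch, not from the original answer) is to request the file as a blob and trigger the save from an object URL:

download(file) {
  axios.get('/api/download/getFile', {
    params: { file },
    responseType: 'blob', // receive the body as a Blob instead of text
  })
  .then((res) => {
    const url = window.URL.createObjectURL(res.data);
    const link = document.createElement('a');
    link.href = url;
    link.download = file; // suggested save name
    document.body.appendChild(link);
    link.click();
    link.remove();
    window.URL.revokeObjectURL(url);
  });
}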
I'm getting ready to move a Node.js application I've been working on into a production environment (I'm using Heroku). Users add images to the site via URL. Right now the images are just saved on the server; I'd like to move the storage to S3, but I'm having some difficulties.
The idea is to save the image to disk first and then upload it, but I'm struggling to find a way to be notified when the file has finished writing to the disk. It seems like it doesn't use the typical node callback style.
Here is the code as it is now. I'm using the request node module which may be complicating things rather than simplifying them:
requestModule(request.payload.url).pipe(fs.createWriteStream("./public/img/submittedContent/" + name));

// what do I do here?

fs.readFile("./public/img/submittedContent/" + name, function(err, data){
  if (err) { console.warn(err); }
  else {
    s3.putObject({
      Bucket: "submitted_images",
      Key: name,
      Body: data
    }).done(function(s3response){
      console.log("success!");
      reply({message:'success'});
    }).fail(function(s3response){
      console.log("failure");
      reply({message:'failure'});
    });
  }
});
Any advice would be appreciated. Thanks!
Try listening for the finish event on the writable stream:
requestModule(request.payload.url)
  .pipe(fs.createWriteStream("./public/img/submittedContent/" + name))
  .on('finish', function(){
    // do stuff with saved file
  });
Unless you're modifying the image, you shouldn't route the upload through Heroku at all, but rather upload directly to S3.
See Direct to S3 File Uploads in Node.js.
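That approach boils down to the server handing the browser a short-lived signed URL and the browser PUTting the file straight to S3, so nothing large ever passes through the dyno. A rough sketch of the server half, assuming the AWS SDK v2, an Express app, and the bucket name from the question (the /sign-s3 route is hypothetical):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// the client asks for a signed upload URL for a given object name and content type
app.get('/sign-s3', (req, res) => {
  const params = {
    Bucket: 'submitted_images', // bucket name from the question
    Key: req.query.name,        // object key chosen by the client
    Expires: 60,                // URL is valid for 60 seconds
    ContentType: req.query.type
  };

  s3.getSignedUrl('putObject', params, (err, signedUrl) => {
    if (err) return res.status(500).send(err);
    res.json({ signedUrl });    // the browser then PUTs the file body to signedUrl
  });
});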