How can I send all the image files in a folder on my Node.js backend to my React client? I have built an upload site where each user can sign in and upload their files, and whenever a user signs in I want every file they have uploaded to be visible on the site.
res.sendFile didn't help; I found out that it cannot send multiple files in a single response.
So far only one file reaches the client, even though the console log lists all of them.
Nodejs:
// Recursively collect all file paths under `dir` (synchronous walk)
function getFiles(dir, files_) {
  files_ = files_ || [];
  var files = fs.readdirSync(dir);
  for (var i in files) {
    var name = dir + '/' + files[i];
    if (fs.statSync(name).isDirectory()) {
      getFiles(name, files_);
    } else {
      files_.push(name);
    }
  }
  return files_;
}
app.get('/loadit', verifyToken, (req, res) => {
  var loadFiles = getFiles(__dirname + '/data');
  jwt.verify(req.token, 'secretkey', (err, decoded) => {
    if (err) {
      res.sendStatus(403);
    } else {
      loadFiles.map((data1) => {
        console.log(data1);
        return res.sendFile(data1); // only one response can be sent; later calls fail
      });
    }
  });
});
Is there a different approach for this task? I also thought about sending all the image links as a JSON list to the React frontend and then requesting each image from my Node.js server, but I don't know whether that is a good idea.
You can generate an archive (e.g. a zip) on the fly on the server to download all the images at once, for example with https://github.com/archiverjs/node-zip-stream.
If you want to show all the images, add an API to get a list of filenames and another one to get a specific image file.
You can never send more than one file per response. If you want to deliver all of a user's images, send a JSON array with their filenames and have the frontend fetch them one by one.
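A minimal sketch of that approach, reusing the getFiles helper and auth check from the question (the /images route, its name query parameter, and the path require are assumptions on my part, not part of the original code):
const path = require('path');

// 1) Return the signed-in user's file names as JSON instead of the files themselves
app.get('/loadit', verifyToken, (req, res) => {
  jwt.verify(req.token, 'secretkey', (err, decoded) => {
    if (err) return res.sendStatus(403);
    const names = getFiles(__dirname + '/data').map((p) => path.basename(p));
    res.json(names);
  });
});

// 2) Serve a single image; the client asks for /images?name=<one entry from the list>
app.get('/images', verifyToken, (req, res) => {
  jwt.verify(req.token, 'secretkey', (err, decoded) => {
    if (err) return res.sendStatus(403);
    res.sendFile(path.join(__dirname, 'data', path.basename(req.query.name)));
  });
});
The React side can then render one <img> (or make one fetch request) per entry in the JSON list, pointing at the /images route.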
So I'm completely lost at this point. I've had a mixture of success and failure, but I can't for the life of me get this working. I'm building up a zip file and storing it in a folder structure based on uploadRequestIds, and that part works fine; the file is completely valid and opens correctly once it has been constructed on the backend. I'm fairly new to Node, and all I want is to take that zip file and send it on to the client.
const fs = require('fs');
const archiver = require('archiver');

// Zip everything in dirToStoreRequestData into Content-<requestId>.zip in that same directory
const prepareRequestForDownload = (dirToStoreRequestData, requestId) => {
  const output = fs.createWriteStream(dirToStoreRequestData + `/Content-${requestId}.zip`);
  const zip = archiver('zip', { zlib: { level: 9 } });
  output.on('close', () => { console.log('archiver has been finalized.'); });
  zip.on('error', (err) => { throw err; });
  zip.pipe(output);
  zip.directory(dirToStoreRequestData, false);
  zip.finalize();
};
This is my function that builds a zip from all the files in a given directory and then stores the archive in that same directory.
All I thought I would need to do is set a Content-Disposition: attachment header and pipe a read stream of the zip into the response, and React would then be able to save the content, but that doesn't seem to be the case. How should this be handled on the API side (reading the zip and sending it) and on the React side (receiving the response and auto-downloading / prompting the user to save the file)?
There are a few strategies for this. When you redirect the browser to a URL ending in .zip, it will normally start downloading. So one option is to return the download path to your client, something like:
http://api.site.com.br/my-file.zip
and then you can use:
window.open('URL here','_blank')
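On the server side, a rough sketch of the route that URL could point to (the route path, the requestId parameter, and the path require are assumptions; dirToStoreRequestData is assumed to be resolvable on the server, and the route simply serves the zip that prepareRequestForDownload wrote to disk):
const path = require('path');

app.get('/api/requests/:requestId/download', (req, res) => {
  const zipPath = path.join(dirToStoreRequestData, `Content-${req.params.requestId}.zip`);
  // res.download sets Content-Disposition: attachment, so the browser saves the file
  res.download(zipPath, `Content-${req.params.requestId}.zip`, (err) => {
    if (err) console.error('download failed:', err);
  });
});
On the React side, window.open (or a plain link) pointing at that route is enough; the attachment header makes the browser show the save dialog.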
I am trying to download a file from outside of my root directory; however, every time I try, the path is resolved against the root directory. I need the users of my site to be able to download these files.
The file was initially uploaded to Amazon S3, and I have accessed it using the getObject function.
Here is my code:
app.get('/test_script_api', function (req, res) {
  var fileName = req.query.File;
  s3.getObject(
    { Bucket: "bucket-name", Key: fileName },
    function (error, s3data) {
      if (error != null) {
        console.log("Failed to retrieve an object: " + error);
      } else {
        // I have tried passing the S3 data, but it asks for a string
        res.download(s3data.Body);
        // So I have tried just passing the file name & an absolute path
        res.download(fileName);
      }
    }
  );
});
This returns the following error:
Error: ENOENT: no such file or directory, stat '/home/ec2-user/environment/test2.txt'
When I enter an absolute path it just appends this onto the end of /home/ec2-user/environment/
How can I change the directory res.download is trying to download from?
Is there an easier way to download your files from Amazon S3?
Any help would be much appreciated here!
I had the same problem and I found this answer:
NodeJS How do I Download a file to disk from an aws s3 bucket?
Based on that, you need to use createReadStream() and pipe().
Read more about stream.pipe() here: https://nodejs.org/en/knowledge/advanced/streams/how-to-use-stream-pipe/
res.attachment() will set the download headers for you: https://expressjs.com/en/api.html#res.attachment
This code should work for you (based on the answer in the above link):
app.get('/test_script_api', function (req, res) {
  var fileName = req.query.File;
  res.attachment(fileName); // sets Content-Disposition so the browser downloads it

  var file = s3.getObject({
      Bucket: "bucket-name",
      Key: fileName
    })
    .createReadStream()
    .on("error", error => {
      // handle/log S3 errors here
    });

  file.pipe(res);
});
In my case, a small snippet on the client side made sure that the file actually downloads.
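One possible trigger (an assumption on my part, not the original snippet) is simply to point the browser at the route, since res.attachment() already marks the response as a download:
// hypothetical client-side trigger; fileName is whatever the user picked
window.location.href = '/test_script_api?File=' + encodeURIComponent(fileName);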
I would like to upload an audio file to Azure Blob Storage.
First I make an HTTP GET request to a URL to fetch the file.
Then I want to save it "directly" to Azure Blob Storage, without storing it on my server first and uploading it from there.
Here's my code:
request
  .get(url, {
    auth: {
      bearer: token
    }
  })
  .on("response", async function (response) {
    const res = await blobService.createAppendBlobFromStream(
      "recordings",   // container name
      "record.wav",   // blob name
      response,       // should be a stream
      177777,         // stream length
      function (err, res) {
        try {
          console.log(res);
        } catch (err) {
          console.log(err);
        }
      }
    );
  });
When I upload this way and check my storage, I get an empty blob with no data inside, so I think I'm not sending the data stream correctly.
What I expect is to end up with the audio file in blob storage containing the data I get from the GET request.
I should also specify the stream length, but I don't know how to get it; I put a random number there, although it should be the real stream length. I have checked the response object but haven't found that information.
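For what it's worth, the stream length can usually be read from the content-length header of the GET response. A minimal sketch under that assumption, reusing the request and blobService objects from the code above (it will not work if the server omits the header):
request
  .get(url, { auth: { bearer: token } })
  .on("response", function (response) {
    // most servers send content-length for static audio files
    const streamLength = parseInt(response.headers["content-length"], 10);
    blobService.createAppendBlobFromStream(
      "recordings",
      "record.wav",
      response,      // the HTTP response is itself a readable stream
      streamLength,
      function (err) {
        if (err) console.log(err);
        else console.log("upload finished");
      }
    );
  });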
I think you can't upload a file directly, so first create a folder in your blob storage, then create a file there.
Read the selected file's data and write it into the created file using a file stream or something similar.
Here is the upload code (req.files and file.mv here come from a middleware such as express-fileupload):
app.post('/upload', function (req, res) {
  if (req.files.fileInput) {
    var file = req.files.fileInput,
        name = file.name,
        type = file.mimetype;
    var uploadpath = __dirname + '/uploads/' + name;
    file.mv(uploadpath, function (err) {
      if (err) { res.send("Error Occurred!"); }
      else { res.send('Done! Uploading files'); }
    });
  } else {
    res.send("No File selected!");
    res.end();
  }
});
I use the following code to read the uploaded files from the file system.
The code is from a blog post called Building a File Uploader with NodeJs.
I was able to see the UI, etc. when I ran my project.
I cannot use the form.uploadDir part of the code below, since I don't have an uploads folder in my project.
app.post('/upload', function (req, res) {
  // create an incoming form object
  var form = new formidable.IncomingForm();

  // allow the user to upload multiple files in a single request
  form.multiples = true;

  // store all uploads in the /uploads directory - cannot use it
  //form.uploadDir = path.join(__dirname, '/uploads');

  // every time a file has been uploaded successfully,
  // rename it to its original name
  form.on('file', function (field, file) {
    fs.rename(file.path, path.join(form.uploadDir, file.name));
  });

  // log any errors that occur
  form.on('error', function (err) {
    console.log('An error has occurred: \n' + err);
  });

  // once all the files have been uploaded, send a response to the client
  form.on('end', function () {
    res.end('success');
  });

  // parse the incoming request containing the form data
  form.parse(req);
});
My question is: how should I get the file from the UI with the code above? I need to get the file content from the form when I choose my file.
The application is deployed to the cloud; when I work on localhost I use the following code (which works):
const readStream = fs.createReadStream("./file.html");
...
const writeStream = sftp.createWriteStream("/index.html");
...
readStream.pipe(writeStream);
This reads a file from the file system at the correct path and overwrites another file with it (here /index.html).
As #aedneo said, when the user chooses a file, the file is created in the upload folder; I just needed to rename it and give the right path to the write-stream method, and it works!
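A minimal sketch of that resolution, assuming the same sftp object as in the localhost snippet above and formidable's default temp directory (the /index.html target is just the example name used earlier):
// Every time formidable finishes writing an upload to its temp folder,
// stream that temp file to the SFTP destination.
form.on('file', function (field, file) {
  var readStream = fs.createReadStream(file.path);          // formidable's temp file
  var writeStream = sftp.createWriteStream('/index.html');  // remote target
  readStream.pipe(writeStream);
});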
Hello, I am trying to send a file from my Express server to my frontend so the user can download it when they click a button. I've created a route on the backend that sends over the byte array, but I don't have the slightest idea how to let the user download it.
What I'm trying to do here is allow one user to upload a file directly to my server, and then another user comes along and downloads the file that was uploaded on a previous date.
I've gotten the upload part to work fine; now I'm just missing direction for the download part.
Here is my Express route being hit:
const path = require('path');

const getFile = (req, res) => {
  const filePath = path.join(__dirname, '../../../', req.query.file);
  console.log(filePath);
  res.download(filePath);
};
Here is the onClick handler in my frontend:
download(file) {
  axios.get('/api/download/getFile', {
    params: {
      file,
    },
  })
    .then(data => {
      console.log(data);
      window.open(data.data);
    });
}
And this is the error I keep getting when I click the button:
Unable to open a window with invalid URL
That makes sense to me, because I'm not getting a URL back, I'm getting a byte array.
Try this -
window.open('/api/download/getFile?file=yourfile');
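Opening the route URL directly works because res.download sets the attachment headers for you. If you would rather keep the axios call and handle the bytes yourself, one common alternative (a sketch, not part of the answer above) is to request the response as a blob and save it through a temporary object URL:
// Fetch the file as binary data and trigger a save in the browser.
axios.get('/api/download/getFile', {
  params: { file },
  responseType: 'blob',   // get a Blob instead of a string
})
  .then((response) => {
    const url = window.URL.createObjectURL(response.data);
    const link = document.createElement('a');
    link.href = url;
    link.download = file;   // suggested filename for the save dialog
    document.body.appendChild(link);
    link.click();
    link.remove();
    window.URL.revokeObjectURL(url);
  });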