How to download after uploading a file using express and multer?

I have uploaded the file to my backend filesystem using multer.
My server is Node and my client is React.
I'm having trouble downloading and displaying the saved file on the React client.
Whenever I call res.download(file) it just throws a connection refused error on the client side.
My code is as follows:
UserToUploadMapping.js
const mongoose = require("mongoose");

const UserToUploadMapping = new mongoose.Schema({
  userId: {
    type: String,
    required: true,
  },
  file: {
    type: Object,
    required: true,
  },
  date: {
    type: Date,
    default: Date.now,
  },
});

module.exports = mongoose.model("UserToUploadMapping", UserToUploadMapping);
uploadVideo.js
const router = require("express").Router();
const multer = require('multer');
const UserToUploadMapping = require('../models/UserToUploadMapping');

let nameFile = '';
const storage = multer.diskStorage({
  destination: './Videos',
  filename: (req, file, cb) => {
    console.log(file);
    nameFile = file.originalname + " " + Date.now();
    cb(null, nameFile);
  }
});
const upload = multer({ storage: storage });

router.post('/upload', upload.single('video'), async (req, res, next) => {
  console.log("object");
  const saveMapping = new UserToUploadMapping({
    userId: '123',
    file: req.file,
  });
  await saveMapping.save();
  res.send("Video uploaded");
});

router.get('/download', async (req, res, next) => {
  const x = await UserToUploadMapping.find();
  // res.send(x)
  res.download(x[0].path);
});

module.exports = router;
CLIENT
const fetchVideo = async () => {
  const resp = await axios.get(
    "http://localhost:5000/api/user/video/download"
  );
  console.log(resp);
};

return (
  <>
    <NavContainer />
    <div className={classes.Post}>
      <Input
        type="file"
        onChange={(e) => uploadVideos(e.target.files)}
        accept="video/mp4"
      />
      {/* <Button onClick={(e) => submitHandler(e)}>Upload</Button> */}
      <video></video>
    </div>
  </>
);

There are a few problems within the uploadVideo.js file:
To get the path from the data, you need to use x[0].file.path (based on how you save the file in the database):
const saveMapping = new UserToUploadMapping({
  userId: '123',
  file: req.file,
})
To avoid problems with where uploadVideo.js lives versus where you run the application from, you should use an absolute path when saving files to the filesystem.
(A small problem) Your filename function will produce filenames like "video.mp4 1622180824748". Something like "video-1622181268053.mp4" is better, since it keeps the correct file extension.
You can refer to this code:
const router = require("express").Router();
const multer = require('multer');
const UserToUploadMapping = require('../models/UserToUploadMapping');
const path = require('path');

const uploadFolder = path.join(__dirname, "Videos"); // use a variable to hold the upload folder

const storage = multer.diskStorage({
  destination: uploadFolder, // use it when uploading
  filename: (req, file, cb) => {
    // nameFile = file.originalname + " " + Date.now() // --> gives "video.mp4 1622180824748"
    let [filename, extension] = file.originalname.split('.');
    let nameFile = filename + "-" + Date.now() + "." + extension; // --> gives "video-1622181268053.mp4"
    cb(null, nameFile);
  }
});
const upload = multer({ storage: storage });

router.post('/upload', upload.single('video'), async (req, res, next) => {
  const saveMapping = new UserToUploadMapping({
    userId: '123',
    file: req.file,
  });
  await saveMapping.save();
  res.send("Video uploaded");
});

router.get('/download', async (req, res, next) => {
  const video = await UserToUploadMapping.find({});
  res.download(video[0].file.path); // video[0].file.path is the absolute path to the file
});

module.exports = router;

Your code indicates you are handling large files (videos). I would strongly recommend looking at separation of concerns: handling this as part of your other business logic is not recommended, in my experience. It can, for example, complicate firewall rules and DDoS protection when you need them in the future.
As a minimum, move upload and download into their own server, e.g. files.yourappnamehere.com, so that you can handle the specifics separately from your business-logic API.
If you run in the public cloud, I would strongly recommend reusing blob upload/download functionality, letting your clients upload directly to blob storage and handling downloads directly from blob storage, e.g. in Azure, AWS or GCP.
This will save you a lot of the implementation details of handling (very) large files, and also gives you "free" extensibility options such as events on file upload completion.
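To make that concrete: with AWS S3, for example, your API can hand the client a short-lived presigned upload URL and stay out of the data path entirely. A minimal sketch using the v3 SDK (the bucket name and route shape are assumptions for illustration):

const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const s3 = new S3Client({ region: "us-east-1" });

// GET /upload-url?key=video.mp4  ->  { url: "https://..." }
async function uploadUrlHandler(req, res) {
  const command = new PutObjectCommand({
    Bucket: "my-upload-bucket", // hypothetical bucket name
    Key: req.query.key,
  });
  const url = await getSignedUrl(s3, command, { expiresIn: 900 }); // valid 15 minutes
  res.json({ url }); // the client PUTs the file straight to this URL
}

The client then uploads directly to storage, and your API never touches the video bytes.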

You are running two apps, frontend and backend, on different ports (3000 and 5000), so browsers block the cross-origin requests. On the Express side you must enable CORS to allow requests from the frontend URL (http://localhost:3000).
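A minimal sketch with the cors package (the mount path is taken from the question's URL; the file layout is an assumption):

const express = require("express");
const cors = require("cors");

const app = express();

// Allow the React dev server on port 3000 to call this API
app.use(cors({ origin: "http://localhost:3000" }));

app.use("/api/user/video", require("./routes/uploadVideo")); // assumed file layout
app.listen(5000);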

For the download route, try using window.location instead of axios, so the browser itself handles the download.
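For example, replacing the axios call with a plain navigation:

// The browser follows the URL and applies its normal download handling
const fetchVideo = () => {
  window.location.href = "http://localhost:5000/api/user/video/download";
};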

It looks like you might have a typo in your GET handler: you're referencing a property called path, but that's not declared in your schema.
router.get('/download', async (req, res, next) => {
  const x = await UserToUploadMapping.find();
  // res.send(x)
  res.download(x[0].path); // <-- path doesn't seem to be in the schema
});
Since you don't have a try/catch in that function, the resulting unhandled error could be bringing down your server, making it unavailable, which would explain the connection refused on the client.
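A sketch of the same route with error handling added (using x[0].file.path, since the multer file object saved in the schema carries the path):

router.get('/download', async (req, res, next) => {
  try {
    const x = await UserToUploadMapping.find();
    if (!x.length) return res.status(404).send('No uploads found');
    res.download(x[0].file.path); // the multer file object includes .path
  } catch (err) {
    next(err); // hand the error to Express instead of crashing the process
  }
});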
You might also want to take a look at this for more detail on How to download files using axios
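If you do stay with axios, the usual pattern is to request the body as a blob and turn it into an object URL the page can use (a sketch; setVideoSrc is a hypothetical state setter for the <video> element):

const fetchVideo = async () => {
  const resp = await axios.get(
    "http://localhost:5000/api/user/video/download",
    { responseType: "blob" } // receive binary data rather than text
  );
  const url = URL.createObjectURL(resp.data); // blob -> URL the page can use
  setVideoSrc(url); // hypothetical: feed it to <video src={videoSrc}>
};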

Related

how recovery of multiple downloaded image in formData with express?

Hello, sorry for my English, I have a little problem. I try to upload many images, but on the back end I get just one image. (I use React, Express, express-formidable and Cloudinary.) Here is my front-end code:
const [arrayFiles, setArrayFiles] = useState([]);

const handleFiles = (e) => {
  let arrayUpload = [...arrayFiles];
  arrayUpload.push(e.target.files[0]);
  setArrayFiles(arrayUpload);
};

const handleSubmit = async (e) => {
  const formData = new FormData(); // note: this declaration was missing in the original snippet
  arrayFiles.forEach((file) => {
    formData.append("image", file);
  });
  const response = await axios.post(
    "http://localhost:3100/offer/publish",
    formData
  );
};
Here is my back-end code, but req.files contains just one image. My route:
router.post("/offer/publish", async (req, res) => {
console.log(req.files);
const result = await cloudinary.uploader.upload(req.files.image.path, {
folder: `api/leboncoin/offers/${newOffer._id}`, // _id vient de la création du newOffer au dessus
public_id: "preview",
cloud_name: process.env.CLOUDINARY_NAME,
});
My index.js:
const express = require("express");
const formidable = require("express-formidable");
const mongoose = require("mongoose");
const cloudinary = require("cloudinary").v2;
const cors = require("cors");
const app = express();
app.use(cors());
app.use(formidable({ multiples: true }));
You only get one file in req.file because you've used multer's .single().
Using multer
There are three ways you can handle multiple file uploads, each with a slightly different flavor.
Assume you have a base multer setup:
const crypto = require("crypto"); // needed for randomBytes below
const path = require("path");
const multer = require("multer");

const storage = multer.diskStorage({
  destination: "public/data/",
  filename: function (req, file, cb) {
    // You may change this to whatever you want; it only affects the file name
    crypto.randomBytes(20, (err, buf) => {
      cb(null, buf.toString("hex") + path.extname(file.originalname));
    });
  }
});
const upload = multer({ storage: storage });
Use .any()
Accepts all files that come over the wire. An array of files will be stored in req.files.
WARNING: Make sure you always handle the files that a user uploads. Never add multer as a global middleware, since a malicious user could upload files to a route you didn't anticipate. Only use this function on routes where you are handling the uploaded files.
router.post("/offer/publish",upload.any(), async (req, res) => {
console.log(req.files); // Should give you an array of files
// Do anything else
});
Use .array(fieldname[, maxCount])
Accept an array of files, all with the name fieldname. Optionally error out if more than maxCount files are uploaded. The array of files will be stored in req.files.
router.post("/offer/publish",upload.array('someFieldName', 10), async (req, res) => {
console.log(req.files); // Should give you an array of files
// Do anything else
});
Use .fields(fields)
Accept a mix of files, specified by fields. An object with arrays of files will be stored in req.files.
fields should be an array of objects with name and optionally a maxCount. Example:
router.post(
  "/offer/publish",
  upload.fields([
    {
      name: "image",
      maxCount: 1,
    },
    {
      name: "audio",
      maxCount: 1,
    },
  ]),
  async (req, res) => {
    console.log(req.files.image[0]);
    console.log(req.files.audio[0]);
    // Do anything else
  }
);
For your case, I would recommend going with Option 2.
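Whichever option you pick, the client must append every file under the field name the middleware expects. A sketch matching Option 2's upload.array('someFieldName', 10) (the URL is from the question):

const handleSubmit = async () => {
  const formData = new FormData();
  arrayFiles.forEach((file) => {
    formData.append("someFieldName", file); // must match upload.array's field name
  });
  await axios.post("http://localhost:3100/offer/publish", formData);
};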

How to access .field values on Supertest

Good evening.
I'm trying to create a POST request with a file and some data on a REST API I'm building using NodeJS.
If that's not clear: my goal for this feature of the API is to save a record of a picture, so I'd like to send the picture file, the picture name and its number in the same request.
I'm currently using Jest / supertest for testing, and to test this specific functionality I've tried the following:
const response = await request(app)
  .post("/uploads/pics")
  .field("name", "PicureName")
  .field("number", "PictureNumber")
  .attach("file", picture);
I've read this from https://visionmedia.github.io/superagent/#multipart-requests.
My problem is that I can't get the values of name and number in my controller, so I can't use them to save the object.
I've tried many ways, such as:
req.body.name
req.name
req.field.name
req.query.name
but none of these worked for me.
I also tried printing the whole request, but I couldn't find anything related to name, number or field there.
Can anyone tell me what I'm doing wrong?
You should use the multer middleware (https://github.com/expressjs/multer) for handling the file upload. Then req.body will hold the text fields, if there were any.
E.g.
index.js:
const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer({ dest: 'uploads/' });

app.post('/uploads/pics', upload.single('file'), (req, res) => {
  console.log(req.body);
  console.log(req.file);
  res.sendStatus(200);
});

module.exports = app;
index.test.js:
const request = require('supertest');
const path = require('path');
const { expect } = require('chai');
const app = require('./');

describe('62862866', () => {
  it('should pass', async () => {
    const picture = path.resolve(__dirname, './test.jpg');
    const response = await request(app)
      .post('/uploads/pics')
      .field('name', 'PicureName')
      .field('number', 'PictureNumber')
      .attach('file', picture);
    expect(response.status).to.be.eq(200);
  });
});
integration test result:
62862866
[Object: null prototype] { name: 'PicureName', number: 'PictureNumber' }
{ fieldname: 'file',
originalname: 'test.jpg',
encoding: '7bit',
mimetype: 'image/jpeg',
destination: 'uploads/',
filename: '181b96eb9044aac5d50c8c1e3159a120',
path: 'uploads/181b96eb9044aac5d50c8c1e3159a120',
size: 0 }
✓ should pass (84ms)
1 passing (103ms)

How can I pass options into an imported module?

I have a utility module that creates an instance of a multer-gridfs storage engine for uploading files to my Mongo database. I use this module inside any API route that needs to upload files.
I need to be able to update the metadata property value with a unique identifier. More than likely this will be the mongoose _id of the user uploading the file, but for now I am not concerned with that aspect of it. I really just want to know if I can change the metadata property dynamically.
Here is the storage engine gridFs_upload_engine.js:
const mongoose = require('mongoose');
const path = require('path');
const crypto = require('crypto');
const multer = require('multer');
const GridFsStorage = require('multer-gridfs-storage');
const Grid = require('gridfs-stream');

//Init Upload Engine
let gfs;

//Global instance of the DB connection
const database = mongoose.connection;
const mongoDb = process.env.MONGODB_URI || process.env.MLAB_URL;

database.once('open', () => {
  //Init Stream
  gfs = Grid(database.db, mongoose.mongo);
  gfs.collection('uploads');
});

//Create Storage Engine
const storage = new GridFsStorage({
  url: mongoDb,
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      crypto.randomBytes(16, (err, buf) => {
        if (err) {
          return reject(err);
        }
        const filename = buf.toString('hex') + path.extname(file.originalname);
        const fileInfo = {
          filename: filename,
          bucketName: 'uploads',
          metadata: 'NEED TO UPDATE THIS'
        };
        resolve(fileInfo);
      });
    });
  }
});

const uploadEngine = multer({ storage });

module.exports = {
  uploadEngine,
  gfs
};
Above you can see the metadata property that I need to change dynamically with some yet-to-be-determined unique identifier. Is it possible to do that with an exported file?
Here is how I am utilizing it inside an API route:
const express = require('express');
const router = express.Router();

//Controllers
const upload_controller = require('../../controllers/uploader');

//Utilities
const upload = require('../../utils/gridFs_upload_engine');
const { uploadEngine } = upload;

//Upload Single File
router.post(
  '/single',
  uploadEngine.single('file'),
  upload_controller.upload_single_file
);

//Upload Multiple Files
//Max file uploads at once set to 30
router.post(
  '/multiple',
  uploadEngine.array('file', 30),
  upload_controller.upload_multiple_files
);

module.exports = router;
I pass the uploadEngine into the API route here so that the route controller can use it, and that works with no issue. I am just having quite a time trying to figure out how to update the metadata dynamically, and I am leaning towards my current implementation not allowing for that.
I don't know much about Node and have no idea what multer-gridfs is, but I can answer "How can I pass options into an imported module?"
You can export a function that returns another function, and import it like this:
const configFunction = require('nameoffile');
// this returns a function with the config you want
const doSomethingDependingOnTheConfig = configFunction({ ...someConfig });
And in the file you are importing from, you would have a function returning another function, like:
const configFunction = ({ ...someConfig }) => (your, func) => {
  // do what you want depending on the config
};

module.exports = configFunction;
I know this doesn't answer your question the way you want, but it does answer the question title, and I hope it gives you a better understanding of how to do what you want to do.
If this doesn't help, just let me know.
You would need to pass a parameter to the module gridFs_upload_engine.js and do the magic there.
An example could be:
In gridFs_upload_engine.js file:
function uploadEngine(id, file) {
  // update what you want
}

module.exports = {
  ...
  uploadEngine: uploadEngine
};
In your router:
const upload = require('../../utils/gridFs_upload_engine');
...
router.post('/single/:id', function (req, res, next) {
  ...
  upload.uploadEngine(req.params.id, file);
  next();
}, upload_controller.upload_single_file);
In other words, instead of exposing gfs and uploadEngine directly from your module, you could expose a function that receives the arguments needed to perform the upload.
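A related approach worth sketching: multer-gridfs-storage's file callback already receives the request, so you can derive the metadata from req at upload time without restructuring the module (a sketch; delivering the id as a route parameter is an assumption):

// In gridFs_upload_engine.js: read per-request data inside the file callback
const storage = new GridFsStorage({
  url: mongoDb,
  file: (req, file) => ({
    filename: crypto.randomBytes(16).toString('hex') + path.extname(file.originalname),
    bucketName: 'uploads',
    metadata: { ownerId: req.params.id }, // set dynamically from the request
  }),
});

// In the router: the id arrives as a route parameter
router.post(
  '/single/:id',
  uploadEngine.single('file'),
  upload_controller.upload_single_file
);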

save an image to sftp remote server sent by user

I want to upload a file to a remote SFTP server using ssh2-sftp-client. I take the file from the user in the request, along with the destination. I am using multer to process the file.
const Client = require('ssh2-sftp-client');
const sftp = new Client();
const Multer = require("multer");

const multer = Multer({
  storage: Multer.memoryStorage() // note: memoryStorage is a function, not a property
});

app.put("/sftp", multer.single('file'), (req, res) => {
  const sftpCredentials = req.query;
  sftp.connect({
    host: sftpCredentials.host,
    port: sftpCredentials.port,
    username: sftpCredentials.username,
    password: sftpCredentials.password
  }).then(() => {
    sftp.put(req.file, req.query.destination);
  });
});
I am getting this error:
TypeError: "string" must be a string, Buffer, or ArrayBuffer
sftp.put(localFilePath, remoteFilePath)
You have passed the whole req.file object, which is what causes the TypeError. With disk storage you would pass req.file.path; but since you configured memory storage, there is no file on disk, so pass req.file.buffer (sftp.put also accepts a Buffer). If you want the original filename too, use req.file.originalname.
Second, make sure req.query.destination is actually giving you the destination path where you want to put the file.
And do use logging. It makes life easier.
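Putting that together, a sketch of the corrected route (async/await added for clarity; the response handling is an assumption):

app.put("/sftp", multer.single("file"), async (req, res) => {
  const { host, port, username, password, destination } = req.query;
  try {
    await sftp.connect({ host, port, username, password });
    // with memory storage there is no req.file.path; pass the Buffer instead
    await sftp.put(req.file.buffer, destination);
    res.send("Uploaded " + req.file.originalname);
  } catch (err) {
    console.log(err); // logging makes life easier
    res.status(500).send(err.message);
  } finally {
    await sftp.end();
  }
});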

Upload Image from Google Cloud Function to Cloud Storage

I'm attempting to handle file uploads using a Google Cloud Function. This function uses Busboy to parse the multipart form data and then uploads to Google Cloud Storage.
I keep receiving the same error when triggering the function: ERROR: { Error: ENOENT: no such file or directory, open '/tmp/xxx.png' }.
The error seems to occur within the finish callback, when storage.bucket.upload(file) attempts to open the file path /tmp/xxx.png.
Note that I can't generate a signed upload URL as suggested in this question, since the application invoking this is an external, non-user application. I also can't upload directly to GCS, since I'll need to build custom filenames based on some request metadata. Should I just be using Google App Engine instead?
Function code:
const path = require('path');
const os = require('os');
const fs = require('fs');
const Busboy = require('busboy');
const Storage = require('@google-cloud/storage');
const _ = require('lodash');

const projectId = 'xxx';
const bucketName = 'xxx';

const storage = new Storage({
  projectId: projectId,
});

exports.uploadFile = (req, res) => {
  if (req.method === 'POST') {
    const busboy = new Busboy({ headers: req.headers });
    const uploads = [];
    const tmpdir = os.tmpdir();

    busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
      const filepath = path.join(tmpdir, filename);
      var obj = {
        path: filepath,
        name: filename
      };
      uploads.push(obj);
      var writeStream = fs.createWriteStream(obj.path);
      file.pipe(writeStream);
    });

    busboy.on('finish', () => {
      _.forEach(uploads, function (file) {
        storage
          .bucket(bucketName)
          .upload(file.path, { name: file.name })
          .then(() => {
            console.log(`${file.name} uploaded to ${bucketName}.`);
          })
          .catch(err => {
            console.error('ERROR:', err);
          });
        fs.unlinkSync(file.path);
      });
      res.end();
    });

    busboy.end(req.rawBody);
  } else {
    res.status(405).end();
  }
};
I eventually gave up on using Busboy. The latest versions of Google Cloud Functions support both Python and Node 8. In Node 8, I just put everything into async/await functions and it works fine.
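One plausible reading of what goes wrong above: the finish handler calls fs.unlinkSync(file.path) before the asynchronous upload has completed, so the file is gone by the time the upload opens it, which would produce exactly this ENOENT. A sketch of the handler rewritten with async/await so each upload finishes before its temp file is deleted (an illustration, not the answerer's exact code):

busboy.on('finish', async () => {
  for (const file of uploads) {
    try {
      await storage.bucket(bucketName).upload(file.path, { name: file.name });
      console.log(`${file.name} uploaded to ${bucketName}.`);
    } catch (err) {
      console.error('ERROR:', err);
    }
    fs.unlinkSync(file.path); // safe now: the upload has already finished
  }
  res.end();
});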
