I am uploading files to S3 in the following way.
This is the server code:
const params = {
  Bucket: `${process.env.AWS_FOLDER_NAME}/${bucketName}`,
  Key: fileName,
  ContentType: fileType,
  ACL: 'public-read'
}

const signedUrl = s3.getSignedUrl('putObject', params)
return signedUrl
This is the client code:
const { signedUrl, err } = await response.json()

await fetch(signedUrl, {
  method: 'PUT',
  body: file.body,
  headers: {
    'Content-Type': file.fileType,
    'x-amz-acl': 'public-read',
  }
})
  .catch(err => {
    console.log(err)
  })
Here file.body is reader.result from this code (the resolve and reject come from a surrounding Promise, roughly like this):
new Promise((resolve, reject) => {
  const reader = new FileReader()
  reader.onload = () => resolve(reader.result)
  reader.onerror = reject;
  reader.readAsArrayBuffer(file);
})
All this works very well! But yesterday I uploaded a .tgs file this way (it's a Telegram sticker), and when I try to download it from S3, I don't get the file itself.
Instead, S3 gives me a .gz archive with my original .tgs file inside, and the file inside the archive has no .tgs extension.
The file itself looks quite normal in S3.
I want S3 to just give me the file back. How should I do it?
A .tgs file is actually a .gz file; you can just rename it from AnimatedSticker.gz to AnimatedSticker.tgs.
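For completeness, one untested idea that is not part of the answer above: since a .tgs file is just gzipped data, S3 and the browser may treat it as a generic .gz download. Setting an explicit ContentType and ContentDisposition in the presigned-URL params might make downloads keep the original name; note that headers signed into the URL would then likely need to be sent by the uploading PUT as well (matching Content-Type and Content-Disposition headers, the same way Content-Type is already sent above). A minimal sketch:
const params = {
  Bucket: `${process.env.AWS_FOLDER_NAME}/${bucketName}`,
  Key: fileName,
  ContentType: 'application/octet-stream',                  // avoid gzip-specific handling on download
  ContentDisposition: `attachment; filename="${fileName}"`, // keep the .tgs name when downloading
  ACL: 'public-read'
}
const signedUrl = s3.getSignedUrl('putObject', params)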
I am testing my application, and I want to send a local mp3 file that I have stored in my storage folder.
How can I send this via Axios? Should I convert it to a Blob/File, and if so, how?
Thanks in advance.
The code below is how I tried it, but it didn't work.
const formData = new FormData();
const file = new File('storage/app/audio/debug.mp3', 'test.mp3', {
  type: Blob,
  lastModified: Date.now()
});

formData.append('data', file);
formData.append('id', urlId[2]);

return Axios.post("http://voice-app.test/api/v1/file", formData, {
  headers: {
    'Authorization': `Bearer ${localStorage.getItem("Bearer")}`,
  }
}).catch((e) => {
  console.error(e);
})
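One way this might work, sketched under the assumption that the mp3 is reachable over HTTP from the test page (the URL below is illustrative): fetch the file into a Blob first, then build the File from that Blob. The File constructor expects an array of blob parts rather than a filesystem path, and type should be a MIME string rather than the Blob class.
// Assumed URL; adjust to wherever the test mp3 is actually served from.
const response = await fetch('/storage/app/audio/debug.mp3');
const blob = await response.blob();

const file = new File([blob], 'test.mp3', {
  type: 'audio/mpeg',
  lastModified: Date.now()
});

const formData = new FormData();
formData.append('data', file);
formData.append('id', urlId[2]);

return Axios.post("http://voice-app.test/api/v1/file", formData, {
  headers: {
    'Authorization': `Bearer ${localStorage.getItem("Bearer")}`,
  }
}).catch((e) => {
  console.error(e);
})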
I am trying to upload multiple files/folders from the front end, and in my back end I want information like the file name, file extension, location of the file within the uploaded folder, and so on.
I am able to upload multiple files/folders, but my problem comes when looping over the selected files to append them to a FormData(): I also want to append the file name, file extension, and location within the uploaded folder for each individual file, so that I can identify it in my back end.
This is what I am doing:
const uploadFolder = document.querySelector('#upload-folder');

uploadFolder.addEventListener('change', async () => {
  const url = "/home/photos/my-photos/";
  let files = uploadFolder.files;
  let header = {'Accept': 'application/json', 'X-Requested-With': 'XMLHttpRequest', 'X-CSRFToken': window.CSRF_TOKEN}
  let formData = new FormData()

  for (const key in files) {
    const file = files[key];
    if (file.name && file.webkitRelativePath) {
      const fileExtension = file.name.split(".")[(file.name.split(".").length) - 1]
      const filePath = file.webkitRelativePath.replace(file.name, "")
      formData.append(`file-${key}`, file);
      // Here I also want to append fileExtension, filePath, and also the name of the file.
      // I tried formData.append(`file-${key}`, file, fileExtension, filePath, file.name);
      // but unfortunately it didn't work
    }
  }

  await fetch("/home/photos/my-photos/", {
    method: "POST",
    mode: 'same-origin',
    headers: header,
    body: formData
  })
})
You need to call formData.append() for each key you want to add.
For example:
formData.append(`file-${key}-path`, filePath)
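A slightly fuller sketch of that idea, with the extra field names being just examples: append one additional entry per piece of metadata inside the loop, then read those fields on the back end alongside each file.
formData.append(`file-${key}`, file, file.name);          // the file itself, keeping its original name
formData.append(`file-${key}-name`, file.name);           // file name
formData.append(`file-${key}-extension`, fileExtension);  // file extension
formData.append(`file-${key}-path`, filePath);            // location inside the uploaded folder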
I'm trying to upload a local JPG image file to an S3 bucket using a REST PUT request and Axios.
I managed to send the PUT request and get a positive answer from the AWS S3 service, but what gets uploaded is not a JPG file but a JSON file.
This is the code that I'm using:
// Create the FormData
var data = new FormData();
data.append('file', fs.createReadStream(image_path));

// Send the file to File-system
console.log("Sending file to S3...");
const axiosResponse = await axios.put(image_signed_url, {
  data: data,
  headers: { 'Content-Type': 'multipart/form-data' }
}).catch(function(error) {
  console.log(JSON.stringify(error));
  return null;
});
I have already tried changing the headers to { 'Content-Type': 'application/octet-stream' }, but I obtained the same result.
I did not manage to make Axios work for uploading the image.
The node-fetch module did the job, sending the image as binary data and specifying the Content-Type.
Whenever I tried the same with Axios, the image was always packed into form-data, and the result was a JSON file uploaded to the S3 bucket instead of the image.
// Send the file to S3
console.log("Sending file to S3...");
const resp = await fetch(image_signed_url, {
  method: 'PUT',
  body: fs.readFileSync(image_path),
  headers: {
    'Content-Type': 'image/jpeg',
  },
}).catch(err => {
  console.log(err);
  return null;
});
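For reference, a minimal sketch of how the same upload might look with Axios, assuming the original failure came from passing the body inside the config object: axios.put takes the request body as its second argument and the config as the third, so passing { data, headers } as the body makes Axios serialize that object to JSON, which would explain the JSON file in the bucket.
// Send the raw bytes as the body (second argument); headers go in the config (third argument).
const resp = await axios.put(image_signed_url, fs.readFileSync(image_path), {
  headers: { 'Content-Type': 'image/jpeg' },
}).catch(err => {
  console.log(err);
  return null;
});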
I'm building a React app, and one of the components creates a FormData object with two fields: one is a file and the other is a string. The FormData is sent with a PUT request to an Express route, and its type is multipart/form-data because there's a file to upload. There I need to get the string value from the form (a field called path), then use multer/multer-s3 to upload the file to AWS S3 at the specified path.
I'm stuck on how to do this. I wasn't able to find an answer on how to retrieve a text field from a multipart request. I did see How to Post "multipart/form-data" Form and get the Text field values from the Node.js server? and several similar suggestions; however, the suggested answer did not work for me.
On the body-parser website, it says "This does not handle multipart bodies", so bodyParser.urlencoded wouldn't help in my case either.
React component:
const path = `/somepathhere/`;
const formData = new FormData();
formData.append('file', file);
formData.append('path', path)
// call API handler
apiActions.js:
const response = await fetch(apiUrl, {
  method: "PUT",
  body: formData
})
server.js:
// other code taken out for brevity

// middleware
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));

// other code taken out for brevity

app.put('/api/uploadToS3', (req, res) => {
  const path = '';

  const storage = multerS3({
    s3: s3,
    bucket: process.env.AWS_S3_BUCKET,
    contentType: multerS3.AUTO_CONTENT_TYPE,
    cacheControl: 'max-age=31536000',
    metadata: function (req, file, cb) {
      cb(null, { fieldName: file.fieldname });
    },
    key: function (req, file, cb) {
      cb(null, path + file.originalname)
    }
  })

  const upload = multer({
    storage: storage
  }).any();

  upload(req, res, function(err) {
    if (err) {
      console.log(err);
      return res.end("Error uploading file.");
    } else {
      res.end("File has been uploaded");
    }
  });
});
If there is a better way to handle this, I would also like to know. The end goal is that the file should be uploaded to an S3 path dynamically determined in the React app. I tried sending an object with formData and path like so:
const response = await fetch(apiUrl, {
  method: "PUT",
  body: { formData: formData, path: path }
})
But that gave the error: [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client. Trying to add headers to the fetch request likewise gave various errors.
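One possible direction, sketched rather than tested: multer parses multipart fields in the order the client sends them, so a text field that is appended to the FormData before the file should already be on req.body by the time the key callback runs. The field names below match the React component above.
// React side: append the text field before the file so it is parsed first.
formData.append('path', path);
formData.append('file', file);

// Express side: read the already-parsed field when computing the S3 key.
key: function (req, file, cb) {
  // req.body.path is only populated if 'path' preceded 'file' in the form data.
  cb(null, (req.body.path || '') + file.originalname);
}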
I created a download button that downloads a file on click. The problem is that when I click the download button, I can see the content I want to download in Chrome DevTools (Network -> Response), but no window opens to save the file to my PC.
For example, I'm trying to download text.txt, which contains multiple lines of the string MY FILE. I can see it on the Response tab, but how can I actually download the .txt file?
Relevant React Code:
<button onClick={(e) => downloadHandler(e)}>Download</button>
let downloadHandler = (e) => {
  const fileName = e.target.parentElement.id;
  fetch('http://localhost:3001/download', {
    method: 'post',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ id: fileName })
  })
}
Relevant Backend Code:
const uploadFolder = `${__dirname}` + `\\uploads\\`;

app.post('/download', function(req, res) {
  let fileName = req.body.id; // example output: 'dog.jpg'
  res.download(uploadFolder + fileName, fileName); // Set disposition and send it.
});
The idea is that I feed fileName to the backend and then download the file with the res.download(uploadFolder + fileName, fileName) line. I think I'm supposed to use window.open('/download'), but that just opens the homepage in a new tab, or maybe I'm just placing it wrong.
Okay, I have managed to solve my issue. Three main modifications were made to make this code and idea work:
1. Changing the request from POST to GET. Another StackOverflow thread also mentions this.
2. Using axios() instead of fetch().
3. Creating a Blob object from the res.download(filePath, fileName) response value.
Anyone else having this problem with the React code part should check this GitHub link.
Final state of the React function posted in the question:
let downloadHandler = (e) => {
  const fileName = e.target.parentElement.id;
  axios({
    method: 'get',
    url: 'http://localhost:3001/download/' + fileName,
    responseType: 'blob',
    headers: {},
  })
    .then((res) => {
      const url = window.URL.createObjectURL(new Blob([res.data]));
      const link = document.createElement('a');
      link.href = url;
      link.setAttribute('download', fileName);
      document.body.appendChild(link);
      link.click();
    })
    .catch((error) => {
      alert(error);
    })
}
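A small optional follow-up, not part of the original fix: after triggering the click, the injected link and the temporary object URL can be cleaned up.
// Release the anchor element and the object URL once the download has started.
link.remove();
window.URL.revokeObjectURL(url);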
Final state of the backend code posted in the question:
const uploadFolder = `${__dirname}` + `\\uploads\\`;

app.get('/download/:filename', function(req, res) {
  let fileName = req.params.filename;
  const filePath = uploadFolder + fileName;
  res.download(filePath, fileName);
});