I would like to upload an audio file to Azure Blob Storage.
First I make an HTTP request to a URL to get the file.
Then I want to save it "directly" to Azure Blob Storage, without having to store it on the server first and then upload it.
Here's my code:
request
  .get(url, {
    auth: {
      bearer: token
    }
  })
  .on("response", async function (response) {
    const res = await blobService.createAppendBlobFromStream(
      "recordings", // container name
      "record.wav",
      response, // should be stream
      177777, // stream length
      function (err, res) {
        try {
          console.log(res);
        } catch (err) {
          console.log(err);
        }
      }
    );
  });
Actually, when I upload a file to Blob Storage and check it, I get an empty file with no data inside; I think I'm not sending the data stream correctly.
What I expect is to end up with the audio file in Blob Storage containing the data returned by the GET request.
I also need to specify the stream length, but I don't know how to get it; I put in a random number, but it should be the actual stream length. I have checked the response object but haven't found that information.
I think you can't upload a file directly, so first create a folder in your blob storage and then create a file.
Read the selected file's data and write it into the created file using a file stream or similar.
Here is the file upload code; a sketch of pushing the saved file to blob storage follows it:
// Assumes Express with the express-fileupload middleware providing req.files
const express = require('express');
const fileUpload = require('express-fileupload');
const app = express();
app.use(fileUpload());

app.post('/upload', function (req, res) {
  if (req.files && req.files.fileInput) {
    var file = req.files.fileInput,
      name = file.name,
      type = file.mimetype;
    var uploadpath = __dirname + '/uploads/' + name;
    file.mv(uploadpath, function (err) {
      if (err) {
        res.send("Error Occured!");
      } else {
        res.send('Done! Uploading files');
      }
    });
  } else {
    res.send("No File selected !");
    res.end();
  }
});
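This only writes the file to local disk. As a sketch of the remaining step (an assumption, reusing the azure-storage blobService and the "recordings" container from the question), the saved file could then be pushed to Blob Storage from inside the file.mv success branch:

// Hypothetical follow-up inside the success branch of file.mv above:
// upload the file that was just written to disk as a block blob.
blobService.createBlockBlobFromLocalFile(
  "recordings",   // container name from the question
  name,           // blob name
  uploadpath,     // local file written by file.mv
  function (err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
    }
  }
);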
I have Angular running on the front end and the Firebase Admin SDK for Node.js on the back end.
What I want to achieve is to allow the user to select an image from their computer, using a simple <input> of type file. When I receive the user's image, which is of type File on the Angular side, I want to send it to my Node.js server and let it upload the image to Firebase Storage.
Here's how I'm sending the image to Node.js:
method(imageInput): void {
  const image: File = imageInput.files[0];
  const reader = new FileReader();

  reader.addEventListener('load', (event: any) => {
    const imageData = {
      source: event.target.result,
      file: image
    };

    this.myService.uploadImage(imageData.file).subscribe(
      (res) => {
        // image sent successfully
      },
      (err) => {
        // error
      });
  });

  reader.readAsDataURL(image);
}
So on the Node.js side I don't see a way to upload this image.
I'm trying:
admin.storage().bucket().upload(imageFromAngular, { // --> Here's the problem
  destination: "someDestination/",
  contentType: "image/png",
  metadata: {
    contentType: "image/png"
  }
}).then(() => {
  // send successful response
}).catch(err => {
  // send error response
});
The issue is that the upload method only takes the path to the image and the options as parameters. In this case, however, I can't pass a path to the image; I can only pass the image itself. I read this - https://googleapis.dev/nodejs/storage/latest/ - but I couldn't find anything that suits my needs.
What would be the correct way to do this ?
Update:
Here's a more detailed explanation of the approach I took:
I'm using the arrayBuffer method of the image File inside my method. This method returns a promise of type ArrayBuffer. I get the value and send it to my server.
The server uses Buffer.from(ArrayBuffer, 'base64') to convert the data, and then I can safely use the save API (https://googleapis.dev/nodejs/storage/latest/File.html#save).
To get the image later on I use download (https://googleapis.dev/nodejs/storage/latest/File.html#download).
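A minimal sketch of that server-side step, assuming an Express route; the route name, payload shape (base64 string in a JSON body), and destination path here are made up:

const express = require("express");
const admin = require("firebase-admin");

admin.initializeApp(); // assumes default credentials and a default bucket are configured
const app = express();
app.use(express.json({ limit: "10mb" })); // assumes the image data arrives in a JSON body

app.post("/upload-image", async (req, res) => {
  try {
    // req.body.image is assumed to be a base64-encoded string sent from the Angular service
    const buffer = Buffer.from(req.body.image, "base64");
    const file = admin.storage().bucket().file("someDestination/image.png");
    await file.save(buffer, { contentType: "image/png" });
    res.sendStatus(200);
  } catch (err) {
    res.status(500).send(err.message);
  }
});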
You can write a byte stream (or a Buffer) to Cloud Storage.
createWriteStream() API for streaming data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream
save() API for writing buffered data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#save
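As a sketch of the streaming variant (an assumption, not from the original answer): any readable stream can be piped into createWriteStream(); the local path below is only a placeholder source.

const fs = require("fs");
const admin = require("firebase-admin");

admin.initializeApp(); // assumes default credentials and a default bucket are configured
const file = admin.storage().bucket().file("someDestination/image.png");

fs.createReadStream("/tmp/image.png") // placeholder source stream
  .pipe(file.createWriteStream({ metadata: { contentType: "image/png" } }))
  .on("error", (err) => console.error(err))
  .on("finish", () => console.log("Upload complete"));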
What is the proper way to upload an external image via URL into Strapi on the back-end side?
I tried loading the image with node-fetch and processing it with buffer()/blob()/blob().stream(), and then passed it into strapi.plugins['upload'].services.upload.upload(). I also tried generating FormData in Node.js and passing it into the upload service, but that didn't help either.
How do I convert the image buffer from fetch into a type suitable for the upload service?
I used axios and this was on the client, but I think you can try it on the server too.
This worked for me:
Fetch an image and create a File instance from it:
async getImage(imageUrl, imageName) {
  const response = await axios.get(imageUrl, { responseType: 'blob' });
  const mimeType = response.headers['content-type'];
  const imageFile = new File([response.data], imageName, { type: mimeType });
  return imageFile;
}
GraphQL API query
{
  query: `
    mutation($files: [Upload!]!) {
      multipleUpload(files: $files) {
        id
      }
    }
  `,
  variables: {
    files: [
      // your files to upload
    ]
  }
}
Then I called this mutation and it worked perfectly.
Resources that I used to find this solution:
https://www.freecodecamp.org/news/how-to-manage-file-uploads-in-graphql-mutations-using-apollo-graphene-b48ed6a6498c/
Client side convert png file stream into file object
https://github.com/jaydenseric/graphql-multipart-request-spec
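Following the multipart request spec linked above, a sketch of how the mutation can be sent from the client; the /graphql endpoint, the use of axios with the browser's FormData, and the helper name are assumptions:

// Hypothetical helper that wires the File from getImage() into the multipleUpload mutation.
async function uploadImage(imageUrl, imageName) {
  const imageFile = await getImage(imageUrl, imageName); // helper defined above

  const form = new FormData();
  form.append(
    "operations",
    JSON.stringify({
      query: `
        mutation($files: [Upload!]!) {
          multipleUpload(files: $files) {
            id
          }
        }
      `,
      variables: { files: [null] }
    })
  );
  // "map" tells the server which multipart field fills which variable slot
  form.append("map", JSON.stringify({ "0": ["variables.files.0"] }));
  form.append("0", imageFile);

  const response = await axios.post("/graphql", form);
  return response.data;
}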
I'm trying to make an endpoint in NodeJS/Express for downloading content from my AWS S3 Bucket.
It works well; I can download the file on the client side, but I can also see the stream preview in the Network tab, which is annoying...
QUESTION
I'm wondering if what I'm doing is correct and a good practice.
I would also like to know whether it's normal to see the output stream in the Network tab.
How should I properly send a file from S3 to my client application using NodeJS/Express?
I'm pretty sure requests on other websites don't let you preview the content; they show "Fail to load response data" instead.
This is what I do in my NodeJS application to get the file stream from AWS S3:
download(fileId) {
  const fileObjectStream = app.s3
    .getObject({
      Key: fileId
    })
    .createReadStream();

  this.res.set("Content-Type", "application/octet-stream");
  this.res.set(
    "Content-Disposition",
    'attachment; filename="' + fileId + '"'
  );

  fileObjectStream.pipe(this.res);
}
And on the client side I can see the stream preview in the Network tab.
I think the issue is with the headers:

// this line sets the proper header for the file and makes it downloadable in the client's browser
res.attachment(key);
// this executes the download
s3.getObject(bucketParams)
  .createReadStream()
  .pipe(res);

So the code should be like this (this is what I do in my project, returning the file with res.attachment, or res.json in case of error so the client can display the error to the end user):
router.route("/downloadFile").get((req, res) => {
const query = req.query; //param from client
const key = query.key;//param from client
const bucketName = query.bucket//param from client
var bucketParams = {
Bucket: bucketName,
Key: key
};
//I assume you are using AWS SDK
s3 = new AWS.S3({ apiVersion: "2006-03-01" });
s3.getObject(bucketParams, function(err, data) {
if (err) {
// cannot get file, err = AWS error response,
// return json to client
return res.json({
success: false,
error: err
});
} else {
res.attachment(key); //sets correct header (fixes your issue )
//if all is fine, bucket and file exist, it will return file to client
s3.getObject(bucketParams)
.createReadStream()
.pipe(res);
}
});
});
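For illustration, the route could be triggered from the client like this; because the route sets Content-Disposition via res.attachment, the browser downloads the file instead of previewing it:

// Hypothetical client-side trigger; "my-bucket" and "report.pdf" are placeholders.
window.location.href = "/downloadFile?bucket=my-bucket&key=report.pdf";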
I am trying to upload an image to S3, but I keep finding that photos taken on phones come out rotated, and I learned this is due to EXIF data. I found a library called graphicsmagick which claims to be able to remove EXIF data, and I also decided I want to resize the images to 500px wide with whatever height results.
The issue I can't figure out is how to grab the file after it's been changed. It seems that all of the graphicsmagick examples show writing the image to a file on disk, but I want to grab the file data and upload it to AWS S3.
So far I have:
let file_extension = file.name.split('.').pop(); // grab the file extension for saving in the db
let key = `${user_id}/${UUID()}.${file_extension}`; // create a unique key to save in S3 based on the user's id
let params = { Bucket: S3_name, Key: key, Body: file.data };

// resize image
let new_image = gm(file.data)
  .resize(500)
  .noProfile()
  .write() // <-- this is as far as I got

// upload
let result = new Promise(resolve => {
  s3.upload(params, function (err, result) {
    if (err) {
      throw new Error('Could not upload photo');
    } else {
      resolve(result);
    }
  });
});
result = await result;
From gm docs:
Image output
write - writes the processed image data to the specified filename
stream - provides a ReadableStream with the processed image data
toBuffer - returns the image as a Buffer instead of a stream
So in your case instead of .write() you can use:
.toBuffer('png', function (err, buffer) {
  if (err) throw err;
  console.log('done!');
})
Now you have a Buffer, which can be used as the Body in your S3 upload logic.
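Putting the two pieces together, a sketch (reusing the variables file, S3_name, key, and s3 from the question's snippet) might look like this:

gm(file.data)
  .resize(500)
  .noProfile()
  .toBuffer(function (err, buffer) {
    if (err) {
      throw new Error('Could not process photo');
    }
    let params = { Bucket: S3_name, Key: key, Body: buffer };
    s3.upload(params, function (uploadErr, result) {
      if (uploadErr) {
        throw new Error('Could not upload photo');
      }
      console.log(result.Location); // S3 URL of the uploaded object
    });
  });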
I have successfully uploaded files to Firebase's storage via Google Cloud Storage through JS! What I noticed is that, unlike files uploaded directly, the files uploaded through Google Cloud only have a storage location URL, which isn't a full URL, so they cannot be read! I'm wondering if there is a way to generate a full URL on upload for the "Download URL" part of Firebase's actual storage.
Code being used:
var filename = image.substring(image.lastIndexOf("/") + 1).split("?")[0];
var gcs = gcloud.storage();
var bucket = gcs.bucket('bucket-name-here.appspot.com');
request(image)
  .pipe(bucket.file('photos/' + filename).createWriteStream(
    { metadata: { contentType: 'image/jpeg' } }
  ))
  .on('error', function (err) {})
  .on('finish', function () {
    console.log(imagealt);
  });
When using the GCloud client, you want to use getSignedUrl() to download the file, like so:
bucket.file('photos/' + filename).getSignedUrl({
  action: 'read',
  expires: '03-17-2025'
}, function (err, url) {
  if (err) {
    console.error(err);
    return;
  }

  // The file is now available to read from this URL.
  request(url, function (err, resp) {
    // resp.statusCode = 200
  });
});
You can either:
a) Create a download URL through the Firebase console, or
b) Get the download URL programmatically from a Firebase client; one will be created on the fly for you (see the sketch below).
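A sketch of option (b), assuming the Firebase client JS SDK (v8-style API) and the same 'photos/' path used above:

firebase
  .storage()
  .ref('photos/' + filename)
  .getDownloadURL()
  .then(function (url) {
    // url is a full, readable download URL for the uploaded file
    console.log(url);
  })
  .catch(function (err) {
    console.error(err);
  });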