Resizing images with sharp before uploading to Google Cloud Storage - JavaScript

I tried to resize or compress an image before uploading it to Google Cloud Storage.
The upload works fine but the resizing does not seem to work.
Here is my code:
const uploadImage = async (file) => new Promise((resolve, reject) => {
let { originalname, buffer } = file
sharp(buffer)
.resize(1800, 948)
.toFormat("jpeg")
.jpeg({ quality: 80 })
.toBuffer()
const blob = bucket.file(originalname.replace(/ /g, "_"))
const blobStream = blob.createWriteStream({
resumable: false
})
blobStream.on('finish', () => {
const publicUrl = format(
`https://storage.googleapis.com/${bucket.name}/${blob.name}`
)
resolve(publicUrl)
}).on('error', () => {
reject(`Unable to upload image, something went wrong`)
})
.end(buffer)
})

I ran into the same issue with a project I was working on. After lots of trial and error I found the following solution. It might not be the most elegant, but it worked for me.
In my upload route function I created a new thumbnail image object with the original file values and passed it as the file parameter to the uploadFile function for Google Cloud Storage.
Inside my upload image route function:
const file = req.file;
const thumbnail = {
fieldname: file.fieldname,
originalname: `thumbnail_${file.originalname}`,
encoding: file.encoding,
mimetype: file.mimetype,
buffer: await sharp(file.buffer).resize({ width: 150 }).toBuffer()
}
const uploadThumbnail = await uploadFile(thumbnail);
My Google Cloud Storage upload file function:
const uploadFile = async (file) => new Promise((resolve, reject) => {
const gcsname = file.originalname;
const bucketFile = bucket.file(gcsname);
const stream = bucketFile.createWriteStream({
resumable: false,
metadata: {
contentType: file.mimetype
}
});
stream.on('error', (err) => {
reject(err);
});
stream.on('finish', (res) => {
resolve({
name: gcsname
});
});
stream.end(file.buffer);
});
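If you also want the public URL back, as in the original snippet, the finish handler could resolve that instead. A small sketch, assuming the objects in the bucket are publicly readable:
stream.on('finish', () => {
  // Same URL pattern used in the question's code.
  resolve(`https://storage.googleapis.com/${bucket.name}/${gcsname}`);
});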

I think the problem is with toFormat(). I couldn't find that function in the docs. Can you try removing it and check whether it works?
sharp(buffer)
.resize(1800, 948)
.jpeg({ quality: 80 })
.toBuffer()
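Whether or not toFormat() is the culprit, note that the snippet in the question never uses the result of toBuffer(); end(buffer) writes the original bytes to the stream, so the resized image never reaches the bucket. A minimal sketch that awaits the resized output, assuming bucket and format are set up as in the question:
const uploadImage = async (file) => {
  const { originalname, buffer } = file;

  // Wait for sharp to produce the resized JPEG buffer.
  const resizedBuffer = await sharp(buffer)
    .resize(1800, 948)
    .jpeg({ quality: 80 })
    .toBuffer();

  const blob = bucket.file(originalname.replace(/ /g, "_"));
  const blobStream = blob.createWriteStream({ resumable: false });

  return new Promise((resolve, reject) => {
    blobStream.on('finish', () => {
      resolve(format(`https://storage.googleapis.com/${bucket.name}/${blob.name}`));
    });
    blobStream.on('error', () => {
      reject(`Unable to upload image, something went wrong`);
    });
    // Write the resized buffer, not the original one.
    blobStream.end(resizedBuffer);
  });
};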

Modify the metadata once you have finished uploading the image.
import * as admin from "firebase-admin";
import * as functions from "firebase-functions";
import { log } from "firebase-functions/logger";
import * as sharp from "sharp";
export const uploadFile = functions.https.onCall(async (data, context) => {
const bytes = data.imageData;
const bucket = admin.storage().bucket();
const buffer = Buffer.from(bytes, "base64");
const bufferSharp = await sharp(buffer)
.png()
.resize({ width: 500 })
.toBuffer();
const nombre = "IMAGE_NAME"; // the .png extension is added below
const fileName = `img/${nombre}.png`;
const fileUpload = bucket.file(fileName);
const uploadStream = fileUpload.createWriteStream();
uploadStream.on("error", async (err) => {
log("Error uploading image", err);
throw new functions.https.HttpsError("unknown", "Error uploading image");
});
uploadStream.on("finish", async () => {
await fileUpload.setMetadata({ contentType: "image/png" });
log("Upload success");
});
uploadStream.end(bufferSharp);
});
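For completeness, a minimal sketch of calling this callable function from a client, assuming the modular Firebase JS SDK; imageBase64 is a placeholder for your base64-encoded image (without any data URL prefix):
import { getFunctions, httpsCallable } from "firebase/functions";

const functions = getFunctions();
const uploadFile = httpsCallable(functions, "uploadFile");

// The function above reads data.imageData and decodes it from base64.
const result = await uploadFile({ imageData: imageBase64 }); // inside an async function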

Related

Consume S3 GetObjectCommand result in Angular (open as PDF)

I am trying to open a file from an S3 bucket as a PDF using Angular. To do this, I have a Node service running that gets the object, which I call from Angular. Then I try to open it in Angular as a PDF. Is there something I am missing? When I open the PDF, it shows up as a blank (white) document.
Below is my node code:
const streamToString = (stream) =>
new Promise((resolve, reject) => {
const chunks = [];
stream.on("data", (chunk) => chunks.push(chunk));
stream.on("error", reject);
stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
});
const readFile = async function getObj(key) {
const params = {
Bucket: vBucket,
Key: key,
};
const command = new GetObjectCommand(params);
const response = await client.send(command);
const { Body } = response;
return streamToString(Body);
};
And here is where I consume it in Angular and open it as a PDF.
The service:
getObj(key: String): Observable<any>{
const httpOptions = {
'responseType' : 'arraybuffer' as 'json'
//'responseType' : 'blob' as 'json' //This also worked
};
return this.http.get<any>(environment.s3Ep + '/getfile?key=' + key, httpOptions );
}
And code consuming the service:
this.s3Svc.getObj(key).subscribe(
res => {
let file = new Blob([res], {type: 'application/pdf'});
var fileURL = URL.createObjectURL(file);
window.open(fileURL);
}
);
I started experiencing the same issue. Found a solution, replacing streamToString with streamToBuffer as follows:
const streamToBuffer = async (stream: Readable): Promise<Buffer> => {
return new Promise((resolve, reject) => {
const chunks: Array<any> = []
stream.on('data', (chunk) => chunks.push(chunk))
stream.on('error', reject)
stream.on('end', () => resolve(Buffer.concat(chunks)))
})
}
and the code that consumes it:
const command = new GetObjectCommand({ Bucket, Key })
const data = await s3.send(command)
const content = await streamToBuffer(data.Body as Readable)
fs.writeFileSync(destPath, content)
In my case I'm writing to a PDF file.
Writing the string retrieved from streamToString, or writing buffer.toString(), resulted in a blank PDF.
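For reference, a rough sketch of what the Node endpoint behind /getfile might look like when it returns the buffer directly; the route name and the Express setup are assumptions, and readFile here is the streamToBuffer-based version above:
// Hypothetical Express route; adjust names to your own service.
app.get('/getfile', async (req, res) => {
  try {
    const content = await readFile(req.query.key); // returns a Buffer, not a string
    res.set('Content-Type', 'application/pdf');
    res.send(content); // Express sends Buffers as-is, so the PDF bytes stay intact
  } catch (err) {
    res.status(500).send('Unable to read file');
  }
});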

Multi-file upload with skipper-better-s3 and Sails.js returns the same key

As the title says, I am currently using Sails.js + skipper-better-s3 for S3 uploads. I started with uploading one file, which works great; then, because of a change request, I needed to upload multiple files at once, so I added a for loop. By doing this, all keys end up the same, and only one file is actually uploaded: the last uploaded file, but with the first upload's filename.
I did read some articles where people say the problem is that the for loop is synchronous while the file upload is asynchronous, and that the solution is to use recursion. I tried that too, but no luck; the same thing happens.
My recursive code is below...
s3_upload_multi: async (req, res) => {
const generatePath = (rootPath, fieldName) => {
let path;
// this is just a switch statement here to check which fieldName is provided then value of path will depend on it
// as for the other two variable is just checking if upload content type is correct
return { path };
};
const processUpload = async ({
fieldName,
awsOp,
fileExtension,
rootPath,
fileName,
}) => {
return new Promise(function (resolve, reject) {
req.file(fieldName).upload(awsOp, async (err, filesUploaded) => {
if (err) reject(err);
const filesUploadedF = filesUploaded[0]; // F = first file
const response = {
status: true,
errCode: 200,
msg: 'OK',
response: {
url: filesUploadedF.extra.Location,
size: filesUploadedF.size,
type: fileExtension,
filename: filesUploadedF.filename,
key: filesUploadedF.extra.Key,
field: fieldName,
}
};
resolve(response);
});
});
}
const process_recur = async (files, fieldName) => {
if (files.length <= 0) return;
const fileUpload = files[0].stream;
const rootPath = `${sails.config.aws.upload.path.root}`;
const fileCType = fileUpload.headers['content-type'];
// console.log(fileCType, 'fileCType');
const { path } = generatePath(rootPath, fieldName);
const fileName = fileUpload.filename;
const fileExtension = fileUpload.filename.split('.').pop();
const genRan = await UtilsService.genRan(8);
const fullPath = `${path}${genRan}-${fileName}`;
const awsOp = {
adapter: require('skipper-better-s3'),
key: sails.config.aws.access_key,
secret: sails.config.aws.secret_key,
saveAs: fullPath,
bucket: sails.config.aws.bucket,
s3params: {
ACL: 'public-read'
},
};
const config = {
fieldName,
awsOp,
fileExtension,
rootPath,
fileName,
}
const procceed = await processUpload(config);
files.shift();
await process_recur(files, fieldName);
};
try {
const fieldName = req._fileparser.upstreams[0].fieldName;
const files = req.file(fieldName)._files;
await process_recur(files, fieldName);
} catch (e) {
console.log(e, 'inside UploadService');
return false;
}
}
Below is my code using a for loop, which is quite similar to the above.
s3_upload_multi: async (req, res) => {
const generatePath = (rootPath, fieldName) => {
let path;
// this is just a switch statement here to check which fieldName is provided then value of path will depend on it
// as for the other two variable is just checking if upload content type is correct
return { path };
};
const processUpload = async ({
fieldName,
awsOp,
fileExtension,
rootPath,
fileName,
}) => {
return new Promise(function (resolve, reject) {
req.file(fieldName).upload(awsOp, async (err, filesUploaded) => {
if (err) reject(err);
const filesUploadedF = filesUploaded[0]; // F = first file
const response = {
status: true,
errCode: 200,
msg: 'OK',
response: {
url: filesUploadedF.extra.Location,
size: filesUploadedF.size,
type: fileExtension,
filename: filesUploadedF.filename,
key: filesUploadedF.extra.Key,
field: fieldName,
}
};
resolve(response);
});
});
}
try {
const fieldName = req._fileparser.upstreams[0].fieldName;
const files = req.file(fieldName)._files;
for (const file of files) {
const fileUpload = file.stream;
const rootPath = `${sails.config.aws.upload.path.root}`;
const fileCType = fileUpload.headers['content-type'];
// console.log(fileCType, 'fileCType');
const fileName = fileUpload.filename;
const { path } = generatePath(rootPath, fieldName);
const fileExtension = fileUpload.filename.split('.').pop();
// using a variable here because if this is an image, a thumbnail will be created with the same name as the original one
const genRan = await UtilsService.genRan(8);
const fullPath = await `${path}${genRan}-${fileName}`;
const awsOp = {
adapter: require('skipper-better-s3'),
key: sails.config.aws.access_key,
secret: sails.config.aws.secret_key,
saveAs: fullPath,
bucket: sails.config.aws.bucket,
s3params: {
ACL: 'public-read'
},
};
const config = {
fieldName,
awsOp,
fileExtension,
rootPath,
fileName,
}
const procceed = await processUpload(config);
console.log(procceed, 'procceed');
}
} catch (e) {
console.log(e, 'inside UploadService');
return false;
}
}
Which part am I getting wrong that's causing this behavior? I checked my path; it's correct, with the correct filename too, when I console.log it.
Thanks in advance for any suggestions and help.
It took me quite a lot of time to figure this out ages ago.
skipper-better-s3 doesn't have documentation as detailed as skipper's, but going back to the skipper documentation, the saveAs option doesn't only take a string: it also accepts a function, which you can use to get each file's filename and return the key you need. So you actually don't need recursion or a for loop at all.
For example, with some of your code:
const awsOp = {
adapter: require('skipper-better-s3'),
key: sails.config.aws.access_key,
secret: sails.config.aws.secret_key,
saveAs: (__newFileStream, next) => {
// generatePath is what you wrote
// __newFileStream.filename would be the filename of each file before uploading
// the path is pretty much the s3 key which includes your filename too
const { path } = generatePath(rootPath, __newFileStream.filename, fieldName);
return next(undefined, path);
},
bucket: sails.config.aws.bucket,
s3params: {
ACL: 'public-read'
},
};
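Putting it together, a rough sketch of the handler with a single upload call and no loop or recursion; the response helpers are Sails' defaults and the response shape is just an example, not from the original post:
s3_upload_multi: async (req, res) => {
  try {
    const fieldName = req._fileparser.upstreams[0].fieldName;
    const rootPath = `${sails.config.aws.upload.path.root}`;
    const awsOp = { /* the options object shown above, declared here so saveAs can see rootPath and fieldName */ };

    req.file(fieldName).upload(awsOp, (err, filesUploaded) => {
      if (err) return res.serverError(err);
      // filesUploaded is an array; every file got its own key via saveAs, so nothing is overwritten.
      return res.ok(filesUploaded.map(f => ({
        url: f.extra.Location,
        key: f.extra.Key,
        filename: f.filename,
      })));
    });
  } catch (e) {
    return res.serverError(e);
  }
}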
skipper documentation https://www.npmjs.com/package/skipper#customizing-at-rest-filenames-for-uploads

Uploading files with GraphQL to MongoDB with Mongoose

I want to upload a file to MongoDB with a GraphQL resolver.
In server.js I have this helper function to store the file, which is exported so it can be used in my resolver.
The function is based on what I found here: https://github.com/jaydenseric/graphql-upload/issues/8, but some things have changed in graphql-upload since then, for example destructuring the file object. I don't know what should be in the path variable or how I should use createReadStream (the function destructured from file).
const mongoose = require('mongoose');
const Grid = require('gridfs-stream');
const fs = require('fs');
//...
// Connect to Mongo
mongoose
.connect(process.env.mongoURI, {
useNewUrlParser: true,
useCreateIndex: true,
useUnifiedTopology: true,
useFindAndModify: false
}) // Adding new mongo url parser
.then(() => console.log('MongoDB Connected...'))
.catch(err => console.log(err));
const storeFile = async (upload) => {
const { filename, createReadStream, mimetype } = await upload.then(result => result);
const bucket = new mongoose.mongo.GridFSBucket(mongoose.connection.db, { bucketName: 'files' });
const uploadStream = bucket.openUploadStream(filename, {
contentType: mimetype
});
createReadStream()
.pipe(uploadStream)
.on('error', console.log('error'))
.on('finish', console.log('finish'));
}
module.exports = { storeFile }
//...
My resolver (a minimal version here, because for now I only want to upload the file into my database; in one of my tries it even created the fs.files and fs.chunks collections, but without any data):
Mutation: {
uploadFile: async (_, { file }) => {
console.log(file);
const fileId = await storeFile(file);
return true;
}
}
I have this error now:
Unhandled Rejection (Error): GraphQL error: The "listener" argument
must be of type function. Received undefined
and in the terminal 'error' is printed (from the .on('error', console.log('error')) statement).
I can only upload small files (max 60 KB); larger ones just don't upload, but the errors show up on every try.
Ok, I managed to solve it.
resolver mutation:
const { storeFile } = require('../../server');
//...
uploadFile: async (_, { file }) => {
const fileId = await storeFile(file).then(result => result);
return true;
// later I will return something more and create some object etc.
}
supporting function from server.js
const storeFile = async (upload) => {
const { filename, createReadStream, mimetype } = await upload.then(result => result);
const bucket = new mongoose.mongo.GridFSBucket(mongoose.connection.db, { bucketName: 'files' });
const uploadStream = bucket.openUploadStream(filename, {
contentType: mimetype
});
return new Promise((resolve, reject) => {
createReadStream()
.pipe(uploadStream)
.on('error', reject)
.on('finish', () => {
resolve(uploadStream.id)
})
})
}
module.exports = { storeFile }
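For anyone wiring this up from scratch, a minimal schema sketch to match the resolver; it assumes graphql-upload provides the Upload scalar, and the type names are just examples:
// Hypothetical schema definition for the uploadFile mutation.
const typeDefs = `
  scalar Upload

  type Mutation {
    uploadFile(file: Upload!): Boolean!
  }
`;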

How to solve "File name too long" while uploading a local image to Firestore and Firebase Storage in React Native?

I'm trying to upload local images to Cloud Firestore and Firebase Storage.
The PACKAGES I'm using are:
react-native-image-picker to choose an image from local storage
rn-fetch-blob to transfer files as a blob to Firebase Storage
Here's the 2 functions I'm using:
export const uploadPost = async (postImageUri) => {
return new Promise(async(res, rej) => {
const fs = RNFetchBlob.fs;
fs.readFile(postImageUri,'hello','base64')
.then(async (data) => {
console.log('data from RNFetchBlob', data);
const storageRef = storage().ref(`posts/images`).child(`${data}`);
try {
storageRef.putFile(data,{contentType: 'image/jpeg'})
.on(
storage.TaskEvent.STATE_CHANGED,
snapshot => {
console.log('snapshot', snapshot.state);
console.log('progress', (snapshot.bytesTransferred)/(snapshot.totalBytes)*100);
if(snapshot.state === storage.TaskState.SUCCESS){
console.log('SUCCESS')
}
},
error => {
console.log('Image upload error', error);
rej(error)
},
() => {
storageRef.getDownloadURL()
.then((downLoadUri) => {
console.log('File available at: ', downLoadUri);
// addPostInfo
res(downLoadUri)
})
}
)
}catch{err => console.log(err) }
})
} )
}
export const addPostInfo = async (post) => {
console.log('post.postImageUri in addPostInfo', post.postImageUri.slice(10))
const remoteUri = await uploadPost(post.postImageUri);
return new Promise((res, rej) => {
firestore()
.collection('posts')
.add({
id: post.id,
createdAt: post.date,
description: post.description,
image: remoteUri,
location: post.checkin_location
})
.then(ref => {
console.log('Finish adding in addPostInfo', ref)
res(ref)
})
.catch(err => rej(err))
})
}
And in the main screen, I directly send the image's path to the addPostInfo:
const _onSharingPost = () => {
const postCreated = {
id: 'alkdgjhfa;oiughaoghus',
postImageUri: postImageUri,
date: _getCurrentDate(),
description: postDescription,
checkin_location: checkInLocation
}
postActions.addPostInfo(postCreated);
}
The path is like this:
content://com.google.android.apps.photos.contentprovider/0/1/content%3A%2F%2Fmedia%2Fexternal%2Fimages%2Fmedia%2F58/ORIGINAL/NONE/image%2Fjpeg/950379202
ERROR
...base64 string --> File name too long
It got down to the SUCCESS part of the uploadPost function but then still threw an error.
I HAVE TRIED:
Change the path --> cut off the content://
Use XMLHttpRequest to send the blob to the upLoadPost to send that to the storage
As far as I know, the problem must be the base64 string converted from the image path. I know we need to change that to a blob or something, but I don't know how. I can't find where this can be specified in the rn-fetch-blob docs.
PLEASE HELP ME
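One possible direction, sketched here as an assumption rather than a confirmed fix: with react-native-firebase's storage module, putFile accepts a local file path/URI directly, so the base64 string never has to become part of the object name:
// Hypothetical sketch: upload by file path, not by base64 contents.
export const uploadPost = async (postImageUri) => {
  // Use a short, unique name for the object instead of the base64 string.
  const fileName = `post_${Date.now()}.jpg`;
  const storageRef = storage().ref(`posts/images/${fileName}`);

  // putFile takes the local path/URI, so no base64 conversion is needed.
  await storageRef.putFile(postImageUri, { contentType: 'image/jpeg' });

  return storageRef.getDownloadURL();
};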

Upload multiple files to Firebase Storage issues

I am working on a cooking Android app where Firebase is the backend. I need to upload multiple images of a recipe to Firebase Storage and then store the download URLs in the Firebase database.
I managed to upload the files to Firebase, but I am having some trouble getting the download URL of these files.
I have created an array of promises to upload the files, and another array to store the URL of each file, which I get when the upload task finishes.
Here is my code:
var promises = [];
for (var i=0 ;i< prepaImages.length;i++)
{
//alert(prepaImages[i].name);
var storageRef = firebase.storage().ref("receipes"+"/"+category+"/"+title+"/"+uploadTime+prepaImages[i].name );
var uploadTask = storageRef.put(prepaImages[i]);
promises.push(uploadTask);
uploadTask.on('state_changed', snapshot => {
var percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
$("#prepaImageUploadprogress").html(Math.round(percentage)+"%");
$("#prepaImageUploadprogress").attr("style", "width:"+percentage+"%");
}, error => { alert(error) }, () => {
uploadTask.snapshot.ref.getDownloadURL().then(downloadURL => {
//prepaImagesUrl+="+"+downloadURL;
prepaImagesUrl.push(downloadURL);
});
});
The problem is that I am getting an array whose length is the number of uploaded files minus one (the length should equal the number of uploaded files), and every entry has the same value (the same download URL). Any help will be appreciated.
Thank you.
I think the problem is with the promises. I suggest you use Promise.all and await; your code will be more reliable that way. Here is my solution for multiple file upload (adapt it to your variable names):
const array = Array.from({ length: prepaImages.length }, (value, index) => index);
const uploadedImages = await Promise.all(array.map(async index => {
const image = prepaImages[index];
const metadata = { contentType: image.type };
const storageRef = firebase.storage().ref(`receipes/${category}/${title}/${uploadTime}${image.name}`);
const uploadTask = storageRef.put(image, metadata);
const url = await new Promise((resolve, reject) => {
uploadTask.on('state_changed', snapshot => {
const percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
$('#prepaImageUploadprogress').html(`${Math.round(percentage)}%`);
$('#prepaImageUploadprogress').attr('style', `width: ${percentage}%`);
}, error => reject(error),
async () => {
const downloadUrl = await uploadTask.snapshot.ref.getDownloadURL();
resolve(downloadUrl);
});
});
return { name: image.name, url };
}));
uploadedImages will contain an array with the image names and download URLs. You can do this without await, of course, but I prefer it this way.
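To map this back to the question's variables, once Promise.all resolves you can pull just the URLs out of it:
// Collect only the download URLs, matching the question's prepaImagesUrl array.
const prepaImagesUrl = uploadedImages.map(image => image.url);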
UPDATE:
Here is my own code (without error handling) to achieve this. I should mention that I'm using it with React and Redux, along with the Firebase/Firestore wrappers for Redux (redux-firestore and react-redux-firebase), but these are just wrappers:
export const addNewWork = work => async (dispatch, getState, { getFirebase, getFirestore }) => {
const { files, ...restWork } = work;
const firebase = getFirebase();
const firestore = getFirestore();
const storageRef = firebase.storage().ref();
const array = Array.from({ length: files.length }, (value, index) => index);
const uploadedFiles = await Promise.all(array.map(async index => {
const file = files[index];
const metadata = { contentType: file.type };
const uploadTask = storageRef.child(`works/${file.name}`).put(file, metadata);
const url = await new Promise((resolve, reject) => {
uploadTask.on('state_changed', () => {}, error => reject(error), async () => {
const downloadUrl = await uploadTask.snapshot.ref.getDownloadURL();
resolve(downloadUrl);
});
});
return { name: file.name, url };
}));
await firestore.collection('works').add({
...restWork,
image: uploadedFiles[0], // Use only one image for the clean example
createdAt: new Date()
});
});
