I'm building an Ionic app with Angular and Firebase.
I want to be able to upload an image or a GIF to my Firebase storage, but I've only been able to get still images to work. Also, I don't want videos.
My code is as follows:
takePhoto(sourceType:number) {
const options: CameraOptions = {
quality: 40,
destinationType: this.camera.DestinationType.DATA_URL,
encodingType: this.camera.EncodingType.JPEG,
mediaType: this.camera.MediaType.PICTURE,
correctOrientation: true,
sourceType:sourceType,
}
this.camera.getPicture(options).then((imageData) => {
let base64Image = 'data:image/jpeg;base64,' + imageData;
this.uploadToStorage(base64Image);
}, (err) => {
// Handle error
});
}
uploadToStorage(src) {
this.uploadProgress = true;
let storageRef = firebase.storage().ref();
// Create a timestamp as filename
this.imageFileName = Math.floor(Date.now() / 1000) + "_" + this.userData.uid;
// Create a reference to 'posts/<timestamp>_<uid>.jpg'
const imageRef = storageRef.child('posts/'+this.imageFileName+'.jpg');
imageRef.putString(src, firebase.storage.StringFormat.DATA_URL).then((snapshot)=> {
snapshot.ref.getDownloadURL().then(downloadURL => {
this.imageURL = downloadURL;
this.uploadProgress = false;
this.uploadSuccess = true;
console.log(this.imageURL)
this.logEvent("Uploaded Image");
});
}, (err) => {
console.log(err)
});
}
But this only allows still images. According to the docs for the Ionic Camera
you can change mediaType: this.camera.MediaType.PICTURE to mediaType: this.camera.MediaType.ALLMEDIA, but that doesn't work for me. It works when I'm testing on my computer, but not on iOS or Android.
Any ideas on how I can allow both images and GIFs to be selected? Thank you!
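For reference, a minimal sketch of the options I would expect ALLMEDIA to need. The FILE_URI destination and the dropped encodingType are assumptions, since DATA_URL plus JPEG encoding would presumably re-encode an animated GIF into a still image, and mediaType only applies when picking from the library or album rather than the camera:
const options: CameraOptions = {
  quality: 40,
  // FILE_URI keeps the original file (and a GIF's animation) instead of re-encoding it
  destinationType: this.camera.DestinationType.FILE_URI,
  // ALLMEDIA only has an effect for PHOTOLIBRARY / SAVEDPHOTOALBUM sources
  mediaType: this.camera.MediaType.ALLMEDIA,
  sourceType: this.camera.PictureSourceType.SAVEDPHOTOALBUM,
  correctOrientation: true,
};
this.camera.getPicture(options).then((fileUri) => {
  // fileUri is a device path; it would need to be read into a Blob (e.g. with the File plugin)
  // and uploaded with storageRef.put() instead of putString()
}, (err) => {
  console.error(err);
});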
photos = [];
OpenGallery(){
console.log("taktit");
const options: CameraOptions = {
quality: 100,
destinationType: this.camera.DestinationType.DATA_URL,
sourceType: this.camera.PictureSourceType.SAVEDPHOTOALBUM
}
this.camera.getPicture(options).then((imageData) => {
// imageData is either a base64 encoded string or a file URI
// If it's base64:
console.log("taktit");
let base64Image = 'data:image/jpeg;base64,' + imageData;
this.photos.push(base64Image);
this.photos.reverse();
}, (err) => {
// Handle error
});
}
Now you can upload your photos array to your Firebase storage.
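For example, a minimal sketch of pushing one of those base64 data URLs to Storage afterwards; the 'posts/' path and the timestamp file name are placeholders, and the putString call mirrors the one from the question:
uploadPhoto(dataUrl: string) {
  // dataUrl is one entry of this.photos, e.g. 'data:image/jpeg;base64,...'
  const storageRef = firebase.storage().ref();
  const name = Math.floor(Date.now() / 1000) + '.jpg'; // placeholder file name
  return storageRef.child('posts/' + name)
    .putString(dataUrl, firebase.storage.StringFormat.DATA_URL)
    .then(snapshot => snapshot.ref.getDownloadURL());
}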
I tried to resize or compress an image before uploading it to Google Cloud Storage.
The upload works fine but the resizing does not seem to work.
Here is my code:
const uploadImage = async (file) => new Promise((resolve, reject) => {
let { originalname, buffer } = file
sharp(buffer)
.resize(1800, 948)
.toFormat("jpeg")
.jpeg({ quality: 80 })
.toBuffer()
const blob = bucket.file(originalname.replace(/ /g, "_"))
const blobStream = blob.createWriteStream({
resumable: false
})
blobStream.on('finish', () => {
const publicUrl = format(
`https://storage.googleapis.com/${bucket.name}/${blob.name}`
)
resolve(publicUrl)
}).on('error', () => {
reject(`Unable to upload image, something went wrong`)
})
.end(buffer)
})
I ran into the same issue on a project I was working on. After lots of trial and error I found the following solution. It might not be the most elegant, but it worked for me.
In my upload route function I created a new thumbnail image object from the original file's values and passed it as the file parameter to the uploadFile function for Google Cloud Storage.
Inside my upload image route function:
const file = req.file;
const thumbnail = {
fieldname: file.fieldname,
originalname: `thumbnail_${file.originalname}`,
encoding: file.encoding,
mimetype: file.mimetype,
buffer: await sharp(file.buffer).resize({ width: 150 }).toBuffer()
}
const uploadThumbnail = await uploadFile(thumbnail);
My Google Cloud Storage upload file function:
const uploadFile = async (file) => new Promise((resolve, reject) => {
const gcsname = file.originalname;
const bucketFile = bucket.file(gcsname);
const stream = bucketFile.createWriteStream({
resumable: false,
metadata: {
contentType: file.mimetype
}
});
stream.on('error', (err) => {
reject(err);
});
stream.on('finish', (res) => {
resolve({
name: gcsname
});
});
stream.end(file.buffer);
});
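For context, req.file in the route snippet above comes from an upload middleware; a sketch of how that is typically wired up with multer is below (the '/upload' path and the 'image' field name are placeholders, not taken from the answer):
const express = require('express');
const multer = require('multer');

const app = express();
// memoryStorage keeps the upload in file.buffer, which is what sharp and uploadFile expect
const upload = multer({ storage: multer.memoryStorage() });

app.post('/upload', upload.single('image'), async (req, res) => {
  const file = req.file; // same shape as in the snippets above
  // ...build the thumbnail object and call uploadFile() as shown...
  res.sendStatus(201);
});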
I think the problem is with toFormat(); I couldn't find that function in the docs. Can you try removing it and checking whether it works?
sharp(buffer)
.resize(1800, 948)
.jpeg({ quality: 80 })
.toBuffer()
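Whichever way that turns out, one more thing worth noting about the original snippet: the promise from toBuffer() is never awaited or used, so blobStream.end(buffer) writes the untouched original bytes. A minimal sketch of awaiting the resized buffer before writing it, reusing the bucket and naming from the question:
const uploadImage = async (file) => {
  const { originalname, buffer } = file;
  // wait for sharp to finish and keep the resized JPEG bytes
  const resized = await sharp(buffer)
    .resize(1800, 948)
    .jpeg({ quality: 80 })
    .toBuffer();
  const blob = bucket.file(originalname.replace(/ /g, "_"));
  return new Promise((resolve, reject) => {
    blob.createWriteStream({ resumable: false })
      .on('finish', () => resolve(`https://storage.googleapis.com/${bucket.name}/${blob.name}`))
      .on('error', () => reject(`Unable to upload image, something went wrong`))
      .end(resized); // write the resized buffer, not the original one
  });
};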
Modify the metadata once you have finished uploading the image.
import * as admin from "firebase-admin";
import * as functions from "firebase-functions";
import { log } from "firebase-functions/logger";
import * as sharp from "sharp";
export const uploadFile = functions.https.onCall(async (data, context) => {
const bytes = data.imageData;
const bucket = admin.storage().bucket();
const buffer = Buffer.from(bytes, "base64");
const bufferSharp = await sharp(buffer)
.png()
.resize({ width: 500 })
.toBuffer();
const nombre = "IMAGE_NAME"; // placeholder base name
const fileName = `img/${nombre}.png`;
const fileUpload = bucket.file(fileName);
const uploadStream = fileUpload.createWriteStream();
uploadStream.on("error", async (err) => {
log("Error uploading image", err);
throw new functions.https.HttpsError("unknown", "Error uploading image");
});
uploadStream.on("finish", async () => {
await fileUpload.setMetadata({ contentType: "image/png" });
log("Upload success");
});
uploadStream.end(bufferSharp);
});
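On the client, a call to that function might look like the sketch below, assuming the namespaced (v8-style) web SDK used elsewhere on this page and the 'uploadFile' callable name from the snippet above. Note that, as written, the function returns before the stream's finish event fires, so the callable may resolve before the metadata has been set:
import firebase from 'firebase/app';
import 'firebase/functions';

async function uploadImage(base64: string) {
  // the callable reads data.imageData and expects a raw base64 string
  const callable = firebase.functions().httpsCallable('uploadFile');
  await callable({ imageData: base64 });
}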
Hello, I have already read and attempted the approach from this thread: Saving desktopCapturer to video file in Electron
This is what I have so far:
const { desktopCapturer } = require('electron')
var fs = require('fs');
// declared here so both handleStream() and stopRecording() can see them
let recorder;
let blobs = [];
function startRecording(){
desktopCapturer.getSources({ types: ['window', 'screen'] }).then(async sources => {
for (const source of sources) {
if (source.name === 'Entire Screen') {
try {
const stream = await navigator.mediaDevices.getUserMedia({
audio: false,
video: {
mandatory: {
chromeMediaSource: 'desktop',
chromeMediaSourceId: source.id,
minWidth: 1280,
maxWidth: 1280,
minHeight: 720,
maxHeight: 720
}
}
})
handleStream(stream)
} catch (e) {
handleUserMediaError(e)
}
return
}
}})
}
function handleStream(stream) {
recorder = new MediaRecorder(stream);
blobs = [];
recorder.ondataavailable = function(event) {
blobs.push(event.data);
};
recorder.start();
}
function stopRecording() {
recorder.stop();
console.log(blobs);
toArrayBuffer(new Blob(blobs, {type: 'video/webm'}), function(ab) {
var buffer = toBuffer(ab);
var file = `./videos/example.webm`;
fs.writeFile(file, buffer, function(err) {
if (err) {
console.error('Failed to save video ' + err);
} else {
console.log('Saved video: ' + file);
}
});
});
}
function handleUserMediaError(e) {
console.error('handleUserMediaError', e);
}
function toArrayBuffer(blob, cb) {
let fileReader = new FileReader();
fileReader.onload = function() {
let arrayBuffer = this.result;
cb(arrayBuffer);
};
fileReader.readAsArrayBuffer(blob);
}
function toBuffer(ab) {
let buffer = Buffer.alloc(ab.byteLength);
let arr = new Uint8Array(ab);
for (let i = 0; i < arr.byteLength; i++) {
buffer[i] = arr[i];
}
return buffer;
}
// Record for 3.5 seconds and save to disk
startRecording();
setTimeout(function() { stopRecording() }, 3500)
I just want to save the recorded video to a file.
The file ends up being empty when it saves. I have been stuck on this for a while and would appreciate any advice. Thank you!
Welp I stumbled upon this thread: Saving desktopCapturer to video file from Electron app
and changed my stopRecording() function to:
function stopRecording() {
const save = () =>{
toArrayBuffer(new Blob(blobs, {type: 'video/webm'}), function(ab) {
var buffer = toBuffer(ab);
var file = `./videos/example.webm`;
fs.writeFile(file, buffer, function(err) {
if (err) {
console.error('Failed to save video ' + err);
} else {
console.log('Saved video: ' + file);
}
});
});
}
recorder.onstop = save;
recorder.stop();
}
and it seems to be working. Cool! Since the recorder only hands over its final chunk of data when the stop event fires, saving inside onstop guarantees blobs is complete, whereas saving immediately after calling stop() raced against that last dataavailable event and produced an empty file.
Hey, I am trying to upload a cropped image to Firebase.
I would prefer to use the Ionic Native "image-picker" and "Crop" plugins.
I really don't know how to upload the image after cropping it, because the crop only returns the path of the new image.
I have already tried something like the following. It worked, but I was not able to crop the image. And as I mentioned, I would prefer using the native tools anyway.
export interface UploadData {
name: string;
filepath: string;
size: number;
}
uploadFile(event: FileList) {
// The File object
const file = event.item(0);
// Validation for Images Only
if (file.type.split('/')[0] !== 'image') {
console.error('unsupported file');
return;
}
// The storage path
const path = `whatever/${new Date().getTime()}_${file.name}`;
// File reference
const fileRef = this.storage.ref(path);
// The main task
this.task = this.storage.upload(path, file, { customMetadata });
this.snapshot = this.task.snapshotChanges().pipe(
finalize(() => {
// Get uploaded file storage path
this.UploadedFileURL = fileRef.getDownloadURL();
this.UploadedFileURL.subscribe(resp => {
this.addImagetoDB({
name: file.name,
filepath: resp,
size: this.fileSize
});
}, error => {
console.error(error);
});
}),
tap(snap => {
this.fileSize = snap.totalBytes;
})
);
}
addImagetoDB(image: UploadData) {
const id = this.db.createId();
// Set document id with value in database
this.imageCollection.doc(id).set(image).then(resp => {
console.log(resp);
}).catch(error => {
console.log('error ' + error);
});
}
}
This is how I would like to do it, but I really have no idea how to upload the image at this point.
pickImage() {
this.imagePicker.getPictures(this.imagePickerOptions).then((results)
=> {
// tslint:disable-next-line: prefer-for-of
for (let i = 0; i < results.length; i++) {
this.cropImage(results[i]);
}
}, (err) => {
alert(err);
});
}
cropImage(imgPath) {
this.crop.crop(imgPath, { quality: 50 })
.then(
newPath => {
// ?????
},
error => {
alert('Error cropping image' + error);
}
);
}
Sorry, I am very new to this stuff.
Thanks for your help :)
It seems that you might be able to do this without needing the crop feature at all.
These are the options according to the docs:
options = {
// Android only. Max images to be selected, defaults to 15. If this is set to 1, upon
// selection of a single image, the plugin will return it.
maximumImagesCount: int,
// max width and height to allow the images to be. Will keep aspect
// ratio no matter what. So if both are 800, the returned image
// will be at most 800 pixels wide and 800 pixels tall. If the width is
// 800 and height 0 the image will be 800 pixels wide if the source
// is at least that wide.
width: int,
height: int,
// quality of resized image, defaults to 100
quality: int (0-100),
// output type, defaults to FILE_URIs.
// available options are
// window.imagePicker.OutputType.FILE_URI (0) or
// window.imagePicker.OutputType.BASE64_STRING (1)
outputType: int
};
So you could use:
options = {
maximumImagesCount: 3,
width: 800,
height: 600,
quality: 50,
outputType: 1
};
From what I've been researching, you could then put the image into Firebase Storage using:
storageRef.putString("Your base64 string substring variable", 'base64');
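For instance, a fuller sketch that combines the base64 output with putString; the 'posts/' path and file names are placeholders, and outputType: 1 is assumed so the plugin returns raw base64 strings without a data: prefix:
pickAndUpload() {
  const options = { maximumImagesCount: 3, width: 800, height: 600, quality: 50, outputType: 1 };
  this.imagePicker.getPictures(options).then((results: string[]) => {
    results.forEach((base64, i) => {
      const ref = firebase.storage().ref().child('posts/' + Date.now() + '_' + i + '.jpg');
      // 'base64' is the StringFormat matching the plugin's raw output
      ref.putString(base64, 'base64', { contentType: 'image/jpeg' })
        .then(snapshot => snapshot.ref.getDownloadURL())
        .then(url => console.log(url));
    });
  }, (err) => alert(err));
}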
I'm not sure if this is enough to get you fixed up but I thought I would post what I had found anyway.
I just tried this, but it doesn't work either. I have no idea why...
constructor(private imagePicker: ImagePicker, private crop: Crop,
private file: File) {
let storageDb = firebase.storage();
this.storageRef = storageDb.ref();
}
pickImage() {
this.imagePicker.getPictures(this.imagePickerOptions).then((results)
=> {
// tslint:disable-next-line: prefer-for-of
for (let i = 0; i < results.length; i++) {
this.cropImage(results[i]);
}
}, (err) => {
alert(err);
});
}
cropImage(imgPath) {
this.crop.crop(imgPath, { quality: 50 })
.then(
newPath => {
try {
let n = newPath.lastIndexOf("/");
let x = newPath.lastIndexOf("g");
let nameFile = newPath.substring(n + 1, x + 1);
this.file.readAsArrayBuffer(newPath, nameFile).then((res) => {
let blob = new Blob([res], { type: "image/jpeg" });
var uploadTask = this.storageRef.child('images/' + this.event.id).put(blob);
uploadTask.on('state_changed', (snapshot) => {
let url = uploadTask.snapshot.downloadURL;
this.croppedImagepath = url;
}, (error) => {
alert("error: " + error);
}, () => {
alert("uploaded");
let url = uploadTask.snapshot.downloadURL;
this.croppedImagepath = url;
})
})
}
catch (z) {
alert('error beim erstellen des blobs' + z);
}
},
error => {
alert('Error cropping image' + error);
}
);
}
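One observation, not something confirmed in this thread: in recent Firebase JS SDK versions uploadTask.snapshot.downloadURL no longer exists and comes back undefined, and the URL has to be fetched from the reference once the upload completes. A sketch of the completion callback under that assumption:
uploadTask.on('state_changed', (snapshot) => {
  // progress handling could go here
}, (error) => {
  alert('error: ' + error);
}, () => {
  // on completion, fetch the URL from the reference instead of snapshot.downloadURL
  uploadTask.snapshot.ref.getDownloadURL().then(url => {
    this.croppedImagepath = url;
    alert('uploaded');
  });
});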
I'm trying to upload to my Storage bucket and then get the download URL for that uploaded file right after the upload is done. This was working previously but has since stopped for some reason.
My console.log just prints null. I was hoping for a solution or even a better way of doing this. Any help would be awesome!
I'm using Angular 5.
Here is my current method:
upload(event) {
this.showProgressBar = true;
const randomId = Math.random().toString(36).substring(2);
this.ref = this.afStorage.ref(randomId);
this.uploadTask = this.ref.put(event.target.files[0]);
this.uploadProgress = this.uploadTask.percentageChanges().subscribe(progress => {
console.log(progress);
document.querySelector('#progressBar').style.width = progress + "%";
if(progress === 100){
this.showProgressBar = false;
this.showUploaded = true;
this.downloadURL = this.uploadTask.downloadURL().subscribe(url => {
console.log(url);
});
}
});
}
Here is the way I coded the file upload process using AngularFire2.
public selectedFile: FileList;
chooseFile(event) {
this.selectedFile = event.target.files;
}
uploadImage() {
const file = this.selectedFile.item(0);
const key = 'uploads/' + '/' + Math.floor(Math.random() * 1000000) + file.name;
const upload = this.stroge.upload(key, file).then(() => {
const ref = this.stroge.ref(key);
const downloadURL = ref.getDownloadURL().subscribe(url => {
this.Updateprofile(url);
});
});
}
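An alternative sketch that stays closer to the original upload() method is to read the URL inside a finalize() on snapshotChanges(), on the assumption that downloadURL() was removed from the task itself in later AngularFire versions:
import { finalize } from 'rxjs/operators';

upload(event) {
  const randomId = Math.random().toString(36).substring(2);
  const ref = this.afStorage.ref(randomId);
  const task = ref.put(event.target.files[0]);

  // percentageChanges() still drives the progress bar
  task.percentageChanges().subscribe(progress => console.log(progress));

  // finalize() runs once the upload stream completes
  task.snapshotChanges().pipe(
    finalize(() => ref.getDownloadURL().subscribe(url => console.log(url)))
  ).subscribe();
}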
Here is the part of the component I’m using in an Ionic app to upload up to 5 pics
uploadPicture(i) {
let that=this;
this.cameraPlugin.getPicture({
quality: 100,
destinationType: this.cameraPlugin.DestinationType.DATA_URL,
sourceType: this.cameraPlugin.PictureSourceType.CAMERA,
allowEdit: true,
encodingType: this.cameraPlugin.EncodingType.PNG,
//targetWidth: 500,
//targetHeight: 500,
saveToPhotoAlbum: true
}).then(
imageData => {
// Send the picture to Firebase Storage
const selfieRef = this.addPictureFile();
var uploadTask = selfieRef.putString(imageData, 'base64', {contentType: 'image/png'});
// Register three observers:
// 1. 'state_changed' observer, called any time the state changes
// 2. Error observer, called on failure
// 3. Completion observer, called on successful completion
that.uploading = true;
that.picsCtrl[i].buttonDisabled = true;
uploadTask.on('state_changed',
function(snapshot) {
// Get task progress, including the number of bytes uploaded and the total number of bytes to be uploaded
var progress = (uploadTask.snapshot.bytesTransferred / uploadTask.snapshot.totalBytes) * 100;
that.loadProgress = progress;
switch (uploadTask.snapshot.state) {
case firebase.storage.TaskState.PAUSED: // or 'paused'
console.log('Upload is paused');
break;
case firebase.storage.TaskState.RUNNING: // or 'running'
console.log('Upload is running');
break;
}
}, function(error) {
// Handle unsuccessful uploads
}, function() {
// Handle successful uploads on complete
// For instance, get the download URL: https://firebasestorage.googleapis.com/...
uploadTask.snapshot.ref.getDownloadURL().then(function(downloadURL) {
var imageURL = selfieRef.getDownloadURL().then(url => {
that.uploading = false;
that.picsCtrl[i].imgSrc = url;
that.picsCtrl[i].buttonHidden = !that.picsCtrl[i].buttonHidden;
that.picsCtrl[i].imgHidden = !that.picsCtrl[i].imgHidden;
that.addPictureRef(url)
.then( keyRef => {
that.picsCtrl[i].imgKey = keyRef.key;
that.picsCtrl = that.createBucket(that.picsCtrl);
});
});
});
});
},
error => {
console.log(error);
}
);
}
The relevant parts of the code are:
Using that instead of this inside the upload callbacks, so the component instance is still reachable while the stream is being uploaded.
The use of several variables to show the progress bar and update the progress, plus the final callback that gets the URL and updates the view.
In my app there’s a component that handles the process of uploading and removing images.
I am trying to upload pictures to my server in Ionic 3, but sometimes it works and sometimes it doesn't. I think it's because of the asynchrony.
The flow is:
In my page I have a button that takes the picture and saves the photo (this step works fine).
Take picture:
takePhoto() {
this.camera.getPicture({
quality: 80,
destinationType: this.camera.DestinationType.FILE_URI,
targetWidth: 150,
targetHeight: 150,
correctOrientation: true
}).then(imageUri => {
this.myPhoto = imageUri;
}).catch(error => console.warn(error))
}
Then, once I have this picture, the "send photo" button becomes available. If I click that button, the next step is the following:
private uploadPhoto(): void {
if (this.myPhoto != undefined || !this.myPhoto || this.myPhoto != null) {
let imageFileUri = this.myPhoto;
this.error = null;
this.loading = this.loadingCtrl.create({
content: 'Cargando imagen...'
});
this.loading.present();
this.file.resolveLocalFilesystemUrl(imageFileUri)
.then(entry => (<FileEntry>entry).file(file => this.readFile(file)))
.catch(err => {
alert(JSON.stringify(err));
});
}
else {
this.presentAlertFailText("No hay ninguna imagen", "Selecciona una e inténtalo de nuevo.")
}
}
and this function calls the next one, where I call my service to upload the picture to my server:
private readFile(file: any) {
const reader = new FileReader();
reader.onloadend = () => {
const formData = new FormData();
const imgBlob = new Blob([reader.result], {type: file.type});
formData.append('profile', imgBlob, file.name);
this._up.uploadUserPicture(formData).then(res => {
localStorage.setItem('PHOTO', res['result']);
this.loading.dismiss();
}, err => {
this.loading.dismiss();
this.presentToast('Imagen no subida');
}).catch(err => {
this.loading.dismiss();
this.presentToast('Imagen no subida');
})
};
reader.readAsArrayBuffer(file);
}
I think the problem is with the promises; that would explain why it sometimes works and sometimes doesn't.
I think I need to refactor the code in this line:
.then(entry => (<FileEntry>entry).file(file => this.readFile(file)))
in the uploadPhoto function, but I don't know how to do that.
Do you have any suggestions?
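A minimal sketch of how that line could be refactored so the FileEntry callback is wrapped in a promise and readFile only runs once the File object is actually available (an illustration against the same File plugin API used above, not tested in this project):
this.file.resolveLocalFilesystemUrl(imageFileUri)
  .then(entry => new Promise<any>((resolve, reject) =>
    // FileEntry.file() takes a success and an error callback
    (<FileEntry>entry).file(resolve, reject)
  ))
  .then(file => this.readFile(file))
  .catch(err => {
    alert(JSON.stringify(err));
  });
From there, readFile could also return the promise from this._up.uploadUserPicture(...) so the whole chain can be awaited and any failure ends up in a single catch.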