I'm building a React Native app with Firebase Storage, and I'm trying to 1) store the image in Storage, 2) retrieve its downloadURL, and 3) save the downloadURL string to Firestore so it can be accessed via the database.
storage().ref('users').child(userKey + '/profileImage').put(image.path).then(() => {
  storage().ref('users').child(userKey + '/profileImage').getDownloadURL().then(image => {
    setProfileImage(image)
    firestore().collection('users').doc(userKey).set({
      profileImage: image
    }).catch((e) => console.log('uploading image error1 => ', e));
  }).catch((e) => console.log('uploading image error2 => ', e));
}).catch((e) => console.log('uploading image error3 => ', e));
I'm getting an error at the first storage call.
What value and type is your image.path variable? If you look at the documentation for Reference.put, you can see that it accepts Blob | Uint8Array | ArrayBuffer, and I somewhat doubt that image.path is one of those types.
image.path is a reference to a file in device storage and does not contain the actual image data.
Assuming image.path is a file URI such as file://path/to/file, you need to fetch the raw image data as a Blob, Uint8Array, or ArrayBuffer.
For simplicity, let's fetch the raw image data as a Blob:
const imgRef = firebase.storage().ref("users/" + userKey + "/profileImage");
// Indicate the content type of the final image (e.g. image/jpeg, image/png)
const metadata = { contentType: "image/jpeg" };
// Fetch the raw image data from the local file URI as a Blob
const blob = await new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onload = function () {
    resolve(xhr.response);
  };
  xhr.onerror = function (e) {
    reject(new TypeError("Network request failed"));
  };
  xhr.responseType = "blob";
  xhr.open("GET", image.path, true);
  xhr.send(null);
});
// Blob data is ready, upload it to the remote Firebase server
await imgRef.put(blob, metadata);
// We're done with the blob, close and release it
blob.close();
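Wiring that back into the original flow (upload, fetch the download URL, then write it to Firestore), here is a minimal sketch based on the question's own code; uriToBlob is a hypothetical helper wrapping the XHR-to-Blob logic above, and storage(), firestore(), setProfileImage, image and userKey all come from the question:
const uploadProfileImage = async (image, userKey) => {
  const imgRef = storage().ref('users').child(userKey + '/profileImage');
  // uriToBlob: hypothetical wrapper around the XHR snippet above
  const blob = await uriToBlob(image.path);
  await imgRef.put(blob, { contentType: 'image/jpeg' });
  blob.close();
  // Retrieve the public download URL and persist it on the user document
  const url = await imgRef.getDownloadURL();
  setProfileImage(url);
  await firestore().collection('users').doc(userKey).set({ profileImage: url });
};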
When attempting to upload a file to Amazon S3 using axios, I have been encountering a very strange issue. Normally, in a web browser, when FormData has binary data in it, the Content-Type header automatically gets set to multipart/form-data; boundary=<some random string>. However, I have been completely unable to achieve that in React Native (testing on an iOS device). The Content-Type is automatically set to application/json, and the body is therefore not detected as correctly formatted when uploading to Amazon S3. I have also tried specifying a blob in the file parameter of FormData instead of the file URI, to no avail. I have appended my code below; any advice would be very much appreciated.
const uploadFileToS3 = (presignedPostData, file) => {
  // create a FormData object
  const formData = new FormData();
  // append the fields from presignedPostData to formData
  Object.keys(presignedPostData.fields).forEach(key => {
    formData.append(key, presignedPostData.fields[key]);
  });
  // append the file and upload
  const getBlob = async () => {
    const img_url = previewPath;
    let result = await fetch(img_url);
    const blob = await result.blob(); // the blob attempt mentioned above; currently unused
    formData.append('Content-Type', 'image/jpeg');
    formData.append('file', {
      uri: previewPath,
      type: 'image/jpeg',
      name: 'test.jpeg',
    });
    console.log(formData, 'wild');
    // post the data to the S3 presigned URL
    axios
      .post(presignedPostData.url, formData)
      .then(function (response) {
        console.log(response);
      })
      .catch(function (error) {
        console.log(error.response);
      });
  };
  getBlob();
};
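The thread doesn't include an accepted fix, but one commonly suggested workaround (an assumption on my part, not from the original post) is to set the multipart Content-Type header explicitly so axios doesn't fall back to application/json; React Native's networking layer then supplies the boundary for the FormData body:
// Hedged sketch: force the multipart Content-Type on the request instead of
// relying on the default. presignedPostData and formData are the same objects
// as in the snippet above.
axios
  .post(presignedPostData.url, formData, {
    headers: { 'Content-Type': 'multipart/form-data' },
  })
  .then(response => console.log(response))
  .catch(error => console.log(error.response));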
I uploaded an image using react-native-image-picker's base64-generated data. It shows up fine in the Firebase console, but when I try to view it in my browser it says "Error Loading Preview", and when I click on it, it is just a black box.
Here is my code for the upload:
const uploadImage = async ({data, filename, uri}) => {
  const ext = uri.split('.').pop();
  const name = uri.split('/').pop();
  const path = `${user.uid}_${name}`;
  const storageRef = firebase.storage().ref(`profiles/${path}`);
  storageRef
    .putString(data, 'base64', {contentType: `image/${ext}`})
    .then(function (snapshot) {
      console.log('SUCCESSSSS');
    });
};

useEffect(() => {
  ImagePicker.showImagePicker((response) => {
    //console.log('Response = ', response);
    if (response.didCancel) {
      console.log('User cancelled image picker');
    } else if (response.error) {
      console.log('ImagePicker Error: ', response.error);
    } else if (response.customButton) {
      console.log('User tapped custom button: ', response.customButton);
    } else {
      console.log(response.uri);
      uploadImage(response);
      setAvatar({uri: response.uri});
    }
  });
}, []);
Edit: I copied the base64 string into an online converter and it gave back the correct image, so there appears to be nothing wrong with the data. Something's wrong with how Firebase is handling it.
Edit: I tried explicitly setting the type to image/jpeg instead of image/jpg, as noted in "Proper way to show image when Firestorage is down or Error loading preview in Firestorage for iOS", but it made no difference.
It looks like there are more than a few bugs involving Firebase's base64 putString method and React Native; see this thread: https://github.com/firebase/firebase-js-sdk/issues/576. I followed yonahforst's answer and ended up using a blob instead. It worked perfectly.
Copied here in case the thread goes away:
function urlToBlob(url) {
  return new Promise((resolve, reject) => {
    var xhr = new XMLHttpRequest();
    xhr.onerror = reject;
    xhr.onreadystatechange = () => {
      if (xhr.readyState === 4) {
        resolve(xhr.response);
      }
    };
    xhr.open('GET', url);
    xhr.responseType = 'blob'; // convert type
    xhr.send();
  });
}
Make sure to add the "data:image/jpeg;base64," prefix if it's not there. I was using react-native-image-picker, which doesn't include it out of the box.
const dataURL = 'data:image/jpeg;base64,' + data;
urlToBlob(dataURL).then((blob) => {
  storageRef
    .put(blob)
    .then(function (snapshot) {
      // return the promise so the final .then runs after the profile is updated
      return snapshot.ref.getDownloadURL().then((link) => {
        console.log('link: ', link);
        return user.updateProfile({photoURL: link});
      });
    })
    .then(() => console.log('SUCCESS'));
});
I'm using Jimp to read in a JSON string in which the image node is a base64-encoded JPEG.
I'm able to successfully convert it to a TIFF and save it:
Jimp.read(Buffer.from(inputImage, "base64"), function(err, image) {
  image.getBuffer(Jimp.MIME_TIFF, function(error, tiff) {
    context.bindings.outputBlob = tiff
    // ...
  });
});
However, when I attempt to embed the TIFF inside a JSON object, the TIFF gets all garbled up:
const response = {
  image: tiff.toString('base64'),
  correlation: correlation
};
context.bindings.outputBlob = response;
Here's the full code:
const Jimp = require("jimp");
module.exports = function(context, myBlob) {
const correlation = context.bindings.inputBlob.correlation;
const inputImage = context.bindings.inputBlob.image;
const imageName = context.bindings.inputBlob.imageName;
context.log(
correlation + "Attempting to convert this image to a tiff: " + imageName
);
Jimp.read(Buffer.from(inputImage, "base64"), function(err, image) {
image.getBuffer(Jimp.MIME_TIFF, function(error, tiff) {
const response = {
image: tiff.toString('base64'),
correlation: correlation
};
context.bindings.outputBlob = response;
context.log(
correlation + "Succesfully converted " + imageName + " to tiff."
);
context.done();
});
});
};
How do we embed the TIFF inside a JSON payload?
If this output is non-negotiable, how would I render the TIFF from the saved payload?
Well, since you confirmed you are looking for output with context.res, here is my working sample. Note that there is a maximum response size, so you can't return every image/file the way I am returning the image here.
const Jimp = require('jimp')

module.exports = async function (context, req)
{
    let response = {}
    try
    {
        let url = 'https://noahwriting.com/wp-content/uploads/2018/06/APPLE-300x286.jpg'
        //call function to download and resize image
        response = await resizeImage(url)
    }
    catch (err)
    {
        response.type = 'application/json'
        if (err.response == undefined)
        {
            context.log(err)
            response.status = 500
            response.data = err
        }
        else
        {
            response.data = err.response.data
            response.status = err.response.status
            context.log(response)
        }
    }
    //response
    context.res =
    {
        headers: { 'Content-Type': `${response.type}` },
        body: response.buf
    }
}

async function resizeImage(url)
{
    //read image to buffer
    let image = await Jimp.read(url)
    //resize image
    image.resize(300, Jimp.AUTO)
    //save to buffer
    let image_buf = await image.getBufferAsync(image.getMIME())
    //image.getMIME() returns something like `image/jpeg`, which is a valid Content-Type for responses.
    return { 'buf': image_buf, 'type': image.getMIME() }
}
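The question also asked how to render the TIFF from the saved payload. As a hedged aside (not part of the sample above, and assuming the payload shape { image, correlation } from the question), one option is to decode the base64 field back into a Buffer and let Jimp re-encode it as a PNG, since browsers generally can't display TIFF directly:
const Jimp = require('jimp')
// Hypothetical helper: takes the saved payload { image, correlation } and
// returns a PNG buffer that a browser can display.
async function tiffPayloadToPng(payload)
{
    // decode the base64 TIFF back into raw bytes
    const tiffBuffer = Buffer.from(payload.image, 'base64')
    // Jimp can read the TIFF buffer and re-encode it as PNG
    const image = await Jimp.read(tiffBuffer)
    return image.getBufferAsync(Jimp.MIME_PNG)
}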
(Off-topic, but I saw that you are using Blob Storage, so...) If you plan on storing photos/files/anything in Azure Blob Storage and you want to retrieve them in some systematic way, you will find out very fast that you can't query the storage directly and you have to deal with ugly XML. My workaround is to create a function that stores photos/files in Blob Storage but then saves the URL path to the file, along with the file name and any other attributes, to a Mongo database. Then I can make very fast queries to retrieve an array of links that point to the respective files, as sketched below.
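A minimal sketch of that pattern, assuming a hypothetical uploadToBlobStorage helper that uploads the file and resolves with its public URL (the connection string, database, and collection names are placeholders as well):
const { MongoClient } = require('mongodb')
// Upload the file to Blob Storage, then index its URL and metadata in Mongo
async function storeFileAndIndex(fileBuffer, fileName, contentType)
{
    // uploadToBlobStorage is a stand-in for your Azure Blob upload code;
    // assume it resolves with the public URL of the uploaded blob.
    const url = await uploadToBlobStorage(fileBuffer, fileName, contentType)
    const client = await MongoClient.connect('mongodb://localhost:27017')
    try
    {
        await client.db('media').collection('files').insertOne({
            name: fileName,
            type: contentType,
            url: url,
            uploadedAt: new Date()
        })
    }
    finally
    {
        await client.close()
    }
    return url
}
Querying the files collection then returns the array of links directly, without touching the Blob Storage XML API.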
Using Node.js, I am trying to get an image from a URL and upload that image to another service without saving the image to disk. I have the following code that works when saving the file to disk and using fs to create a readable stream. But as I am running this as a cron job on a read-only file system (webtask.io), I want to achieve the same result without temporarily saving the file to disk. Shouldn't that be possible?
request(image.Url)
  .pipe(
    fs
      .createWriteStream(image.Id)
      .on('finish', () => {
        client.assets
          .upload('image', fs.createReadStream(image.Id))
          .then(imageAsset => {
            resolve(imageAsset)
          })
      })
  )
Do you have any suggestions on how to achieve this without saving the file to disk? The upload client accepts the following:
client.assets.upload(type: 'file' | 'image', body: File | Blob | Buffer | NodeStream, options = {}): Promise<AssetDocument>
Thanks!
How about passing the buffer down to the upload function, since per your statement it will accept a buffer?
As a side note... This will keep it in memory for the duration of the method execution, so if you call this numerous times you might run out of resources.
const request = require('request');

request.get(url).on('response', function (res) {
  var data = [];
  res.on('data', function (chunk) {
    data.push(chunk);
  }).on('end', function () {
    var buffer = Buffer.concat(data);
    // Pass the buffer to the upload client (see the signature in the question)
    client.assets.upload('image', buffer);
  });
});
I tried various libraries, and it turns out that node-fetch provides a way to return a buffer. So this code works:
const fetch = require('node-fetch');

fetch(image.Url)
  .then(res => res.buffer())
  .then(buffer => client.assets
    .upload('image', buffer, {filename: image.Id}))
  .then(imageAsset => {
    resolve(imageAsset)
  })
Well, I know it has been a few years since the question was originally asked, but I encountered this problem now, and since I didn't find an answer with a comprehensive example, I made one myself.
I'm assuming that the file path is a valid URL and that the end of it is the file name. I need to pass an API key to this endpoint, and a successful upload sends back a response with a token.
I'm using node-fetch and form-data as dependencies.
const fetch = require('node-fetch');
const FormData = require('form-data');

const secretKey = 'secretKey';

const downloadAndUploadFile = async (filePath) => {
  const fileName = new URL(filePath).pathname.split("/").pop();
  const endpoint = `the-upload-endpoint-url`;
  const formData = new FormData();
  let jsonResponse = null;
  try {
    const download = await fetch(filePath);
    const buffer = await download.buffer();
    if (!buffer) {
      console.log('file not found', filePath);
      return null;
    }
    formData.append('file', buffer, fileName);
    const response = await fetch(endpoint, {
      method: 'POST', body: formData, headers: {
        ...formData.getHeaders(),
        "Authorization": `Bearer ${secretKey}`,
      },
    });
    jsonResponse = await response.json();
  } catch (error) {
    console.log('error on file upload', error);
  }
  return jsonResponse ? jsonResponse.token : null;
}
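Usage is then a single call; the URL below is only a placeholder:
// Example call with a placeholder URL; resolves with the token returned by
// the upload endpoint, or null if the upload failed.
downloadAndUploadFile('https://example.com/images/photo.jpg')
  .then(token => console.log('upload token:', token));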
When I drop a file in the upload area, react-dropzone returns an object such as:
let picture = [
  {
    "rawFile": {
      "preview": "blob:http://localhost:3000/ff851b03-b2c0-4212-9240-8d07057ad47d"
    },
    "src": "blob:http://localhost:3000/ff851b03-b2c0-4212-9240-8d07057ad47d",
    "title": "1397-01-20 13.43.24.jpg"
  }
]
I read this link and tried to upload the file: React dropzone, how to upload image?
But it seems the file is not being sent.
This is my code:
let formData = new FormData();
formData.append('file', picture[0]);

fetch('http://localhost:8000/api/media', {
  method: 'POST',
  body: formData
});
If this method is not correct, how do I send the file to the server side and receive it there?
On the server side, I'm using Hapi.js.
I solved the problem. I'm writing the answer because nobody else answered this question.
On the client side, I use the FileReader API to read the blob data and convert it to a base64 string. I wrote a function to convert the blob to base64, then send the fileName and base64 to the server side (a usage sketch follows the function below).
const convertFileToBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file.rawFile);
  reader.onload = () => resolve({
    fileName: file.title,
    base64: reader.result
  });
  reader.onerror = reject;
});
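A usage sketch for sending the result to the server (the endpoint is the same http://localhost:8000/api/media route from the question):
// Convert the dropped picture and POST the JSON payload to the server
convertFileToBase64(picture[0]).then(payload =>
  fetch('http://localhost:8000/api/media', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload) // { fileName, base64 }
  })
);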
On the server side, I write the base64 data to a file with this function:
const fs = require("fs");
const Boom = require('boom');

function convertBase64ToFile(file) {
  // Strip the "data:image/jpeg;base64," prefix to get the raw base64 data
  let base64Data = file.base64.split(',')[1];
  fs.writeFile(`${__dirname}/../../uploads/${file.fileName}`, base64Data, 'base64', function(err) {
    if (err) {
      return Boom.badData(err);
    }
  });
  // Other actions...
}
This method works for me perfectly.
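For completeness, a minimal sketch of a Hapi route that could receive this payload; the path and reply shape are assumptions, not from the original answer:
// Hypothetical Hapi (v17+) route; assumes convertBase64ToFile from above is in scope
server.route({
  method: 'POST',
  path: '/api/media',
  handler: (request, h) => {
    // request.payload is the parsed JSON body: { fileName, base64 }
    convertBase64ToFile(request.payload);
    return h.response({ ok: true }).code(201);
  }
});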