How to upload files using Axios when we know the full path? - javascript

I'm trying to upload a file using Axios, but I only have the file path as a string. The code below works:
<input
  id="select-files"
  style="visibility: hidden"
  type="file"
  multiple
  @change="handleFilesUpload($event)"
/>
But when I try to use createReadStream it does not work. I wonder how I can convert these file paths into something like event.target.files.
I already tried the code below, but it does not work:
let data = {
  THE_FILE: "",
  BRANCH_ID: this.$store.state.starv.localUser.DOCTOR_INFO["BRANCH_ID"],
  ACC_NO: this.locationItem["ACC_NO"],
  CHART_NO: this.locationItem["CHART_NO"],
  EMP_ID: this.$store.state.starv.localUser.DOCTOR_INFO["EMP_ID"],
  CO_EMP_ID: this.doctorList.toString(),
  ST: "telehealthclient",
  NEW_NAME: "",
  MAID: LocalData.getComputerId(),
}
/*
Iterate over any file sent over appending the files to the form data.
*/
data["THE_FILE"] = window.fs.createReadStream(filePath)
let bodyFormData = new FormData()
// if (THE_FILE) {
//   bodyFormData.append("THE_FILE", THE_FILE)
// }
for (let key in data) {
  bodyFormData.append(key, data[key])
}
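For context: a browser FormData only accepts Blob, File, or string values, and anything else is coerced to a string, so a Node ReadStream ends up being sent as the text "[object Object]" instead of the file bytes. A quick way to confirm this (assuming window.fs is Node's fs module exposed to the renderer, as in the snippet above):

let form = new FormData()
form.append("THE_FILE", window.fs.createReadStream(filePath))
// The stream is not a Blob/File, so FormData coerces it to a string:
console.log(form.get("THE_FILE")) // "[object Object]"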

I already found the solution to this problem. What we need to do is below:
Encode our file to base64:
base64_encode(file) {
  // read binary data
  let bitmap = window.fs.readFileSync(file);
  // convert binary data to a base64 encoded string (Buffer.from replaces the deprecated new Buffer)
  return Buffer.from(bitmap).toString("base64");
},
Create a File object from our base64 data URL:
dataURLtoFile(dataurl, filename) {
  const arr = dataurl.split(",");
  const mime = arr[0].match(/:(.*?);/)[1];
  const bstr = atob(arr[1]);
  let n = bstr.length;
  const u8arr = new Uint8Array(n);
  while (n) {
    u8arr[n - 1] = bstr.charCodeAt(n - 1);
    n -= 1; // to make eslint happy
  }
  return new File([u8arr], filename, { type: mime });
},
Create form data with the form-data library and upload it with Axios as multipart/form-data.
The full code is below:
base64_encode(file) {
  // read binary data
  let bitmap = window.fs.readFileSync(file);
  // convert binary data to a base64 encoded string (Buffer.from replaces the deprecated new Buffer)
  return Buffer.from(bitmap).toString("base64");
},
dataURLtoFile(dataurl, filename) {
  const arr = dataurl.split(",");
  const mime = arr[0].match(/:(.*?);/)[1];
  const bstr = atob(arr[1]);
  let n = bstr.length;
  const u8arr = new Uint8Array(n);
  while (n) {
    u8arr[n - 1] = bstr.charCodeAt(n - 1);
    n -= 1; // to make eslint happy
  }
  return new File([u8arr], filename, { type: mime });
},
uploadScreenRecord(data) {
  return new Promise((resolve, reject) => {
    // #1 Convert the file to base64 first
    let base64_video = this.base64_encode(data.file);
    // #2 Create a File object from the base64 data URL
    let filename = path.parse(data.file).base;
    let fileURLobject = this.dataURLtoFile(
      "data:video/mp4;base64," + base64_video,
      filename
    );
    // #3 Create form data from the form-data library (formData_ is the imported constructor)
    const formData = new formData_();
    formData.append("THE_FILE", fileURLobject, filename);
    for (let key in data) {
      if (key != "file") {
        formData.append(key, data[key]);
      }
    }
    // #4 Send it to the server
    let url;
    if (SETTING.imedtacDomain.hostname == undefined) {
      url = SETTING.webservice.imedtacProtocol + "//" + defaultDomain;
    } else {
      url =
        SETTING.webservice.imedtacProtocol + "//" + SETTING.imedtacDomain.hostname;
    }
    axios
      .post(url + SETTING.imedtacAPI.uploadFile.URL, formData, {
        headers: {
          "Access-Control-Allow-Origin": "*",
          "Content-Type": "multipart/form-data",
        },
        timeout: 30000,
      })
      .then(function (response) {
        // Return the uploaded file URL
        console.log("[File debug] This is upload file response %o", response);
        if (response.data.success) {
          resolve(response.data.FILE_URL);
        } else {
          reject(response.data.message + " screen record upload error");
        }
      })
      .catch(function (error) {
        reject(error);
      });
  });
},
submitFilesTest() {
  return new Promise((resolve, reject) => {
    // Data
    let data = {
      file: "/Users/ivanhutomo/Downloads/B0120221214151850_A.mp4",
      BRANCH_ID: "xxx",
      ACC_NO: "xx",
      CHART_NO: "xx",
      EMP_ID: "xx",
      CO_EMP_ID: "xx",
      ST: "xx",
      NEW_NAME: "",
      MAID: "xx",
    };
    this.uploadScreenRecord(data)
      .then((response) => {
        logger.debug("[File debug] File upload URL %o", response);
        resolve(response);
      })
      .catch((error) => {
        logger.debug("[File debug] File %o", error);
        reject(error);
      });
  });
},
Then call it:
submitFilesTest()
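As an aside, when this code runs where Node APIs are available (window.fs suggests an Electron-style setup), the base64 round trip can be skipped entirely: the form-data npm package accepts Buffers and streams directly. This is a minimal sketch of that alternative, not the code above; the URL and field values are placeholders:

const fs = require("fs")
const path = require("path")
const axios = require("axios")
const FormData = require("form-data") // Node implementation that accepts streams and Buffers

async function uploadFromPath(filePath, fields) {
  const form = new FormData()
  // Stream the file straight from disk; form-data handles the multipart encoding.
  form.append("THE_FILE", fs.createReadStream(filePath), path.basename(filePath))
  for (const [key, value] of Object.entries(fields)) {
    form.append(key, value)
  }
  // form.getHeaders() supplies the Content-Type header with the correct boundary.
  const response = await axios.post("https://example.com/upload", form, {
    headers: form.getHeaders(),
    timeout: 30000,
  })
  return response.data
}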

Related

Upload byte array from axios to Node server

Background
The JavaScript library for Microsoft Office add-ins lets you get the raw content of a DOCX file through the getFileAsync() API, which returns slices of up to 4 MB each. You keep calling the function in a sliding-window fashion until you have read the entire content. I need to upload these slices to the server and then join them back together to recreate the original DOCX file.
My attempt
I'm using axios on the client side and the busboy-based express-chunked-file-upload middleware on my Node server. As I call getFileAsync recursively, I get a raw array of bytes that I then convert to a Blob and append to FormData before posting it to the Node server. The whole thing works and I get the slice on the server. However, the chunk that gets written to disk on the server is much larger than the blob I uploaded, normally on the order of 3 times larger, so it is obviously not what I sent.
My suspicion is that this may have to do with stream encoding, but the Node middleware does not expose any options to set the encoding.
Here is the current state of the code:
Client-side
public sendActiveDocument(uploadAs: string, sliceSize: number): Promise<boolean> {
  return new Promise<boolean>((resolve) => {
    Office.context.document.getFileAsync(Office.FileType.Compressed,
      { sliceSize: sliceSize },
      async (result) => {
        if (result.status == Office.AsyncResultStatus.Succeeded) {
          // Get the File object from the result.
          const myFile = result.value;
          const state = {
            file: myFile,
            filename: uploadAs,
            counter: 0,
            sliceCount: myFile.sliceCount,
            chunkSize: sliceSize
          } as getFileState;
          console.log("Getting file of " + myFile.size + " bytes");
          const hash = makeId(12)
          // resolve only after all slices have been sent
          this.getSlice(state, hash).then(() => resolve(true))
        } else {
          resolve(false)
        }
      })
  })
}

private async getSlice(state: getFileState, fileHash: string): Promise<boolean> {
  const result = await this.getSliceAsyncPromise(state.file, state.counter)
  if (result.status == Office.AsyncResultStatus.Succeeded) {
    const data = result.value.data;
    if (data) {
      const formData = new FormData();
      formData.append("file", new Blob([data]), state.filename);
      const boundary = makeId(12);
      const start = state.counter * state.chunkSize
      const end = (state.counter + 1) * state.chunkSize
      const total = state.file.size
      return await Axios.post('/upload', formData, {
        headers: {
          "Content-Type": `multipart/form-data; boundary=${boundary}`,
          "file-chunk-id": fileHash,
          "file-chunk-size": state.chunkSize,
          "Content-Range": 'bytes ' + start + '-' + end + '/' + total,
        },
      }).then(async res => {
        if (res.status === 200) {
          state.counter++;
          if (state.counter < state.sliceCount) {
            return await this.getSlice(state, fileHash);
          } else {
            this.closeFile(state);
            return true
          }
        } else {
          return false
        }
      }).catch(err => {
        console.log(err)
        this.closeFile(state)
        return false
      })
    } else {
      return false
    }
  } else {
    console.log(result.status);
    return false
  }
}

private getSliceAsyncPromise(file: Office.File, sliceNumber: number): Promise<Office.AsyncResult<Office.Slice>> {
  return new Promise(function (resolve) {
    file.getSliceAsync(sliceNumber, result => resolve(result))
  })
}
Server-side
This code is taken verbatim from the npm package (link above), so I'm not supposed to change anything here, but it's included for reference:
makeMiddleware = () => {
  return (req, res, next) => {
    const busboy = new Busboy({ headers: req.headers });
    busboy.on('file', (fieldName, file, filename, _0, _1) => {
      if (this.fileField !== fieldName) { // Current field is not handled.
        return next();
      }
      const chunkSize = req.headers[this.chunkSizeHeader] || 500000; // Default: 500Kb.
      const chunkId = req.headers[this.chunkIdHeader] || 'unique-file-id'; // If not specified, will reuse same chunk id.
      // NOTE: Using the same chunk id for multiple file uploads in parallel will corrupt the result.
      const contentRangeHeader = req.headers['content-range'];
      let contentRange;
      const errorMessage = util.format(
        'Invalid Content-Range header: %s', contentRangeHeader
      );
      try {
        contentRange = parse(contentRangeHeader);
      } catch (err) {
        return next(new Error(errorMessage));
      }
      if (!contentRange) {
        return next(new Error(errorMessage));
      }
      const part = contentRange.start / chunkSize;
      const partFilename = util.format('%i.part', part);
      const tmpDir = util.format('/tmp/%s', chunkId);
      this._makeSureDirExists(tmpDir);
      const partPath = path.join(tmpDir, partFilename);
      const writableStream = fs.createWriteStream(partPath);
      file.pipe(writableStream);
      file.on('end', () => {
        req.filePart = part;
        if (this._isLastPart(contentRange)) {
          req.isLastPart = true;
          this._buildOriginalFile(chunkId, chunkSize, contentRange, filename).then(() => {
            next();
          }).catch(_ => {
            const errorMessage = 'Failed merging parts.';
            next(new Error(errorMessage));
          });
        } else {
          req.isLastPart = false;
          next();
        }
      });
    });
    req.pipe(busboy);
  };
}
Update
So it looks like I have found the problem, at least. busboy appears to be writing my array of bytes as text in the output file: I get 80,75,3,4,20,0,6,0,8,0,0,0,33,0,44,25 (as text) when I upload the array of bytes [80,75,3,4,20,0,6,0,8,0,0,0,33,0,44,25]. Now I need to figure out how to force it to write a binary stream.
Figured it out. Just in case it helps anyone: there was no problem with busboy, office.js, or axios. I just had to convert the incoming chunk of data to a Uint8Array before creating a Blob from it. So instead of:
formData.append("file", new Blob([data]), state.filename);
I now do this:
const blob = new Blob([ new Uint8Array(data) ])
formData.append("file", blob, state.filename);
And it worked like a charm.
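That behaviour follows from how the Blob constructor treats its parts: a plain array of numbers is not a valid BlobPart, so it gets stringified, while a typed array contributes raw bytes. A small illustration using the bytes from the example above:

const bytes = [80, 75, 3, 4]
new Blob([bytes]).size                  // 9, the text "80,75,3,4"
new Blob([new Uint8Array(bytes)]).size  // 4, four raw bytes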

Unable to upload file to S3 using Lambda written in NodeJS

I'm trying to upload a file (pdf/jpg) using a Lambda function written in NodeJS by triggering the request from Postman, but the code is not working at all. How should I pass the file: in base64 format, or using the type 'File' in the body?
Here is my defective code:
'use-strict'
const AWS = require("aws-sdk");
const logger = require('./logger').logger;
const moment = require('moment');
const fileType = require('file-type');
const { Buffer } = require('buffer');
//const { fileTypeFromFile } = 'file-type';
const ddbTable = process.env.RUNTIME_DDB_TABLE_FREE_USER_DOCUMENT;
const s3TempBucket = process.env.RUNTIME_S3_TEMP_BUCKET;
const s3 = new AWS.S3();

const getFile = (fileMime, buffer, userId) => {
  let fileExt = fileMime.ext;
  let hash = sha1(new Buffer(new Date().toString()));
  let now = moment().format('YYYY-MM-DD HH:mm:ss');
  let filePath = hash + '/';
  let fileName = unixTime(now) + '.' + fileExt;
  let fileFullName = filePath + fileName;
  let fileFullPath = s3TempBucket + userId + fileFullName;
  const params = {
    Body: buffer,
    Bucket: s3TempBucket,
    Key: fileName
  };
  let uploadFile = {
    size: buffer.toString('ascii').length,
    type: fileMime.mime,
    name: fileName,
    fullPath: fileFullPath
  }
  return {
    'params': params,
    'uploadFile': uploadFile
  }
}

exports.lambdaHandler = (event, context) => {
  logger.info("Event::", event);
  logger.info('Uploading file to bucket::', s3TempBucket);
  let body, data;
  let statusCode = 200;
  const headers = {
    'Content-Type': 'application/json',
    'Access-Control-Allow-Headers': 'Content-Type',
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': '*'
  };
  //const user = event.body.user;
  let request = event.body;
  let base64String = request.base64String;
  //let buffer = Buffer.from(JSON.stringify(base64String), 'base64');
  let buffer = new Buffer(base64String, 'base64');
  let fileMime = fileType(buffer);
  logger.info(fileMime);
  if (fileMime === null) {
    return context.fail('String supplied is not file type');
  }
  //let file = getFile(fileMime, buffer, user.id);
  let file = getFile(fileMime, buffer, 'b06eb6f4-0ff0-5cb5-a41c-e000af66c8e9');
  let params = file.params;
  try {
    //await new Promise((resolve, reject) => {
    s3.putObject(params, (err, results) => {
      if (err) reject(err);
      else {
        console.log(results);
        body = results;
        resolve(results)
      }
    });
    // });
  } catch (err) {
    logger.info(err);
    statusCode = 400;
    body = err.message;
    return err;
  } finally {
    body = JSON.stringify(data);
  }
  return {
    statusCode,
    body,
    headers
  };
}
The following error appears in CloudWatch at the line const buffer = Buffer.from(selectedFile, 'base64'):
2022-01-30T07:20:33.019Z 4c916aca-93e2-4846-84a2-d7f048f1de52 ERROR Invoke Error
{
"errorType": "TypeError",
"errorMessage": "The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received undefined",
"code": "ERR_INVALID_ARG_TYPE",
"stack": [
"TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received undefined",
" at new NodeError (internal/errors.js:322:7)",
" at Function.from (buffer.js:334:9)",
" at Runtime.exports.lambdaHandler [as handler] (/var/task/app.js:62:18)",
" at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"
]
}
From Postman, I'm trying to send the document (in base64 format) in the file attribute and the document name in the name attribute. The logged event is:
info: Event::
{
"body": "{\n \"base64String\": \"/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAcHBwcIBwgJCQgMDAsMDBEQDg4QERoSFBIUEhonGB0YGB0YJyMqIiAiKiM+MSsrMT5IPDk8SFdOTldtaG2Pj8ABBwcHBwgHCAkJCAwMCwwMERAODhARGhIUEhQSGicYHRgYHRgnIyoiICIqIz4xKysxPkg8OTxIV05OV21obY+PwP/CABEICHAPAAMBIgACEQEDEQH/xAAcAAEAAgMBAQEAAAAAAAAAAAAABwgBBQYEAwL/2gAI

Download and convert c# excel byte array to javascript blob

I am using a POST method to get a byte array of an Excel file.
C# server-side implementation:
downloadExcel() {
  ....
  FileResultDto fileResultDto = new FileResultDto
  {
    Data = ExcelHelper.CreateExcelFile(excelFile) // Data contains the byte array
  };
  return new JsonHttpResponseMessage(JsonConvert.SerializeObject(fileResultDto));
}
CreateExcelFile():
public byte[] CreateExcelFile(ExcelFile excelFile)
{
    try
    {
        #region Validation
        if (excelFile == null)
        {
            throw new ArgumentNullException(nameof(excelFile));
        }
        #endregion
        byte[] bytes;
        using (ExcelPackage excelPackage = new ExcelPackage())
        {
            for (int i = 1; i <= excelFile.Worksheets.Count; i++)
            {
                Worksheet worksheet = excelFile.Worksheets[i - 1];
                excelPackage.Workbook.Worksheets.Add(worksheet.Name);
                ExcelWorksheet currentExcelWorksheet = excelPackage.Workbook.Worksheets[i];
                if (excelFile.HasLogoTemplate)
                {
                    byte[] imageBytes = Convert.FromBase64String(LogoBase64);
                    Image image;
                    using (MemoryStream ms = new MemoryStream(imageBytes, 0, imageBytes.Length))
                    {
                        image = Image.FromStream(ms, true);
                    }
                    ExcelPicture picture = currentExcelWorksheet.Drawings.AddPicture("Logo", image);
                    picture.SetPosition(0, 4, 0, 10);
                    currentExcelWorksheet.Row(1).Height = 50;
                }
                SetColumnsWidths(currentExcelWorksheet, worksheet);
                WriteHeaderRow(currentExcelWorksheet, worksheet.HeaderRow);
                WriteCells(currentExcelWorksheet, worksheet.Cells);
            }
            #region Set Excel Stream
            bytes = excelPackage.GetAsByteArray();
            #endregion
        }
        return bytes;
    }
    catch (Exception exception)
    {
        throw new Exception("There was an error on excel export. Exception: ", exception);
    }
}
Front-end implementation:
public downloadExcel(): void {
  this.myRepository.downloadExcel(this.postData).then(result => {
    var byteArray = new Uint8Array(result.data.data);
    var a = window.document.createElement('a');
    a.href = window.URL.createObjectURL(new Blob([byteArray], { type: "application/octet-stream" }));
    a.download = "test.xlsx";
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
  }, error => {
    console.log(error);
  });
}
Apparently the created blob file is corrupted. Any suggestions as to where the problem might be?
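One likely culprit, for context: JSON.NET serializes a byte[] property as a base64 string, so result.data.data is a string rather than an array of numbers, and new Uint8Array(someBase64String) does not decode it (for a non-numeric string it just produces an empty array, hence the unusable file). If you keep the JSON response instead of changing the response type, a hedged sketch of decoding it on the client would be:

const base64 = result.data.data;  // base64 string produced by the JSON serializer (assumption)
const binary = atob(base64);      // decode base64 to a binary string
const byteArray = new Uint8Array(binary.length);
for (let i = 0; i < binary.length; i++) {
  byteArray[i] = binary.charCodeAt(i);
}
const blob = new Blob([byteArray], { type: "application/octet-stream" });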
Finally, the problem was solved by using 'arraybuffer' as the response type of the HTTP request.
let requestConfiguration: ng.IRequestConfig = {
  cache: ...,
  data: ...,
  headers: ...,
  method: ...,
  url: ...,
  responseType: 'arraybuffer'
};
let promise: ng.IPromise<any> = this.$http(requestConfiguration);
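For reference, axios (used elsewhere in this thread) takes the same option; a minimal sketch, assuming a hypothetical /download endpoint that returns the raw xlsx bytes:

const response = await axios.post("/download", postData, { responseType: "arraybuffer" });
const blob = new Blob([response.data], {
  type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
});
const a = document.createElement("a");
a.href = URL.createObjectURL(blob);
a.download = "test.xlsx";
a.click();
URL.revokeObjectURL(a.href);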
The same can be done in Angular; this might help you:
downloadExcel() {
  this.downloadAttachment(filename).subscribe(res => {
    let Res = res;
    const downloadedFile = new Blob([Res], { type: Res.type });
    const a = document.createElement('a');
    a.setAttribute('style', 'display:none;');
    document.body.appendChild(a);
    a.download = attachment.filename;
    a.href = URL.createObjectURL(downloadedFile);
    a.target = '_blank';
    a.click();
    document.body.removeChild(a);
  },
  err => {
    throw err;
  })
}

downloadAttachment(filename) {
  this.httpOptions = {
    reportProgress: true,
    responseType: "blob"
  };
  return this.http.get(API_URL, this.httpOptions).pipe(
    map(response => response),
    catchError(this.handleError<any>(isShowError)),
    finalize(() => {
    })
  );
}
C# Code
var res = DownloadAttachment(filename);
if (res == null)
    return Content("filename not present");
return File(res, Utility.GetContentType(filename), filename);

error code 2, security Error: typescript blob

I am trying to convert an image file to blob but get the error below:
{"code":2, "message":"SECURITY_ERR"}
Any idea why this might be happening? I tried a lot of online resources but really could not find out what is going on.
I have the following code:
choose() {
  this.filechooser.open().then((uri) => {
    this.file.resolveLocalFilesystemUrl(uri).then((newurl) => {
      let dirPath = newurl.nativeURL;
      alert(dirPath);
      let dirPathsegments = dirPath.split('/')
      dirPathsegments.pop();
      dirPath = dirPathsegments.join('/');
      this.file.readAsArrayBuffer(dirPath, newurl.name).then((buffer) => {
        let blob = new Blob([buffer], { type: "image/jpeg" });
        alert('blob creation success');
      }, Error => {
        alert('Blob Error ' + JSON.stringify(Error));
      });
    });
  }, Error => {
    alert('Error Choosing File ' + Error);
  });
}
Is there any other way to convert the file into a blob?
The current APIs that I am using only accept Binary or Multipart attachments.
This works; tested on Ionic 2 & 3:
uploadFile() {
  this.fileChooser.open().then((url) => {
    (<any>window).FilePath.resolveNativePath(url, (nativeFilepath) => {
      this.readFileFromStorage(nativeFilepath);
    })
  })
}

readFileFromStorage(nativeFilepath) {
  let fileName = this.getfilename(nativeFilepath);
  let fileExt = fileName.substring(fileName.lastIndexOf('.') + 1);
  let blobType = { type: 'image/' + fileExt };
  (<any>window).resolveLocalFileSystemURL(nativeFilepath, (res) => {
    res.file((resFile) => {
      var reader = new FileReader();
      reader.readAsArrayBuffer(resFile);
      reader.onloadend = (evt: any) => {
        var imgBlob = new Blob([evt.target.result], blobType);
        // Upload blob to firebase
        this.uploadToFirebase(imgBlob, fileName);
      }
    })
  })
}

getfilename(filePath) {
  let fileName: string;
  fileName = filePath.replace(/^.*[\\\/]/, '')
  return fileName;
}

uploadToFirebase(fileBlob, name) {
  let storage = firebase.storage();
  storage.ref('images/' + name).put(fileBlob).then((d) => {
    alert("Done");
  }).catch((error) => {
    alert("Error: " + JSON.stringify(error));
  });
}

Convert image path to blob react native

Problem
I am trying to create an app with react native and firebase. One of the features I would like for this app is the ability to upload images. I am having some trouble uploading the images to firebase storage though. I am using expo's image picker to find the path of the image that the user wants to upload, but once I have the path I don't know how to convert that to something I can upload to firebase.
Can somebody help me convert the path of an image to something I can upload to firebase storage with react native?
What I've tried
I tried using:
_pickImage = async () => {
  let result = await ImagePicker.launchImageLibraryAsync({
    MediaTypeOptions: 'Images',
    quality: 0.4,
    base64: true,
  });
  /* if (!result.cancelled) {
    this.setState({ image: result.uri });
    let formData = new FormData();
    formData.append('photo', {
      uri,
      name: `photo.${fileType}`,
      type: `image/${fileType}`,
    });
  } */
  this.uploadImage(result.base64);
};

_uploadAsByteArray = async (pickerResultAsByteArray, progressCallback) => {
  try {
    var metadata = {
      contentType: 'image/jpeg',
    };
    var storageRef = firebase.storage().ref();
    var ref = storageRef.child('images/' + expoID + '/' + this.state.time)
    let uploadTask = ref.put(pickerResultAsByteArray, metadata)
    uploadTask.on('state_changed', function (snapshot) {
      progressCallback && progressCallback(snapshot.bytesTransferred / snapshot.totalBytes)
      var progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
      console.log('Upload is ' + progress + '% done');
    }, function (error) {
      console.log("in _uploadAsByteArray ", error)
    }, function () {
      var downloadURL = uploadTask.snapshot.downloadURL;
      console.log("_uploadAsByteArray ", uploadTask.snapshot.downloadURL)
      this.setState({ imageUploaded: true })
    });
  } catch (ee) {
    console.log("when trying to load _uploadAsByteArray ", ee)
  }
}

convertToByteArray = (input) => {
  var binary_string = this.atob(input);
  var len = binary_string.length;
  var bytes = new Uint8Array(len);
  for (var i = 0; i < len; i++) {
    bytes[i] = binary_string.charCodeAt(i);
  }
  return bytes
}

atob = (input) => {
  const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=';
  let str = input.replace(/=+$/, '');
  let output = '';
  if (str.length % 4 == 1) {
    throw new Error("'atob' failed: The string to be decoded is not correctly encoded.");
  }
  for (let bc = 0, bs = 0, buffer, i = 0;
    buffer = str.charAt(i++);
    ~buffer && (bs = bc % 4 ? bs * 64 + buffer : buffer,
      bc++ % 4) ? output += String.fromCharCode(255 & bs >> (-2 * bc & 6)) : 0
  ) {
    buffer = chars.indexOf(buffer);
  }
  return output;
}

uploadImage(bsfdata) {
  this.setState({ imageUploaded: false })
  this._uploadAsByteArray(this.convertToByteArray(bsfdata), (progress) => {
    this.setState({ progress: progress })
  })
}
I've tried it with the commented code added, which doesn't upload anything. I've also tried it as the code is now, which gives me the error "Can currently only create a Blob from other Blobs", and the upload progress never gets above 0%.
If you are using expo (>=26), then you can do it easily with the following lines of code.
uploadImage = async (imageUri) => {
  const response = await fetch(imageUri);
  const blob = await response.blob();
  var ref = firebase.storage().ref().child("image.jpg");
  return ref.put(blob);
}
Reference: https://youtu.be/KkZckepfm2Q
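As a usage note, and assuming the Firebase v8-style API used above, ref.put() returns an upload task that can be awaited, and the download URL can then be read from its snapshot (pickerResult.uri here stands for the URI returned by the image picker):

const snapshot = await uploadImage(pickerResult.uri);
const downloadUrl = await snapshot.ref.getDownloadURL();
console.log("Uploaded to", downloadUrl);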
Refer to this link: https://github.com/dailydrip/react-native-firebase-storage/blob/master/src/App.js#L43-L69
The following block of code is working fine.
uploadImage(uri, mime = 'application/octet-stream') {
  return new Promise((resolve, reject) => {
    const uploadUri = Platform.OS === 'ios' ? uri.replace('file://', '') : uri
    let uploadBlob = null
    const imageRef = FirebaseClient.storage().ref('images').child('image_001')
    fs.readFile(uploadUri, 'base64')
      .then((data) => {
        return Blob.build(data, { type: `${mime};BASE64` })
      })
      .then((blob) => {
        uploadBlob = blob
        return imageRef.put(blob, { contentType: mime })
      })
      .then(() => {
        uploadBlob.close()
        return imageRef.getDownloadURL()
      })
      .then((url) => {
        resolve(url)
      })
      .catch((error) => {
        reject(error)
      })
  })
}
You need to install rn-fetch-blob module:
npm install --save rn-fetch-blob
Then, do the following:
import RNFetchBlob from 'rn-fetch-blob';

const Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
window.Blob = Blob;

function uploadImage(path) {
  const imageFile = RNFetchBlob.wrap(path);
  // 'path/to/image' is where you wish to put your image in
  // the database; if you would like to put it in the folder
  // 'subfolder' inside 'mainFolder' and name it 'myImage', just
  // replace it with 'mainFolder/subfolder/myImage'
  const ref = firebase.storage().ref('path/to/image');
  var uploadBlob = null;
  Blob.build(imageFile, { type: 'image/jpg;' })
    .then((imageBlob) => {
      uploadBlob = imageBlob;
      return ref.put(imageBlob, { contentType: 'image/jpg' });
    })
    .then(() => {
      uploadBlob.close();
      return ref.getDownloadURL();
    })
    .then((url) => {
      // do something with the url if you wish to
    })
    .catch(() => {
      dispatch({
        type: UPDATE_PROFILE_INFO_FAIL,
        payload: 'Unable to upload profile picture, please try again'
      });
    });
}
Please do ask if there's any part of the code that you don't understand. To upload multiple images, simply wrap this code in a for loop. Or, if you want to make sure that every image is uploaded without any error, use Promise.all (see the sketch below).
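A minimal sketch of that idea, assuming uploadImage(path) is changed to return its promise chain (the paths array is hypothetical):

function uploadImages(paths) {
  // Run all uploads in parallel; Promise.all rejects if any single upload fails.
  return Promise.all(paths.map((path) => uploadImage(path)))
    .then(() => console.log('All images uploaded'))
    .catch((error) => {
      console.log('At least one upload failed:', error);
      throw error;
    });
}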
Not sure who this might help, but if you're using MediaLibrary to load images from the gallery, the uri comes in the format uri = file:///storage/emulated/0/DCIM/Camera/filename.jpg
In this case, using fetch(uri) didn't help me get the blob.
But if you use fetch(uri.replace("file:///", "file:/")) and then follow #sriteja Sugoor's answer, you'll be able to upload the file blob.
const Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
let uploadBlob;

await fs
  .readFile(params?.file.path, 'base64')
  .then((data) => {
    return Blob.build(data, { type: `BASE64` });
  })
  .then((blob) => {
    uploadBlob = blob;
    console.log(uploadBlob, 'uploadBlob');
  });
