Storing and retrieving a base64 encoded string in Firebase storage - javascript

I have a Base64 encoded string (this is AES encrypted string).
I am trying to store it in Firebase Storage and then download it from it.
I have tried multiple options, e.g.
pathReference.putString(data, 'base64')
This does not retain the base64 string in storage but converts it into integers. I have also tried providing a {contentType: "application/Base64"}, but putString doesn't seem to work.
I then tried making it a blob
blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
With this I am able to get the base64 encoded string in storage (though there are newlines added in the string).
When I download it with ES6 fetch, I am not getting back the string:
const url = await pathReference.getDownloadURL()
const response = await fetch(url)
const data = await response.blob()
Instead I get an error: Unhandled promise rejection: URIError: URI error
I am just looking for a very simple upload and download sample for base64 encoded string to firebase storage.
Any help is greatly appreciated.

I was able to make it work, though some Firebase/fetch behavior under React Native is still unclear.
To upload a base64 encoded string to firebase storage I used the following snippet.
Here "data" is already a Base64 encoded string.
const pathReference = storage.ref(myFirebaseStorageLocation)
const blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
I verified the contents in Firebase storage and downloaded the file manually which also looked fine.
Then, to download it under a React Native Expo project, there were several roadblocks, but what finally worked is below.
I had to add a btoa() function to the global namespace.
I used the following code to download the file and then read it back as a Base64 string (which was surprisingly hard to get to):
const fetchAsBlob = url => fetch(url)
  .then(response => response.blob());

const convertBlobToBase64 = blob => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.onerror = reject;
  reader.onload = () => {
    resolve(reader.result);
  };
  reader.readAsDataURL(blob);
});
const url = await pathReference.getDownloadURL()
const blob = await fetchAsBlob(url)
const doubleBase64EncodedFile = await convertBlobToBase64(blob)
const doubleEncodedBase64String = doubleBase64EncodedFile.split(',')[1]
const myBase64 = Base64.atob(doubleEncodedBase64String)
The caveat was that the FileReader reads the content and encodes it again into Base64 (so there is double encoding). I had to use Base64.atob() to get back my original Base64 encoded string.
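The double encoding can be illustrated outside React Native; in this small Node sketch, Buffer and the global atob stand in for FileReader and Base64.atob (the data-URL prefix is just an example):

```javascript
// The payload itself is already a base64 string (here, base64 of "hello").
const originalBase64 = Buffer.from('hello').toString('base64'); // "aGVsbG8="

// FileReader.readAsDataURL base64-encodes the blob's bytes *again*,
// yielding a data URL whose payload is base64-of-base64.
const dataUrl = 'data:application/Base64;base64,' +
  Buffer.from(originalBase64).toString('base64');

// Splitting off the data-URL prefix and decoding once restores the original.
const doubleEncoded = dataUrl.split(',')[1];
const recovered = atob(doubleEncoded);
console.log(recovered === originalBase64); // true
```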
Again this may be unique to the situation where there is fetch being called under a React Native Expo project, both of which have some additional quirks when it comes to handling blobs or Base64.
(PS: I tried response.blob(), response.buffer(), and various libraries for converting Blobs to Base64 strings, but each ran into one issue or another. I also tried Expo FileSystem, downloading the file locally and reading it with FileSystem.readAsStringAsync, but that hit native issues on iOS. tl;dr: the above solution worked, but an explanation of the other failed attempts, or a better solution, would be greatly appreciated.
Also unclear is why Firebase Storage's putString(data, 'base64') does not work.)


Firebase Storage TypeError: file.getBlob is not a function

In my app I would like to download files (.pdf and .xls) from Firebase storage.
I can generate a download link using ref.getDownloadURL(), but that link causes the browser to open a new tab before downloading or opening the target file.
In order to avoid this behavior, I can download the file in JavaScript using res = await fetch([the download URL]) and then await res.blob().
This works fine:
const res = await fetch([downloadURL]);
const blob = await res.blob();
const url = URL.createObjectURL(blob);
const anchor = document.createElement("a");
anchor.href = url;
anchor.download = [file_name]; // e.g. my_file.xls
anchor.click();
However, the documentation suggests that I can download the blob directly via the SDK using getBlob()
I have tried to download a list of the files in my storage bucket and to loop over these to get the blob as follows:
const storageRef = storage.ref().child("my_files");
const list = await storageRef.listAll();
const blob_array = await Promise.all(
  list.items.map(async (file) => {
    const blob = await file.getBlob();
    return {
      name: file.name,
      blob: blob,
    };
  })
);
However, I get an error:
TypeError: file.getBlob is not a function
The documentation for getBlob() states:
To use this functionality, you have to whitelist your app's origin in
your Cloud Storage bucket. See also
https://cloud.google.com/storage/docs/configuring-cors
In order to test this, I have followed the referenced documentation and, using gsutil, I have set the CORS policy for my storage bucket as follows:
[
  {
    "origin": ["*"],
    "method": ["GET"],
    "maxAgeSeconds": 3600
  }
]
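For reference, a policy like the one above is applied with gsutil; cors.json and my_bucket are placeholder names here:

```shell
# Save the JSON policy as cors.json, then apply it to the bucket:
gsutil cors set cors.json gs://my_bucket

# Confirm the active CORS configuration:
gsutil cors get gs://my_bucket
```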
I thought that this would have the effect of 'whitelisting' my app's origin (per the getBlob() documentation), but instead I still get the same error:
TypeError: file.getBlob is not a function
How can I enable this getBlob() functionality?
The problem is exactly what the error states - you're trying to call getBlob on an object that doesn't have that method.
Firstly, take a look at the documentation for downloading files. It states that since SDK version 9.5 there is a global getBlob function. You're not using a new enough version of the SDK to get that function. You're using the old version 8 API, where Reference objects don't have a getBlob method (scan the v8 API documentation for Reference and you just won't see it).
If you want to use getBlob, you'll have to upgrade to the v9 SDK and rewrite all your Firebase code to conform to the new APIs. Then you will be able to use the new getBlob.
I think the problem is that you are trying to call a method on the StorageReference, when in fact you need to use the global getBlob function, with the StorageReference as the first argument in the function call.
Look at this example:
import { getBlob } from "firebase/storage"
import type { StorageReference } from "firebase/storage"

export const downloadFile = (fromReference: StorageReference) => {
  getBlob(fromReference)
    .then((blob) => {
      console.log('blob: ', blob)
    })
    .catch((error) => {
      console.log('error downloading file: ', error)
    })
}

How to upload an image of File type to Firebase Storage from Node.js with the Admin SDK

I have Angular running on the FrontEnd and Firebase Admin SDK for Node.js on the BackEnd.
What I want to achieve is to allow the user to select an image from their computer, using a simple <input> of type file. When I receive the user's image (which is of type File) on the Angular side, I want to send it to my Node.js server and have it upload the image to Firebase Storage.
Here's how I'm sending the image to Node.js:
method(imageInput): void {
  const image: File = imageInput.files[0];
  const reader = new FileReader();
  reader.addEventListener('load', (event: any) => {
    const imageData = {
      source: event.target.result,
      file: image
    };
    this.myService.uploadImage(imageData.file).subscribe(
      (res) => {
        // image sent successfully
      },
      (err) => {
        // error
      });
  });
  reader.readAsDataURL(image);
}
So on the Node.js side I don't see a way to upload this image.
I'm trying:
admin.storage().bucket().upload(imageFromAngular, { // --> Here's the problem
  destination: "someDestination/",
  contentType: "image/png",
  metadata: {
    contentType: "image/png"
  }
}).then(() => {
  // send successful response
}).catch(err => {
  // send error response
});
The issue is that the upload method only takes a path to the image plus options as its parameters. In this case I can't pass a path to the image; I can only pass the image itself. I read this - https://googleapis.dev/nodejs/storage/latest/ - but I couldn't find anything that suits my needs.
What would be the correct way to do this ?
Update:
Here's a more detailed explanation of the approach I took:
I call the arrayBuffer method of the image File inside my method. It returns a promise that resolves to an ArrayBuffer, whose value I send to my server.
The server uses Buffer.from(ArrayBuffer, 'base64') to convert the data, and then I can safely use the save API (https://googleapis.dev/nodejs/storage/latest/File.html#save).
To get the image back later I use download (https://googleapis.dev/nodejs/storage/latest/File.html#download).
You can write a byte stream (or a Buffer) to Cloud Storage.
createWriteStream() API for streaming data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream
save() API for writing buffered data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#save
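A minimal sketch of the save() route, assuming firebase-admin is initialized elsewhere; toNodeBuffer is a hypothetical helper name, and the save() call is shown as a comment since it needs live credentials:

```javascript
// Convert the ArrayBuffer received from the client into a Node Buffer.
function toNodeBuffer(arrayBuffer) {
  return Buffer.from(new Uint8Array(arrayBuffer));
}

// Example round trip with a few PNG-signature bytes.
const bytes = Uint8Array.from([0x89, 0x50, 0x4e, 0x47]);
const buffer = toNodeBuffer(bytes.buffer);
console.log(buffer.length); // 4

// With an initialized Admin SDK, the buffer can then be persisted:
// const file = admin.storage().bucket().file('someDestination/image.png');
// await file.save(buffer, { contentType: 'image/png' });
```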

Upload file from external url to S3

I'm trying to upload an image to S3 from an external url.
Currently, I can successfully upload a file, but after I download and open it I see that it is corrupted.
Here is the code I'm using (got is just what I use to fetch the resource):
const got = require('got');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  accessKeyId: AWS_ACCESS_KEY_ID,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
});

const response = await got('https://example.com/image.jpg');
const uploadedFile = await s3
  .upload({
    Bucket: 'my_bucket',
    Key: 'images/',
    Body: response.body,
    ContentType: 'image/jpeg',
  })
  .promise();
I tried to create a buffer and use putObject instead of upload, but I end up with files that are only a few bytes on S3 instead.
The request to get the object is converting it to a string. Pretty much whatever encoding you pick to do that will corrupt it, since a JPG is binary data not meant to be represented with a string's encoding.
The documentation for the got library states:
encoding
Type: string
Default: 'utf8'
Encoding to be used on setEncoding of the response data.
To get a Buffer, you need to set responseType to buffer instead. Don't set this option to null.
In other words, if you change your download to:
const response = await got('https://example.com/image.jpg', {'responseType': 'buffer'});
You'll get and upload a Buffer object without changing it by encoding it as a string.
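The corruption is easy to reproduce in Node; the JPEG signature bytes below are not valid UTF-8, so a string round trip mangles them:

```javascript
// JPEG files start with 0xFF 0xD8; none of these bytes form valid UTF-8.
const original = Buffer.from([0xff, 0xd8, 0xff, 0xe0]);

// Decoding as UTF-8 replaces invalid sequences with U+FFFD, and re-encoding
// writes those replacement characters out as new (different) bytes.
const corrupted = Buffer.from(original.toString('utf8'), 'utf8');

console.log(original.equals(corrupted)); // false
console.log(corrupted.length); // longer than the original 4 bytes
```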
Your key is wrong when you are uploading the file to S3:
Key: 'images/'
It cannot be images/ because that would upload the image to an object that represents a folder. While that might work with the local file system on your Windows/Mac laptop, it doesn't work with object stores. It needs to be a key that represents a file, for example:
Key: 'images/image.jpg'
Doing it through streams as mentioned by Ermiya Eskandary seems to work:
const response = got.stream('https://example.com/image.jpg');

Use pdf-table-extractor directly with axios?

This script runs on a server using Node.js.
I want to use pdf-table-extractor with a remote file fetched directly via axios - can that be done?
Here is what I have tried:
const axios = require('axios')
const pdf_table_extractor = require("pdf-table-extractor")
const getPDF = await axios.get(`domain/a.pdf`,{responseType: 'arraybuffer'})
pdf_table_extractor(new Uint8Array(getPDF.data))
The error says: The argument 'path' must be a string or Uint8Array without null bytes. Received Uint8Array(118456)
pdf-table-extractor expects a file path, and you are passing it a typed array of the PDF's contents. It can't work this way.
There are many options; one of them is to save the data from getPDF.data to disk using writeFile, and then provide the path of the saved file to pdf_table_extractor.

React Native Expo - FileSystem readAsStringAsync Byte Allocation Failed (Out of Memory)

I am creating an Android app using React Native with Expo modules (FileSystem and Expo AV) to record a local video using the phone's camera, and then I send the base64-encoded video to the server.
The code to send the base64 string looks like this:
const encodeBase64 = async (fileUri) => {
  const options = {
    encoding: FileSystem.EncodingType.Base64,
  };
  let result = await FileSystem.readAsStringAsync(fileUri, options);
  return result;
};

const upload = async () => {
  const base64 = await encodeBase64(videoUri);
  const result = await myAPI(base64);
}
It works on my phone (Oppo A3s), but on another phone such as a Samsung A51 it fails with a memory-allocation error.
How can I solve this problem?
This is an out-of-memory error. Reading the whole video into a single base64 string can exceed the memory available, which differs from phone to phone.
You can use a chunk buffer instead: split your base64 data on the client, post the pieces to the server, and combine them there.
For example: the client sends chunks of roughly 1024 * 100 bytes each, and the server concatenates the array of the client's chunks.
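A minimal sketch of the chunking idea; chunkString is a hypothetical helper name, and the actual upload calls are omitted:

```javascript
// Split a base64 string into fixed-size pieces for piecewise upload.
function chunkString(str, size) {
  const chunks = [];
  for (let i = 0; i < str.length; i += size) {
    chunks.push(str.slice(i, i + size));
  }
  return chunks;
}

const base64 = 'QUJDREVGRw=='; // base64 of "ABCDEFG"
const chunks = chunkString(base64, 5);
console.log(chunks); // [ 'QUJDR', 'EVGRw', '==' ]

// The server concatenates the chunks in order to recover the original string.
console.log(chunks.join('') === base64); // true
```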
