Upload picture to Digital Ocean Space using Expo - javascript

I'm trying to upload the image result from the takePictureAsync function to a Digital Ocean Space using Expo. Currently the upload process using signed PUT URL seems to work fine, but something is going wrong during the encoding process. Below I've included the relevant code:
const pictureResponse = await camera.current.takePictureAsync({ base64: true });
const spacesBase64 = `data:image/png;base64,${pictureResponse.base64}`;
const spacesBuffer = Buffer.from(spacesBase64, "base64");
const spacesBlob = new Blob([spacesBuffer]);
const spacesFile = new File([spacesBlob], "test.jpg", { type: "image/jpeg" });
fetch(`https://signedputurl.com`, { method: 'PUT', body: spacesFile });
When I take a picture it shows up on my Digital Ocean Space just fine. The file size also seems correct. When I try to preview the URL it doesn't render. I've tried removing the data:image/png;base64 prefix, but this doesn't fix the problem.
I've made the image result public and it can be viewed at https://disposable-dev.ams3.digitaloceanspaces.com/with_base64_prefix.jpg, in case that's helpful.

I've figured out a solution! Instead of parsing the base64 result back into a blob, I can just use a fetch request to get the blob from the cache.
const pictureResponse = await camera.current.takePictureAsync();
const spacesRespond = await fetch(pictureResponse.uri);
const spacesBlob = await spacesRespond.blob();
fetch(`https://signedputurl.com`, { method: 'PUT', body: spacesBlob });

Related

Convert Buffer to PDF

I am building a PDF using Puppeteer on the server and returning it as a Buffer to the client for download. I can't figure out why this conversion is not working for download. If I download it on the server side, everything works fine. Where am I going wrong?
onSuccess(data) {
  const blob = new Blob([data as Buffer], { type: 'application/pdf' })
  const link = document.createElement('a')
  link.href = window.URL.createObjectURL(blob)
  link.download = 'test.pdf'
  link.click()
},
This always happens, I figure it out right after posting.
The answer is nuanced. I'm using React Query to get the data. The Buffer coming back from Node and Puppeteer looks like:
{ type: 'Buffer', data: [...] }
The data to convert is on the data property, not the Buffer itself. So the correct code looks like this:
onSuccess(data) {
  const blob = new Blob([data.data as Buffer], { type: 'application/pdf' })
  const link = document.createElement('a')
  link.href = window.URL.createObjectURL(blob)
  link.download = 'test.pdf'
  link.click()
},
This felt hacky (and made TS upset), so I kept searching and decided to convert the PDF to base64 before returning it to the client, and that seems to be working as well. On the server that looks like:
const pdf = await page.pdf()
const base64 = pdf.toString('base64')
return `data:application/pdf;base64,${base64}`

Download PDF file from embed tag using Puppeteer

I am trying to download a PDF from a website.
The website is made with the framework ZK, and it reveals a dynamic URL to the PDF for a window of time when an ID number is typed into an input bar. This step is easy enough, and I am able to get the PDF URL, which opens up in the browser in an embed tag.
However, it has been impossible for me to find a way to download the file to my computer. For days, I have tried and read everything from this, to this, to this.
The closest I have been able to get is with this code:
let [ iframe ] = await page.$x('//iframe');
let pdf_url = await page.evaluate( iframe => iframe.src, iframe);
let res = await page.evaluate( async url =>
  await fetch(url, {
    method: 'GET',
    credentials: 'same-origin', // useful when we are logged into a website and want to send cookies
    responseType: 'arraybuffer', // intended to get the response as an ArrayBuffer
  }).then(response => response.text()),
  pdf_url
)
console.log('res:', res);
//const response = await page.goto(pdf);
fs.writeFileSync('somepdf.pdf', res);
This results in a blank PDF file of 92K in size, while the file I am trying to get is 52K. I suspect the back-end might be sending me a 'dummy' PDF file because the headers on my fetch request might not be correct.
What else can I try?
Here is the link to the PDF page.
You can use the random ID number I found: '1705120630'

React Native Expo - FileSystem readAsStringAsync Byte Allocation Failed (Out of Memory)

I am creating an Android app using React Native with Expo modules (FileSystem and Expo AV) to record a local video using the phone's camera; I then send the base64-encoded video to the server.
The code to send the base64 string looks like this:
const encodeBase64 = async (fileUri) => {
  const options = {
    encoding: FileSystem.EncodingType.Base64,
  };
  let result = await FileSystem.readAsStringAsync(fileUri, options);
  return result;
};
const upload = async () => {
  const base64 = await encodeBase64(videoUri);
  const result = await myAPI(base64);
}
It works on my phone (Oppo A3s), but on another phone, like a Samsung A51, it gives a memory allocation error like this:
How to solve this problem?
This is a memory error: the amount of memory available differs from phone to phone, and reading an entire video into a single base64 string can exceed it.
You can use a chunked buffer instead: split your base64 data into chunks, post them to the server, and combine them there.
For example: the client sends chunks of roughly 1024 * 100 bytes, and the server combines the array of the client's chunks.
Good luck!

Storing and retrieving a base64 encoded string in Firebase storage

I have a Base64 encoded string (this is AES encrypted string).
I am trying to store it in Firebase Storage and then download it again.
I have tried multiple options, e.g.
pathReference.putString(data, 'base64')
This does not retain the base64 string in storage but converts it into integers. I have also tried providing a {contentType: "application/Base64"}, but putString still doesn't seem to work.
I then tried making it a blob
blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
With this I am able to get the base64 encoded string in storage (though there are newlines added in the string).
When I download it with ES6 fetch I am not getting back the string
const url = await pathReference.getDownloadURL()
const response = await fetch(url)
const data = await response.blob()
Instead getting an error Unhandled promise rejection: URIError: URI error
I am just looking for a very simple upload and download sample for base64 encoded string to firebase storage.
Any help is greatly appreciated.
I was able to make it work, though some firebase / fetch with react-native behavior is still unclear.
To upload a base64 encoded string to firebase storage I used the following snippet.
Here "data" is already a Base64 encoded string.
const pathReference = storage.ref(myFirebaseStorageLocation)
const blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
I verified the contents in Firebase storage and downloaded the file manually which also looked fine.
Then, to download it in a React Native / Expo project, there were several roadblocks, but what finally worked was this:
I had to add a btoa() function to the global namespace.
I then used the following code to download the file and read it back as a Base64 string (which was surprisingly hard to get to).
const fetchAsBlob = url => fetch(url)
  .then(response => response.blob());

const convertBlobToBase64 = blob => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.onerror = reject;
  reader.onload = () => {
    resolve(reader.result);
  };
  reader.readAsDataURL(blob);
});

const url = await pathReference.getDownloadURL()
const blob = await fetchAsBlob(url)
const doubleBase64EncodedFile = await convertBlobToBase64(blob)
const doubleEncodedBase64String = doubleBase64EncodedFile.split(',')[1]
const myBase64 = Base64.atob(doubleEncodedBase64String)
The caveat was that the FileReader reads the content and encodes it again into Base64 (so there is double encoding). I had to use the Base64.atob() to get back my original Base64 encoded string.
Again this may be unique to the situation where there is fetch being called under a React Native Expo project, both of which have some additional quirks when it comes to handling blobs or Base64.
(PS: I tried using response.blob() and response.buffer(), and tried everything including libs to convert Blobs to Base64 strings, but ran into one issue or another. I also tried using Expo FileSystem to download the file locally and read it using FileSystem.readAsStringAsync, but that ran into native issues with iOS. tl;dr: the above solution worked, but if someone can provide an explanation or clarity on the other attempts, or a better solution, it would be greatly appreciated.
Also unclear is why Firebase Storage putString(data, 'base64') does not work.)

POST cutting off PDF data

I posted a question yesterday (linked here) where I had been trying to send a PDF to a database and then retrieve it at a later date. Since then I have been advised that it is best (in my case, as I cannot use cloud computing services) to upload the PDF files to local storage and save the URL of the file to the database instead. I have now begun implementing this, but I have come across some trouble.
I am currently using FileReader() as documented below to process the input file and send it to the server:
var input_file = "";
let reader = new FileReader();
reader.readAsText(document.getElementById("input_attachment").files[0]);
reader.onloadend = function () {
  input_file = "&file=" + reader.result;
  const body = /*all the rest of my data*/ + input_file;
  const method = {
    method: "POST",
    body: body,
    headers: {
      "Content-type": "application/x-www-form-urlencoded"
    }
  };
};
After this block of code I do the stock-standard fetch(), and a route on my server receives it. Almost all the data comes in 100% as expected, but the file comes in cut off somewhere around 1300 characters in (making it quite an incomplete PDF). What does appear to come in matches the first 1300 characters of the original PDF I uploaded.
I have seen suggestions that you are meant to use "multipart/form-data" content-type to upload files, but when I do this I seem to only then receive the first 700 characters or so of my PDF. I have tried using the middleware Multer to handle the "multipart/form-data" but it just doesn't seem to upload anything (though I can't guarantee that I am using it correctly).
I also initially had trouble with a "payload too large" error from fetch, but have currently resolved it through this method:
app.use(bodyParser.urlencoded({ limit: "50mb", extended: false, parameterLimit: 50000 }));
Though I have suspicions that this may not be correctly implemented as I have seen some discussion that the urlencoded limit is set prior to the file loading, and cannot be changed in the middle of the program.
Any and all help is greatly appreciated, and I will likely use any information here to construct an answer on my original question from yesterday so that anybody else facing these sort of issues have a resource to go to.
I personally found the solution to this problem as follows. Below is an example of what was implemented on the client side of my application.
formData = new FormData();
formData.append("username", "John Smith");
formData.append("fileToUpload", document.getElementById("input_attachment").files[0]);
const method = {
  method: "POST",
  body: formData
};
fetch(url, method)
  .then(res => res.json())
  .then(res => alert("File uploaded!"))
  .catch(err => alert(err.message))
As can be noted, I have changed from using "application/x-www-form-urlencoded" encoding to "multipart/form-data" to upload files. Node.js and Express, however, do not natively support this encoding type. I chose to use the library Formidable (I found it the easiest to use without too much overhead), which can be investigated here. Below is an example of my server-side implementation of this middleware (Formidable).
const express = require('express');
const app = express();
const formidable = require('formidable');

app.post('/upload', (req, res) => {
  const form = formidable({ uploadDir: `${__dirname}/file/`, keepExtensions: true });
  form.parse(req, (err, fields, files) => {
    if (err) console.log(err.stack);
    else console.log(fields.username);
  });
});
The file(s) are automatically uploaded to the directory specified in uploadDir, and the keepExtensions ensures that the file extension is saved as well. The non-file inputs are accessible through the fields object as seen through the fields.username example above.
From what I have found, this is the easiest method to take to setup an easy file upload system.
