React Native Expo - FileSystem readAsStringAsync Byte Allocation Failed (Out of Memory) - javascript

I am creating an Android app using React Native with Expo modules (FileSystem and Expo AV) to record a local video with the phone's camera, then send the base64-encoded video to the server.
The code to send the base64 string looks like this:
const encodeBase64 = async (fileUri) => {
  const options = {
    encoding: FileSystem.EncodingType.Base64,
  };
  const result = await FileSystem.readAsStringAsync(fileUri, options);
  return result;
};

const upload = async () => {
  const base64 = await encodeBase64(videoUri);
  const result = await myAPI(base64);
};
It works on my phone (an Oppo A3s), but on other phones such as a Samsung A51 it fails with a memory allocation error like this:
How to solve this problem?

This is an out-of-memory error: reading the whole video into a single base64 string can exceed the memory available on the device, and available memory differs from phone to phone.
You can use a chunk buffer instead.
In this case, split your base64 data on the client, post the chunks to the server, and combine the data on the server.
ex: client => chunk buffer => 1024*100 (bytes per chunk)
server => combine (array of the client's chunks)
Good luck.
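A minimal sketch of the chunked approach described above. Only the splitting and joining logic is shown; the upload call and its endpoint are hypothetical placeholders. (Expo's FileSystem.readAsStringAsync also accepts position and length reading options with base64 encoding, which may let you read the file itself in slices rather than all at once.)

```javascript
const CHUNK_SIZE = 1024 * 100; // 100 KB per chunk, as in the example above

// Client side: split the base64 string into fixed-size chunks.
function splitIntoChunks(base64, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let i = 0; i < base64.length; i += chunkSize) {
    chunks.push(base64.slice(i, i + chunkSize));
  }
  return chunks;
}

// Server side: reassemble the chunks in arrival order.
function combineChunks(chunks) {
  return chunks.join('');
}

// Client upload loop (uploadChunk and its API are hypothetical):
// const chunks = splitIntoChunks(base64);
// for (const [index, chunk] of chunks.entries()) {
//   await uploadChunk({ index, total: chunks.length, chunk });
// }
```

Sending each chunk with its index lets the server reorder or retry chunks independently before joining them.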

Related

Upload picture to Digital Ocean Space using Expo

I'm trying to upload the image result from the takePictureAsync function to a Digital Ocean Space using Expo. Currently the upload process using signed PUT URL seems to work fine, but something is going wrong during the encoding process. Below I've included the relevant code:
const pictureResponse = await camera.current.takePictureAsync({ base64: true });
const spacesBase64 = `data:image/png;base64,${pictureResponse.base64}`;
const spacesBuffer = Buffer.from(spacesBase64, "base64");
const spacesBlob = new Blob([spacesBuffer]);
const spacesFile = new File([spacesBlob], "test.jpg", { type: "image/jpeg" });
fetch(`https://signedputurl.com`, { method: 'PUT', body: spacesFile });
When I take a picture it shows up on my Digital Ocean Space just fine. The file size also seems correct. When I try to preview the URL it doesn't render. I've tried removing the data:image/png;base64 prefix, but this doesn't fix the problem.
I've made the image result public and it can be viewed at https://disposable-dev.ams3.digitaloceanspaces.com/with_base64_prefix.jpg, I figured it might be helpful.
I've figured out a solution! Instead of parsing the base64 result back into a blob, I can just use a fetch request to get the blob from cache.
const pictureResponse = await camera.current.takePictureAsync();
const spacesRespond = await fetch(pictureResponse.uri);
const spacesBlob = await spacesRespond.blob();
fetch(`https://signedputurl.com`, { method: 'PUT', body: spacesBlob });
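For the curious, the likely cause of the corruption in the original code: `spacesBase64` starts with the `data:image/png;base64,` prefix, which is not valid base64, so `Buffer.from(..., "base64")` mangles the decode rather than failing loudly. A quick Node.js illustration:

```javascript
const raw = Buffer.from('hello world');    // stand-in for the image bytes
const base64 = raw.toString('base64');     // what takePictureAsync({ base64: true }) returns
const withPrefix = `data:image/png;base64,${base64}`;

const goodDecode = Buffer.from(base64, 'base64');
const badDecode = Buffer.from(withPrefix, 'base64');

console.log(goodDecode.equals(raw)); // true
console.log(badDecode.equals(raw));  // false: the prefix characters corrupt the decode
```

The accepted fix sidesteps base64 entirely by fetching the cached file as a blob, which is why it works.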

How can I write a Buffer directly to websocket-stream in Node.js without converting to string?

This one is a bit dense. I'm building a web-socket based FUSE filesystem in Node.js (v14.14.0) using the fuse-native package.
To transfer the file data between the client and the server, I'm using the websocket-stream package to stream the binary data back and forth.
This works fine when transferring a file from the server to the client, but when trying to transfer a file from the client to the server, I'm running into a problem.
fuse-native passes around Buffer instances with binary segments of the file being transferred. I'm trying to write the Buffer to a websocket-stream stream and receive it on the server, where it will be streamed to a temporary file.
Here's how this happens. On the client-side, the following method is called:
write(buffer) {
  console.log('write buffer pre slice', buffer)
  const write_stream = WebSocketStream(`ws://localhost:5746/?socket_uuid=${this.socket_uuid}&node_uuid=${this.node_uuid}&length=${this.length}&position=${this.position}&writing_file=true`, {
    perMessageDeflate: false,
    binary: true,
  })
  console.log(write_stream)
  console.log('writing buffer', buffer.toString(), buffer)
  write_stream.push(buffer)
  write_stream.push(null)
}
According to the Node.js docs, I should be able to pass the Buffer directly to the stream. However, on the server, no data is ever received. Here's how the server is receiving:
async on_file_write_stream(stream, request) {
  let { socket_uuid, node_uuid, length = 4096, position = 0 } = url.parse(request.url, true).query
  if ( typeof position === 'string' ) position = parseInt(position)
  if ( typeof length === 'string' ) length = parseInt(length)

  const socket = this.sockets.find(x => x.uuid === socket_uuid)
  if ( !socket.session.temp_write_files ) socket.session.temp_write_files = {}

  const placeholder = socket.session.temp_write_files?.[node_uuid] || await tmp.file()
  socket.session.temp_write_files[node_uuid] = placeholder

  console.log('Upload placeholder:', placeholder)
  console.log('write stream', stream)
  console.log('write data', { placeholder, position, length })

  stream.pipe(fs.createWriteStream(placeholder.path, { flags: 'r+', start: position }))
}
Once the client-side code finishes, the temporary file is still completely empty. No data is ever written.
The strange part is that when I cast the buffer to a string before writing it to the stream (on the client side), all works as expected:
write(buffer) {
  console.log('write buffer pre slice', buffer)
  const write_stream = WebSocketStream(`ws://localhost:5746/?socket_uuid=${this.socket_uuid}&node_uuid=${this.node_uuid}&length=${this.length}&position=${this.position}&writing_file=true`, {
    perMessageDeflate: false,
    binary: true,
  })
  console.log(write_stream)
  console.log('writing buffer', buffer.toString(), buffer)
  write_stream.push(buffer.toString())
  write_stream.push(null)
}
This works fine for text-based files, but binary files become mangled when transferred this way. I suspect it's because of the string cast before transfer.
How can I send the buffer data along the stream without casting it to a string first? I'm not sure why the server-side stream isn't receiving data when I write the Buffer directly.
Thanks in advance.
For the curious, here is the full server-side file and the client-side file.
Kind of a hack, but I worked around the problem by base64 encoding the Buffer on the client-side and decoding it on the server side.
Thanks to user Bojoer for pointing me in the direction of the combined-stream library, which I used to pipe the Buffer to the stream:
const combined_stream = CombinedStream.create()
combined_stream.append(buffer.toString('base64'))
combined_stream.pipe(write_stream)
Then, decode it on the server-side:
const encoded_buffer = await this._bufferStream(stream)
const decoded_buffer = Buffer.from(encoded_buffer.toString(), 'base64')
console.log({encoded_buffer, decoded_buffer})
const combined_stream = CombinedStream.create()
combined_stream.append(decoded_buffer)
combined_stream.pipe(fs.createWriteStream(placeholder.path, { flags: 'r+', start: position }))
This is sub-optimal, though: converting to and from base64 takes processing time and imposes a bandwidth penalty. I'm still curious why I can't write a binary Buffer to the websocket stream directly. Maybe it's a limitation of the library.

Storing and retrieving a base64 encoded string in Firebase storage

I have a Base64 encoded string (this is AES encrypted string).
I am trying to store it in Firebase Storage and then download it from it.
I have tried multiple options e.g
pathReference.putString(data, 'base64')
This does not retain the base64 string in storage but converts it into integers. I have also tried providing {contentType: "application/Base64"}, but putString doesn't seem to work.
I then tried making it a blob
blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
With this I am able to get the base64 encoded string into storage (though newlines are added to the string).
When I download it with ES6 fetch, I do not get the string back:
const url = await pathReference.getDownloadURL()
const response = await fetch(url)
const data = await response.blob()
Instead getting an error Unhandled promise rejection: URIError: URI error
I am just looking for a very simple upload and download sample for base64 encoded string to firebase storage.
Any help is greatly appreciated.
I was able to make it work, though some firebase / fetch with react-native behavior is still unclear.
To upload a base64 encoded string to firebase storage I used the following snippet.
Here "data" is already a Base64 encoded string.
const pathReference = storage.ref(myFirebaseStorageLocation)
const blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
I verified the contents in Firebase storage and downloaded the file manually which also looked fine.
Then to download under a React Native, Expo project there were several roadblocks but what finally worked was this
I had to add a btoa() function in global namespace.
Used the following code to download and then read it back as a Base64 string (which was surprisingly hard to get to)
Code to download the file and read back as Base64 string.
const fetchAsBlob = url => fetch(url)
  .then(response => response.blob());

const convertBlobToBase64 = blob => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.onerror = reject;
  reader.onload = () => {
    resolve(reader.result);
  };
  reader.readAsDataURL(blob);
});
const url = await pathReference.getDownloadURL()
const blob = await fetchAsBlob(url)
const doubleBase64EncodedFile = await convertBlobToBase64(blob)
const doubleEncodedBase64String = doubleBase64EncodedFile.split(',')[1]
const myBase64 = Base64.atob(doubleEncodedBase64String)
The caveat was that FileReader reads the content and encodes it into Base64 again (so the string is double-encoded). I had to use Base64.atob() to get back my original Base64 encoded string.
Again this may be unique to the situation where there is fetch being called under a React Native Expo project, both of which have some additional quirks when it comes to handling blobs or Base64.
(PS: I tried using response.blob() and response.buffer(), and tried everything including libraries that convert Blobs to Base64 strings, but ran into one issue or another. I also tried using Expo FileSystem to download the file locally and read it with FileSystem.readAsStringAsync, but that ran into native issues on iOS. tl;dr: the above solution worked, but if someone can explain the other attempts, or offer a better solution, it would be greatly appreciated.
Also unclear is why Firebase Storage putString(data, 'base64') does not work.)
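A Node.js simulation of where the double encoding comes from (an assumption, but it matches the behavior described): the blob stored in Firebase holds the base64 *text*, and FileReader.readAsDataURL then base64-encodes those text bytes a second time, so exactly one atob() is needed to recover the original.

```javascript
// The uploaded content: already a base64 string.
const original = Buffer.from('secret payload').toString('base64');

// What ends up in storage: the UTF-8 text bytes of that base64 string.
const storedBytes = Buffer.from(original, 'utf8');

// What FileReader.readAsDataURL effectively yields: a data URL whose payload
// is base64 of those bytes, i.e. base64-of-base64.
const dataUrl = 'data:application/octet-stream;base64,' + storedBytes.toString('base64');

// The recovery steps from the answer:
const doubleEncoded = dataUrl.split(',')[1];
const recovered = Buffer.from(doubleEncoded, 'base64').toString('utf8'); // plays the role of Base64.atob()

console.log(recovered === original); // true: one decode undoes the FileReader layer
```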

Have array from Puppeteer/Cheerios program. Have ionic phone app design. How to move the array to the angular code?

This is my first phone app. I am using Ionic for the cross-platform work, which uses Angular as I'm sure you know. I have a separate program which scrapes a webpage using Puppeteer and Cheerio and creates an array of values from the page. This works.
I'm not sure how to get the array in my web scraping program read by my Ionic/Angular program.
I have a basic Ionic setup and am just trying the most basic activity of seeing the array from the Ionic/Angular side. After trying to put it in several places, I realized I really don't know where it belongs: do I import the code that returns the array, put the web scraper code directly in one of the .ts files, or something else?
This is my web scraping program:
const puppeteer = require('puppeteer'); // live webscraping

let scrape = async () => {
  const browser = await puppeteer.launch({
    headless: true
  });
  const page = await browser.newPage();
  await page.goto('--page url here --'); // link to page

  const result = await page.evaluate(() => {
    let data = []; // Create an empty array that will store our data
    let elements = document.querySelectorAll('.list-myinfo-block'); // Select all products
    let photo_elements = document.getElementsByTagName('img');
    var photo_count = 0;
    for (var element of elements) { // Loop through each product, getting photos
      let picture_link = photo_elements[photo_count].src;
      let name = element.childNodes[1].innerText;
      let itype = element.childNodes[9].innerText;
      data.push({
        picture_link,
        name,
        itype
      }); // Push an object with the data onto our array
      photo_count = photo_count + 1;
    }
    return data;
  });

  await browser.close();
  return result; // Return the data
};

scrape().then((value) => {
  console.log(value); // Success!
});
When I run the web scraping program I see the array with the correct values in it. The problem is getting it into the Ionic part. Sometimes the Ionic phone page shows up with nothing in it; sometimes it says it cannot find "/". I've tried so many different places and looked all over the web, so I have quite a collection of errors. I know I'm putting it in the wrong places, or maybe not everywhere I should. Thank you!
You need a server which will run the scraper on demand.
Any scraper that uses a real browser (i.e. Chromium) has to run on an OS that supports it. There is no other way.
Think about it:
Does your mobile support Chromium and Node.js? It does not. There is no Chromium build for mobile that supports automation with Node.js (yet).
Can you run a browser inside another browser? You cannot.
Way 1: Remote wsEndpoint
There are some services that offer a wsEndpoint, but I will not mention them here. Instead, I will describe how you can create your own wsEndpoint and use it.
Run browser and Get wsEndpoint
The following code will launch a Puppeteer instance whenever you connect to it. You have to run it on a server.

const http = require('http');
const httpProxy = require('http-proxy');
const puppeteer = require('puppeteer');

const proxy = new httpProxy.createProxyServer();

http
  .createServer()
  .on('upgrade', async (req, socket, head) => {
    const browser = await puppeteer.launch();
    const target = browser.wsEndpoint();
    proxy.ws(req, socket, head, { target })
  })
  .listen(8080);
When you run this on the server/terminal, you can use the ip of the server to connect. In my case it's ws://127.0.0.1:8080.
Use puppeteer-web
Now you will need to install puppeteer-web on your mobile/web app. To bundle Puppeteer using Browserify, follow the instructions below.
Clone Puppeteer repository:
git clone https://github.com/GoogleChrome/puppeteer && cd puppeteer
npm install
npm run bundle
This will create ./utils/browser/puppeteer-web.js file that contains Puppeteer bundle.
You can use it later on in your web page to drive another browser instance through its WS Endpoint:
<script src='./puppeteer-web.js'></script>
<script>
  const puppeteer = require('puppeteer');
  (async () => {
    const browser = await puppeteer.connect({
      browserWSEndpoint: '<another-browser-ws-endpoint>'
    });
    // ... drive automation ...
  })();
</script>
Way 2: Use an API
I will use express for a minimal setup. Consider your scrape function is exported to a file called scrape.js and you have the following index.js file.
const express = require('express')
const scrape = require('./scrape')
const app = express()

app.get('/', function (req, res) {
  scrape().then(data => res.send({ data }))
})

app.listen(8080)
This will launch an Express API on port 8080.
Now if you run it with node index.js on a server, you can call it from any mobile/web app.
Helpful Resources
I had some fun with puppeteer and webpack,
playground-react-puppeteer
playground-electron-react-puppeteer-example
To keep the API running, you will need to learn a bit about backends and how to keep the server alive. See these links for a fuller understanding of creating the server and more:
Official link to puppeteer-web
Puppeteer with docker
Docker with XVFB and Puppeteer
Puppeteer with chrome extension
Puppeteer with local wsEndpoint
Avoid memory leak on server

How to pipe a data stream into a function that's assigned to a constant?

The example below from Google works, but it uses pipe. For my situation, I'm listening to a websocket that sends packets in 20 ms increments, and from what I've been able to find there is no way to pipe that data into a function.
The first argument that must be passed on initialization is a config object; after that, only data is accepted. So I set the function up as a variable, then pass the config. But I can't figure out how to pass the stream of data into it afterwards. How do I pass data into recognizeStream without using pipe? Or is there a way to use pipe with a websocket?
I can vouch for this setup working by reading and writing temporary files at certain intervals, but this has the obvious disadvantages of 1) all of that overhead and 2) most importantly, not being a real-time stream.
There are two solutions that I think would work but have not been able to implement:
There is some way to setup a pipe from websocket (This is ideal)
Simultaneously writing the data to a file while at the same time reading it back using createReadStream from a file using some implementation of fs (This seems like a minefield of problems)
tl;dr I need to send the stream of data from a websocket into a function assigned to a const as the data comes in.
Example setup from Google Docs
const Speech = require('@google-cloud/speech');

// Instantiates a client
const speech = Speech();

// The encoding of the audio file, e.g. 'LINEAR16'
const encoding = 'LINEAR16';
// The sample rate of the audio file, e.g. 16000
const sampleRate = 16000;

const request = {
  config: {
    encoding: encoding,
    sampleRate: sampleRate
  }
};

const recognizeStream = speech.createRecognizeStream(request)
  .on('error', console.error)
  .on('data', (data) => process.stdout.write(data.results));

// Start recording and send the microphone input to the Speech API
record.start({
  sampleRate: sampleRate,
  threshold: 0
}).pipe(recognizeStream);
Websocket setup
const WebSocketServer = require('websocket').server

wsServer.on('connect', (connection) => {
  connection.on('message', (message) => {
    if (message.type === 'utf8') {
      console.log(message.utf8Data)
    } else if (message.type === 'binary') {
      // send message.binaryData to recognizeStream
    }
  })
})
You should just be able to do:
recognizeStream.write(message.binaryData)
