React Native - Unable to Upload Image Via Axios Post (Android) - javascript

I'm unable to upload an image from my React Native application in Android (iOS works fine). On attempting to upload, I'm receiving the following error:
Error: Network Error
at createError (buildURL.js:33)
at XMLHttpRequest.handleError (xhr.js:144)
at XMLHttpRequest.dispatchEvent (event-target.js:172)
at XMLHttpRequest.setReadyState (XMLHttpRequest.js:576)
at XMLHttpRequest.__didCompleteResponse (XMLHttpRequest.js:392)
at XMLHttpRequest.js:505
at RCTDeviceEventEmitter.emit (EventEmitter.js:190)
at MessageQueue.__callFunction (MessageQueue.js:344)
at MessageQueue.js:107
at MessageQueue.__guard (MessageQueue.js:291)
All other requests work; the connection is configured with my local static IP address, as I'm testing on a physical device rather than in a simulator.
I've already looked at a number of solutions:
https://github.com/facebook/react-native/issues/9506
Suggests adding a type field to the data, which I already have (detailed below).
https://github.com/facebook/react-native/issues/10404
Suggests using an IP address instead of localhost, which I have been doing from the start.
Here is the code that handles this:
ProfileImage.js (wrapper for react-native-image-picker):
async onHandleResizedImageUri(image) {
  var data = new FormData();
  data.append("profile_image", {
    uri: image.path,
    name: image.name,
    type: "image/jpeg"
  });
  let imageUploadSuccess = await global.api.callPostImage("/api/profile-image", data);
}
Api.js
async callPostImage(url, data) {
  try {
    const settings = {
      baseURL: config.api_host,
      timeout: 1000,
      headers: {
        "content-type": "multipart/form-data"
      }
    };
    const response = await this.axios.post(url, data, settings);
    return response;
  } catch (error) {
    console.log(error);
  }
}
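One commonly suggested check for this exact Android error is the uri scheme: some pickers return a bare filesystem path, and Android needs an explicit file:// prefix, while iOS pickers usually supply one already. A minimal sketch (the helper name is hypothetical, not from my code):

```javascript
// Hypothetical helper: normalize the picker result before appending it to
// FormData. Assumption: image.path is a local filesystem path on Android.
function buildImagePart(image) {
  const uri = image.path.startsWith('file://')
    ? image.path
    : 'file://' + image.path;
  return { uri, name: image.name, type: 'image/jpeg' };
}

// data.append("profile_image", buildImagePart(image));
```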
Also, this issue is happening in both debug and release mode, and for multiple servers (local, staging and production). All other network requests work fine, but this one will not complete. Anyone have any suggestions?
Device information:
Samsung Galaxy A5 (2017)
Model SMA520W
Version 8.0.0

Related

Netlify lambda function working locally but not in production

I'm trying to use Netlify and its lambda function feature to run a Node function. Based on https://css-tricks.com/using-netlify-forms-and-netlify-functions-to-build-an-email-sign-up-widget/, I have the following in functions/submission-created.js:
const https = require("https");

exports.handler = async event => {
  const email = JSON.parse(event.body).payload.data.EMAIL
  const asking = JSON.parse(event.body).payload.data.ASKING
  var formData = {
    'email': email,
    'first_name': '',
    'last_name': asking,
    'lists[]': 'NUM'
  };
  var encoded = Object.entries(formData).map(([k, v]) => `${k}=${encodeURIComponent(v)}`).join("&");
  var endpoint = 'https://api.sendfox.com/contacts/?' + encoded;
  const data = JSON.stringify(formData);
  const options = {
    method: 'POST',
    connection: 'keep-alive',
    headers: {
      'Authorization': 'Bearer hhhhh',
      'Content-Type': 'application/json',
    },
    'content-length': data.length,
  };
  console.log(email);
  const req = https.request(endpoint, options, (res) => {
    console.log('statusCode:', res.statusCode);
    console.log('headers:', res.headers);
    res.on('data', (d) => {
      console.log(d);
    });
  });
  req.on('error', (e) => {
    console.error(e);
  });
  req.write(data);
  req.end();
  return {
    statusCode: 200,
    body: data
  };
}
This works as expected when I run it locally with netlify dev, but when pushed to the github repo used by netlify to build the site, it does not work in production. How can I fix this?
The package structure looks like the screenshot:
EDIT:
The netlify.toml:
[build]
functions = "./functions"
No errors. The output on the site's Functions tab is:
8:57:43 PM: 2020-12-08T01:57:43.384Z undefined INFO to here
8:57:43 PM: 2020-12-08T01:57:43.390Z 8ca26edc-1f20-4c20-b038-
79ecdf206d92 INFO yt#ghj.org
8:57:43 PM: 2020-12-08T01:57:43.390Z 8ca26edc-1f20-4c20-b038-
79ecdf206d92 INFO 999
8:57:43 PM: 2020-12-08T01:57:43.390Z 8ca26edc-1f20-4c20-b038-
79ecdf206d92 INFO yt#ghj.org
8:57:43 PM: Duration: 39.71 ms Memory Usage: 69 MB Init Duration:
176.22 ms
As noted, the information you give is insufficient to pinpoint what has gone wrong. However, I will try to answer based on what I have.
In the tutorial, I noticed the usage of the dotenv package.
The package is used to set up different configurations for different environments.
It uses environment-specific files, allowing you, for example, to have a .env file for local development and a different one (e.g., .env.production) for production.
Based on the setup, the variables defined in the respective .env file for each environment are loaded into the process.env object.
Now, in the tutorial, crucial variables such as EMAIL_TOKEN are loaded from .env. I suspect that your setup expects a separate dotenv file for production; by not finding it, it silently loads the required parameters as empty. Please check which environment file is loaded and what your configuration is in each environment.
Also, consider the following tutorial for working with dotenv vars.
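To make that failure loud instead of silent, the token read can be guarded. A small sketch (EMAIL_TOKEN is the variable name from the tutorial; the guard function itself is hypothetical):

```javascript
// Hypothetical guard: read the token from process.env (where dotenv puts it)
// and fail fast if the current environment never defined it.
function getEmailToken(env = process.env) {
  const token = env.EMAIL_TOKEN;
  if (!token) {
    throw new Error('EMAIL_TOKEN is not set for this environment');
  }
  return token;
}
```

Failing fast this way turns "silently sends an empty Authorization header" into an explicit error in the function logs.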
This could be because of incorrect settings on the API Gateway. Check that all traffic is allowed; sometimes only TCP traffic is allowed. Change it to all traffic.
It could also be that you have a malformed lambda response. Check this AWS documentation:
Resolve malformed 502 API Gateway errors

Electron upload with progress

I have an Electron app which is able to upload very big files to the server via HTTP in the renderer process without user input. I decided to use axios as my HTTP client, and it was able to report upload progress, but with this I met a few problems.
Browser JavaScript and Node.js aren't "friendly" with each other in some respects. I used the fs.createReadStream function to get the file, but axios does not understand what a ReadStream object is, and I can't pipe this stream into the FormData I should place my file in (there are several topics on their GitHub issues tab, but nothing has been done about it so far).
I ended up using fs.readFileSync and then the form-data module with its getBuffer() method, but now my file is loaded entirely into memory before the upload, and given how big my files are, it kills the Electron process.
Googling, I found out about the request library, which in fact is able to pipe a stream to a request, but it's deprecated and no longer supported, and apparently I can't get upload progress from it.
I'm running out of options. How do you upload files with Electron without user input (so without a file input), without loading them into memory upfront?
P.S. On the form-data GitHub page there is a piece of code explaining how to upload a file stream with axios, but it doesn't work; nothing is sent, and downgrading the library as one issue suggested didn't help either:
const form = new FormData();
const stream = fs.createReadStream(PATH_TO_FILE);
form.append('image', stream);

// In a Node.js environment you need to set the boundary in the 'Content-Type' header by calling `getHeaders`
const formHeaders = form.getHeaders();

axios.post('http://example.com', form, {
  headers: {
    ...formHeaders,
  },
})
  .then(response => response)
  .catch(error => error)
I was able to solve this, and I hope it helps anyone facing the same problem.
Since request is deprecated, I looked for alternatives and found got for Node.js HTTP requests. It supports Stream, fs.ReadStream, etc.
You will need form-data as well; it allows you to put streams inside a FormData and assign them to a key.
The following code solved my problem:
import fs from 'fs'
import got from 'got'
import FormData from 'form-data'

const stream = fs.createReadStream('some_path')

// NOT native FormData
const formData = new FormData()
formData.append('file', stream, 'filename');

try {
  const res = await got.post('https://my_link.com/upload', {
    body: formData,
    headers: {
      ...formData.getHeaders() // sets the boundary and the Content-Type header
    }
  }).on('uploadProgress', progress => {
    // here we get our upload progress; progress.percent is a float from 0 to 1
    console.log(Math.round(progress.percent * 100))
  });

  if (res.statusCode === 200) {
    // upload success
  } else {
    // error handler
  }
} catch (e) {
  console.log(e);
}
Works perfectly in Electron renderer process!
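For display purposes, got's progress object (documented as { percent, transferred, total }, where total may be undefined) can be turned into a readable string. A small sketch; the helper name is my own:

```javascript
// Hypothetical formatter for got's uploadProgress events.
// Assumption: percent is a float from 0 to 1, total may be undefined
// when the size isn't known up front.
function formatProgress({ percent, transferred, total }) {
  const pct = Math.round(percent * 100);
  return total ? `${pct}% (${transferred}/${total} bytes)` : `${transferred} bytes`;
}
```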

Read/write and store data internally in a local app (no server) with JavaScript

So I am making a local app using JavaScript, React and Electron, and I want it to work just fine without internet.
I can't use localStorage because the data might get deleted if the user clears the cache.
I tried reading/writing using different modules; none of them worked, mostly because of CORS. Using XMLHttpRequest and Ajax doesn't work either, and I am running out of time.
When I use them on the test server, they return the index.html for the main page (they can at least access that... and still they can't read the data), but when I try it on the build I get the CORS error.
My idea for now is to enable CORS on my page, since I have no worries about security: the app will run ONLY offline, so there is no danger.
But after many hours I haven't found a way to do it on the client side.
If anyone has an idea or suggestion I would be grateful.
I tried: fs, FileReader, FileSaver, $.ajax, XMLHttpRequest
// using $.ajax
var test = $.ajax({
  crossDomain: true,
  type: 'GET',
  url: '../data/DefaultCategorie.txt',
  contentType: 'text/plain',
  success: function (data) {
    console.log(data);
  },
  error: function () {
    alert('failed');
  },
})

// using fs
fs.readFile('../data/DefaultCategorie.txt', 'utf8', (err, data) => {
  if (err) {
    console.log("Failed");
    throw err
  }
  console.log(data);
  fs.close(data, (err) => {
    if (err) throw err;
  });
});
This article covers the three most common ways to store user data: How to store user data in Electron
The Electron API for appData does what you want. It is very easy to use.
From the above article:
const userDataPath = (electron.app || electron.remote.app).getPath('userData');
this.path = path.join(userDataPath, opts.configName + '.json')
this.data = parseDataFile(this.path, opts.defaults);

function parseDataFile(filePath, defaults) {
  try {
    return JSON.parse(fs.readFileSync(filePath));
  } catch (error) {
    // if there was some kind of error, return the passed-in defaults instead
    return defaults;
  }
}
Docs
app.getPath(name)
name String
Returns String - A path to a special directory or file associated with
name. On failure, an Error is thrown.
You can request the following paths by the name:
appData - Per-user application data directory, which by default points to:
%APPDATA% on Windows
$XDG_CONFIG_HOME or ~/.config on Linux
~/Library/Application Support on macOS
userData - The directory for storing your app's configuration files,
which by default it is the appData directory appended with your app's
name.

Piping (big) CSV download from Cloudant noSQL database through Node.js in IBM Cloud not working in production environment

I developed an application in Node.js, and I need to download raw data in CSV format from a noSQL database in the cloud (Cloudant noSQL on IBM Cloud). Cloudant allows me to download all data from a database through an API. I want the user to be able to download this same file, but through my Node.js app. What I did is pipe the response from the database API to the client's response. This works just fine when I do it locally, but when I upload the application to IBM Cloud and try to download the same file (40 MB), it never downloads (though it does work with small files, e.g. 5 MB).
I've tried 3 different approaches, one with request module, and two other with the https module.
1. Request Module
request({
  url: database.credentials.url + path,
  method: 'GET'
}).pipe(res);
2. Https Module (1st try)
res.setHeader('content-Type', 'text/csv');
res.setHeader('transfer-encoding', 'chunked');
res.setHeader('strict-transport-security', 'max-age=31536000');

https.get(database.credentials.url + path, (csv_res) => {
  console.log('Download raw data db headers');
  console.log(csv_res.headers);
  csv_res.on('data', (d) => {
    res.write(d);
    process.stdout.write(".");
  });
  csv_res.on('end', () => {
    res.end();
  });
}).on('error', (e) => {
  console.error(e);
});
3. Https Module (2nd try)
var options = {
  hostname: database.credentials.host,
  port: 443,
  path: path,
  method: 'GET',
  headers: {
    'Authorization': 'Basic ' + new Buffer(database.credentials.username + ':' + database.credentials.password).toString('base64')
  }
};

var proxy = https.request(options, function (csv_res) {
  console.log(csv_res.headers)
  res.writeHead(csv_res.statusCode, csv_res.headers)
  csv_res.pipe(res, {
    end: true
  }).on('error', (e) => {
    console.log("ERROR piping to res: " + e)
  })
});

req.pipe(proxy, {
  end: true
}).on('error', (e) => {
  console.log("ERROR piping from req: " + e)
})
When I run the app locally, the file is downloaded, but when I do it in the cloud, the file never downloads, and after a while the browser shows a network error. Why is this happening?
Check the monitoring dashboard while you do this; you might be getting rate limited. I'm also posting a utility that has export functionality for Cloudant: https://github.com/glynnbird/couchimport.
If you have further questions or concerns, just drop us a line through IBM Cloud support and we can help you out!

Cordova file transfer plugin using https not working for windows

I am currently using the Cordova file transfer plugin to download a file and save it locally. I am using HTTPS, and the server certificate is installed on the device.
It works on iOS and Android, but it does not work on Windows.
I was able to debug into the plugin's code, and it errors in this part with the message 'A security problem occurred' and then returns to the application with FTErr.CONNECTION_ERR:
var downloadOperation = download.startAsync();

// update internal TransferOperation object with newly created promise
fileTransferOps[downloadId].promise = downloadOperation;

downloadOperation.then(function () { ... }, function (error) {
  if (error.message === 'Canceled') {
    resolve(new FTErr(FTErr.ABORT_ERR, source, target, null, null, error));
  } else if (error && error.number === HTTP_E_STATUS_NOT_MODIFIED) {
    resolve(new FTErr(FTErr.NOT_MODIFIED_ERR, source, target, 304, null, error));
  } else {
    // otherwise, try to get the response property
    var response = download.getResponseInformation();
    if (!response) {
      resolve(new FTErr(FTErr.CONNECTION_ERR, source, target));
    }
  }
});
This is my code in cordova:
fileTransfer.download(uri, fileURL, function (entry) {
  console.log('file download successful');
}, function (errorMsg) {
  console.log(errorMsg);
}, false, {
  headers: { "Authorization": authToken },
});
Is there anything I am missing to make this work on Windows?
Just to add, downloading over HTTP was working.
The HTTPS certificate is also valid, since an Ajax GET worked.
Thanks!
According to MSDN, client certificates are not supported by the BackgroundDownloader:
https://social.msdn.microsoft.com/Forums/en-US/c25146d2-c051-4367-9745-2b526618dc35/winjsxhr-and-httpclient-work-with-client-certificates-backgrounddownloader-doesnt?forum=winappswithhtml5
So I guess the only way for now would be to create a plugin that uses HttpClient.
