Recently I ran into an issue with the document-upload functionality in an Expo (React Native) app: the file does not reach the backend as proper FormData(), it arrives as a strange array, nothing like what the browser sends. The same endpoint is connected to a React JS project and works fine there; the browser delivers the documents correctly. So I'm not sure what is wrong with FormData here.
I also tried react-native-fs, but it's not compatible with Expo.
Here is my code:
let formData = new FormData();
formData.append('file', doc);

let token = this.props.user.token;
let header = {
  headers: {
    'Accept': '*',
    'Authorization': 'Bearer ' + token,
    'Content-Type': 'multipart/form-data'
  }
};

let res = await axios.post(BackendURL + '/porta/files/upload/' + document.projectDocumentId, formData, header);
if (res.data.success === true) {
  alert('File uploaded successfully');
  this.props.ReloadData(true);
}

console.log(doc);
The issue has been solved. The problem was in the headers: I removed the Content-Type and Accept, like this:
let formData = new FormData();
formData.append('file', {
  name: doc.name,
  uri: doc.uri,
  type: "application/pdf"
});

this.props.loadingStatus(true);

let token = this.props.user.token;
let header = { headers: { 'Authorization': 'Bearer ' + token } };

let res = await axios.post(BackendURL + '/portal/files/upload/' + document.projectDocumentId, formData, header);
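Removing the hand-written Content-Type works because the HTTP layer then generates that header itself, including the multipart boundary; a manually set 'multipart/form-data' header carries no boundary, so the backend cannot split the body into parts. The question never shows where doc comes from; assuming it is picked with expo-document-picker, a minimal sketch of obtaining it might look like this (the exact result shape depends on your Expo SDK version):

import * as DocumentPicker from 'expo-document-picker';

// Hypothetical picker call; adjust to the result shape of your SDK version.
const result = await DocumentPicker.getDocumentAsync({ type: 'application/pdf' });
if (result.type === 'success') {
  // result.uri and result.name are what the upload code above appends.
  const doc = { uri: result.uri, name: result.name };
}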
Related
I have a React project that I run with npm start, and this code gets a 401 error from the second fetch (the first one is OK). It runs fine, returning 200, only under Node, as in node App.js.
So what would I need to do to get a 200 response when running my React project? Why is there this difference between npm and node for this request?
const clientID = <ClientID>
const clientSecret = <ClientSecret>
const encode = Buffer.from(`${clientID}:${clientSecret}`, 'utf8').toString('base64')

const requestOptions = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    'Authorization': `Basic ${encode}`,
  },
};

fetch("https://auth-nato.auth.us-east-1.amazoncognito.com/oauth2/token?grant_type=client_credentials", requestOptions)
  .then(response => { return response.json() })
  .then(data => {
    const requestOptions2 = {
      method: 'POST',
      mode: 'no-cors',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${data.access_token}`
      },
      body: '{"username":"Ana", "password":"test123","user_id":"ana#email.com"}'
    };
    fetch('https://j1r07lanr6.execute-api.sa-east-1.amazonaws.com/v1/register', requestOptions2)
      .then(response => { console.log(response) });
  })
Buffer is not available in the browser's JavaScript. Instead of
const encode = Buffer.from(`${clientID}:${clientSecret}`, 'utf8').toString('base64')
use just
const encode = btoa(`${clientID}:${clientSecret}`);
Read more about Base64 encoding on MDN.
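If the same snippet has to run both in the browser and under Node, a small helper can pick whichever encoder exists; this is just a sketch, not part of the original answer:

// Sketch: use btoa in the browser, fall back to Buffer under Node.
const toBase64 = (value) =>
  typeof btoa === 'function'
    ? btoa(value)
    : Buffer.from(value, 'utf8').toString('base64');

const encode = toBase64(`${clientID}:${clientSecret}`);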
I found out it was a CORS issue that needed to be configured correctly on the back end. My workaround was disabling Chrome's web security and removing mode: 'no-cors'. I had tried adding "Access-Control-Allow-Origin": "http://localhost:3000" to the request headers, but that doesn't work: CORS response headers have to be sent by the server, not by the client.
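For reference, a minimal sketch of the back-end side of that fix, assuming an Express server (the real back end behind the API Gateway URL is not shown in the question):

// Hypothetical Express back end that sends the CORS headers itself.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', 'http://localhost:3000');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization');
  res.setHeader('Access-Control-Allow-Methods', 'POST, OPTIONS');
  if (req.method === 'OPTIONS') return res.sendStatus(204); // answer the preflight directly
  next();
});

app.post('/v1/register', (req, res) => res.json({ ok: true }));
app.listen(8080);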
I am using an existing API call to send a file to our cloud provider via Node.js. I have seen several different methods of doing this online, but figured I would stick with fetch since most of my other API calls use it as well. At present I keep getting a 500 Internal Server Error and am not sure why. My best guess is that I am not sending the file properly, or that one of my pieces of form data is not resolving correctly. See the code below:
const fetch = require("node-fetch");
const formData = require("form-data");
const fs = require("fs");
var filePath = "PATH TO MY FILE ON SERVER WITH FILE NAME";
var accessToken = "Bearer <ACCESS TOKEN>;
var url = '<API URL TO CLOUD PROVIDER>';
var headers = {
'Content-Type': 'multipart/form-data',
'Accept': 'application/json',
'Authorization': accessToken
};
const form = new formData();
const buffer = fs.readFileSync(filePath);
const apiName = "MY_FILE_NAME";
form.append("Content-Type", "application/octect-stream");
form.append("file", filePath);
console.log(form);
fetch(url, { method: 'POST', headers: headers, body: form })
.then(response => response.json())
.then(data => {
console.log(data)
})
.catch(err => {
console.log(err)
});
This is my first time attempting something like this, so I am next to certain I am missing something. Any help getting me pointed in the right direction is appreciated.
So the issue was exactly what I mentioned above: the code was not uploading the file I specified. I finally figured out why, and below is the modified code which grabs the file and uploads it to our cloud service provider:
const fetch = require("node-fetch");
const formData = require("form-data");
const fs = require("fs");
var apiName = process.env['API_PATH'];
var accessToken = "Bearer" +" "+ process.env['BEARER_TOKEN'];
var url = process.env['apiEnv'] +"/" +"archive";
var headers = {
'Accept': 'application/json',
'Authorization': accessToken,
};
const form = new formData();
const buffer = fs.readFileSync(apiName);
const uploadAPI = function uploadAPI() {
form.append("Content-Type", "application/octet-stream");
form.append('file', buffer);
fetch(url, {method: 'POST', headers: headers, body: form})
.then(data => {
console.log(data)
})
.catch(err => {
console.log(err)
});
};
uploadAPI();
Being new to JavaScript/Node.js, I wasn't really sure what the buffer variable did. After finally figuring it out, I realized I was adding too many form params to the request body, so the file was not being picked up and sent to the provider. All the code above uses custom variables, but if for whatever reason someone wants to use it, simply replace the custom variables with your own. Thanks again for any and all assistance.
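One detail worth calling out: when the request body is a form-data instance, the multipart Content-Type (including its boundary) can be taken from form.getHeaders() instead of being written by hand. A minimal sketch, assuming node-fetch and the form-data package, with a placeholder file path and URL:

// Sketch: let form-data supply the multipart boundary itself.
const fetch = require('node-fetch');
const FormData = require('form-data');
const fs = require('fs');

const form = new FormData();
form.append('file', fs.createReadStream('./report.pdf')); // placeholder path

fetch('https://example.com/archive', {                     // placeholder URL
  method: 'POST',
  headers: { ...form.getHeaders(), Accept: 'application/json' },
  body: form,
})
  .then(res => res.json())
  .then(console.log)
  .catch(console.error);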
import fs from 'fs';
import axios from 'axios';
import FormData from 'form-data';

const fileStream = fs.createReadStream('./file.zip');

const form = new FormData();
form.append('key', fileStream, 'file.zip');

const response = await axios.post(url, form, {
  headers: {
    ...form.getHeaders(),
  },
});
I'm trying to send an image through an axios POST request. The request goes through, but the image does not get uploaded.
Here is my code:
const screenshotPath = path.join(os.tmpdir(), 'screenshot.png');

var bodyFormData = new FormData();
//bodyFormData.append('uploadedFile', screenshotPath);
bodyFormData.append('uploadedFile', fs.createReadStream(screenshotPath));

axios({
  method: 'post',
  url: url,
  data: bodyFormData,
  config: {
    headers: {
      'Content-Type': 'multipart/form-data',
      Authorization: 'Bearer ' + token
    }
  }
})
Is it because of the file path? This is my screenshotPath:
C:\Users\oem\AppData\Local\Temp\screenshot.png
You are using the createReadStream function from Node's File System module, but Node runs on the server side, and here you are working with a React application that runs on the client side.
Please check the MDN documentation on how to upload files from front-end applications.
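Assuming the file really does have to come from the browser, a minimal client-side sketch would read it from an <input type="file"> element instead of the file system (the element selector, url and token here are placeholders):

// Sketch: browser-side upload via a file input, no Node fs involved.
const input = document.querySelector('input[type="file"]'); // hypothetical input element
const file = input.files[0];

const bodyFormData = new FormData();
bodyFormData.append('uploadedFile', file);

axios.post(url, bodyFormData, {
  headers: { Authorization: 'Bearer ' + token }, // let the browser set the multipart boundary
});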
I am trying to upload a file from a React front end to a C# back end. I use Dropzone to get the file and then call an API helper to post it, but I get different errors depending on what I try, and I am unsure exactly what the headers should be and exactly what I should send. If I do not set the Content-Type, I get a 415 (Unsupported Media Type) error; if I specify multipart/form-data, I get a 500 Internal Server Error, and I get the same 500 when the Content-Type is application/json. The URL is passed in and I am certain it is correct. I am also unsure whether the file should be appended as file[0][0], as I have done, or as file[0] since it is an array, but I believe it should be the former. Any suggestions welcome :) Here is my API post helper code:
export const uploadAdminFile = (file, path, method = 'POST', resource = config.defaultResource) => {
  const url = createUrl(resource, path);
  const data = new FormData();
  data.append('file', file[0][0]);
  data.append('filename', file[0][0].name);

  const request = accessToken =>
    fetch(url, {
      method,
      mode: 'cors',
      withCredentials: true,
      processData: false,
      headers: {
        Accept: 'application/json',
        'Content-Type': 'application/json', //'multipart/form-data',
        Authorization: `Bearer ${accessToken}`,
      },
      body: data,
    })
      .then(res => res.json())
      .then(success => console.log('API HELPER: file upload success: ', success))
      .catch(err => console.log('API HELPER: error during file upload: ', err));

  return sendRequest(request, resource);
};
Thanks for the help and suggestions; it turned out to be a back-end issue, but even so I learned a lot in the process. I will post my working code here in case anyone comes across this and finds it useful.
export const uploadAdminFile = (file, path, resource = config.defaultResource) => {
  const url = createUrl(resource, path);
  const formData = new FormData();
  formData.append('file', file[0][0]);
  formData.append('filename', file[0][0].name);

  const request = accessToken =>
    fetch(url, {
      method: 'POST',
      headers: {
        Accept: 'application/json',
        Authorization: `Bearer ${accessToken}`,
      },
      body: formData,
    });

  return sendRequest(request, resource);
};
As mentioned, the file name does not need to be sent separately and could be omitted. I index the file this way because I get it from Dropzone as an array and I only want a single file (the first one in the array). I hope this helps someone else out; the MDN fetch docs (good information) and a good article on using fetch with FormData are worth a look.
I am trying to upload files to Google Drive in Angular 2. So far I am able to upload files, but they have no title and end up as "Untitled".
Here is the code that does the upload:
gDriveUploader(file): Promise<any> {
  let authToken = tokenHere;
  const url = `https://www.googleapis.com/upload/drive/v2/files/`;

  let formData: FormData = new FormData();
  formData.append('title', file, file.name);

  let headers = new Headers({
    'Authorization': 'Bearer ' + authToken
  });
  headers.append('Accept', file.type);

  let options = new RequestOptions({
    headers: headers,
  });

  console.log('OPTIONS: ', options);

  return this.http.post(`${url}`, formData, options)
    .toPromise()
    .then(response => response.json())
    .catch(e => console.log(e));
}
I know that in order to send metadata with the file, I have to add the metadata to the request body and use the multipart or resumable upload type. But here I ran into issue after issue and just can't get it right; I'm completely mixed up. Here is one of my approaches, using the resumable upload type:
gDriveUploader(file): Promise<any> {
  let authToken = token;
  const url = `https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable`;
  console.log('FILE TO UPLOAD: ', file);

  let formData: FormData = new FormData();
  formData.append('name', file, file.name);

  let headers = new Headers({
    'Authorization': 'Bearer ' + authToken,
    'Content-Type': 'application/json; charset=UTF-8', // if removed, "Bad Content" error
    //'Content-Length': file.size, not sure if this one is right?
  });

  let options = new RequestOptions({
    headers: headers,
  });

  return this.http.post(`${url}`, formData, options)
    .toPromise()
    .then(response => response.json())
    .catch(e => console.log(e));
}
And those are only two of my approaches...
According to the Drive API documentation, a resumable upload session is initiated like this:

POST https://www.googleapis.com/drive/v3/files?uploadType=resumable HTTP/1.1
Authorization: Bearer [YOUR_AUTH_TOKEN]
Content-Length: 38
Content-Type: application/json; charset=UTF-8
X-Upload-Content-Type: image/jpeg
X-Upload-Content-Length: 2000000
What is Content-Length: 38 here? Is it the file length, and can I just use file.size?
With multipart, I can't figure out how to add the boundary separator to the request.
I saw some questions and answers saying multipart was not supported by Angular, but those were one or two years old; what about now?
Can I somehow do a resumable upload to Google Drive with additional file metadata using standard Angular features?
So, after a bit more research into how the API works, I came up with the following solution for resumable file upload. The main idea is that the first request only sets the metadata for the file, and the response contains the link to which the file itself should be uploaded; this link arrives in a response header called location.
Here is fully working code; just pass a File object to the first function.
I quickly made two functions for this: the first sets the metadata (just the name) and then calls the second, which uploads only the binary data.
gDriveUploader(file): Promise<any> {
  let authToken = token;
  const url = `https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable`;

  let headers = new Headers({
    'Authorization': 'Bearer ' + authToken,
    'Content-Type': 'application/json; charset=UTF-8',
  });
  let options = new RequestOptions({
    headers: headers,
  });

  return this.http.post(`${url}`, { name: file.fullName }, options) // just set the name
    .toPromise()
    .then(response => this.gDriveUploadFile(file, response.headers.get('location'))) // call the second function to upload `file` to the proper URI from the response
    .then(response => {
      let id = response.json().id; // parse the id of the uploaded file
      let resp = { fileName: file.fullName, fileType: file.fileType, fileSize: file.size, fileId: id }; // create an object with the file properties, if you need that
      return resp; // return the object to whatever called this service
    })
    .catch(e => console.log(e));
}
Second function to upload data:
gDriveUploadFile(file, url): Promise<any> { // file and url come from the first function
  let authToken = token;
  let headers = new Headers({
    'Authorization': 'Bearer ' + authToken,
    'Content-Type': 'application/json; charset=UTF-8',
    'X-Upload-Content-Type': file.type
  });
  let options = new RequestOptions({
    headers: headers,
  });

  return this.http.post(`${url}`, file, options) // call the resumable upload endpoint and pass just the file as the body
    .toPromise();
}
The solution may not be ideal; so far I don't handle errors and I don't use resumable features like uploading in chunks, I just upload the file in one go. But hopefully anyone else stuck with Google Drive uploads can get the idea from this.
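For anyone who does need the chunked variant, here is a rough sketch of what uploading against the same session URI could look like, following the documented resumable protocol (PUT each chunk with a Content-Range header; the chunk size is assumed to be a multiple of 256 KiB). This is not part of the original answer and uses fetch rather than Angular's Http:

// Sketch only: upload `file` to the resumable session URI in chunks.
async uploadInChunks(file: File, sessionUrl: string, chunkSize = 4 * 256 * 1024): Promise<any> {
  let offset = 0;
  let lastResponse: Response | null = null;
  while (offset < file.size) {
    const end = Math.min(offset + chunkSize, file.size);
    lastResponse = await fetch(sessionUrl, {
      method: 'PUT',
      headers: { 'Content-Range': `bytes ${offset}-${end - 1}/${file.size}` },
      body: file.slice(offset, end),
    });
    // 308 ("Resume Incomplete") means the chunk was accepted; anything else non-OK is a failure.
    if (lastResponse.status !== 308 && !lastResponse.ok) {
      throw new Error(`Chunk upload failed with status ${lastResponse.status}`);
    }
    offset = end;
  }
  return lastResponse!.json(); // the final response carries the uploaded file's metadata (including the id)
}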