How to write a file with a buffer array using the PUT method? - javascript

Is there a way to write a file using a buffer array and a content type with the PUT method?
requestify.request('some url', {
    method: 'PUT',
    body: buffArray, // need modifications here
    headers: {
        'Content-Type': res_file.headers['content-type']
    }
}).then(function (res) {
    console.log(res);
});
I can send the data, but the file is not being stored properly.
Working Java code:
httpcon.setRequestMethod("PUT");
httpcon.setReadTimeout(100000);
httpcon.setDoOutput(true);
httpcon.setRequestProperty("Content-Type", contentType);
httpcon.connect();
OutputStream os = httpcon.getOutputStream();
os.write(in.toByteArray(), 0, in.size());
responseCode = httpcon.getResponseCode();
httpcon.disconnect();

My personal advice here is to use the builtin http or https package from Node.js.
Why? Because you want to write and read binary files that might be large enough to cause problems, and from what I've tested with requestify, it will give you trouble with binary responses (it stringifies them!).
You can simply use streams which will save you lots of headaches.
You can test it using this, for example:
const fs = require('fs');
const https = require('https');

const req = https.request({
    host: 'raw.githubusercontent.com',
    path: '/smooth-code/svgr/master/resources/svgr-logo.png',
    method: 'GET'
}, res => {
    res.pipe(fs.createWriteStream('test.png'));
});
req.end();
and adapted to your provided code:
const fs = require('fs');
const https = require('https');

const req = https.request({
    host: 'some-host',
    path: '/some/path',
    method: 'PUT',
    headers: {
        'Content-Type': res_file.headers['content-type']
    }
}, res => {
    res.pipe(fs.createWriteStream('your-output-file.blob'));
});
// If the body comes from disk or from another request, I would recommend using .pipe() here as well
req.write(buffArray);
req.end();
Further info:
http package https://nodejs.org/api/http.html
fs package https://nodejs.org/api/fs.html

Related

Multiple asynchronous POST requests with files in Node.js

I've tried to send a bunch of POST requests with file uploads in Node.js.
Using axios.post, I could make a single request easily, but I got an error when trying to send multiple asynchronous requests.
Based on the axios documentation, axios.all([axios.get(), axios.get(), ...]) makes concurrent requests.
When I run my code, the error says:
"Error: Request failed with status code 500 ~ "
This is the same error I get when I send a request without a file upload, so I guess my code doesn't attach the file when I send async requests.
Please advise me on what I am missing.
My code is below:
var axios = require('axios');
var FormData = require('form-data');
var fs = require('fs');

var data = new FormData();
data.append('files', fs.createReadStream('./liscense.jpg'));

var config = {
    method: 'post',
    url: 'https://domainname/scan/id',
    headers: {
        ...data.getHeaders()
    },
    data: data
};

axios
    .all([axios(config), axios(config)])
    .then(
        axios.spread((res1, res2) => {
            console.log(res1.data, res2.data);
        })
    )
    .catch((error) => {
        console.log(error);
    });
Your problem is that you are sending an empty stream.
The form data keeps a "_streams" array that contains the read stream for your "liscense.jpg" file. The first POST to the target host consumes that stream, so it is empty for all of your other requests and the file never reaches the destination.
In short, this code only sends your file with the first request; the other requests do not include your file(s).
You can try this:
const axios = require('axios');
const FormData = require('form-data');
const fs = require('fs');

function postRequest() {
    const data = new FormData();
    data.append('files', fs.createReadStream('./liscense.jpg'));
    const config = {
        method: 'post',
        url: 'https://domainname/scan/id',
        headers: {
            ...data.getHeaders()
        },
        data: data
    };
    return config;
}

axios
    .all([axios(postRequest()), axios(postRequest())])
    .then(
        axios.spread((res1, res2) => {
            console.log(res1.data, res2.data);
        })
    )
    .catch((error) => {
        console.log(error);
    });

CURL Command to NodeJS - PUT Request OctetStream

I need to turn this curl command into a Node.js request, using the fetch, request, or axios networking library.
The request needs to PUT the file data to the URL:
curl -v -H "Content-Type:application/octet-stream" --upload-file C:/Users/deanv/Pictures/test3.mp4 "https://www.example.com/file/upload"
I've tried using CurlConverter to convert my curl command to Node code, but this doesn't work as I can't add the file data.
var fetch = require('node-fetch');

fetch('https://www.example.com/file/upload', {
    method: 'PUT',
    headers: {
        'Content-Type': 'application/octet-stream'
    }
});
All help appreciated. Thanks in advance.
Try this:
// setup modules
const fetch = require('node-fetch');
const FormData = require('form-data');
const fs = require('fs');
const path = require('path');

// setup paths
const pathToFile = 'C:/Users/deanv/Pictures/test3.mp4';
const uploadUrl = 'https://www.example.com/file/upload';

// create form; the 'content-type' header will be multipart/form-data
const form = new FormData();

// read the file as a stream
const fileStream = fs.createReadStream(path.resolve(pathToFile));

// add the file to the form
form.append('my_file', fileStream);

fetch(uploadUrl, {
    method: 'PUT',
    body: form
}).then((res) => {
    console.log('done: ', res.status);
    return res.text();
}).then((res) => {
    console.log('raw response: ', res);
}).catch(err => {
    console.log('err', err);
});

How to call Apollo-Graphql Upload Mutation from Node.js via Axios Request

I have a working Apollo-GraphQL Node.js server running with Express middleware. Queries and mutations, including the file upload mutation, work fine from connected front-end clients, and they also work when called from functions run in Node and passed via axios requests, except for the file upload mutations.
I've tested the same query and file paths in Firecamp, and tried variations of passing and checking the file path, doing my best to confirm that the directory structure is getting parsed accurately. This is the error returned with the axios response (I broke up the response for config.data from the console output):
response: {
    status: 400,
    statusText: 'Bad Request',
    headers: {
        'x-powered-by': 'Express',
        'access-control-allow-origin': '*',
        'content-type': 'application/json; charset=utf-8',
        'content-length': '1424',
        etag: 'W/"590-jNRKeEwYD1b3Cxa/bjf3qp7npHg"',
        date: 'Fri, 11 Dec 2020 22:36:12 GMT',
        connection: 'close'
    },
    config: {
        url: 'http://localhost:4002/graphql',
        method: 'post',
        data: '{
            "query":"mutation singleUpload($file: Upload!) {
                singleUpload(file: $file) {
                    filename
                    mimetype
                    encoding
                }
            }",
            "variables":{"file":"../bulkImports/testPenThumbnail.png"}
        }',
The query definition and function call to axios:
const UPLOAD_FILE = gql`
mutation singleUpload($file: Upload!) {
singleUpload(file: $file) {
filename
mimetype
encoding
}
}
`
export function uploadFile(endpoint) {
const file = '../bulkImports/testPenThumbnail.png';
axios.post(endpoint, {
query: print(UPLOAD_FILE),
variables: { file } })
.then(res => console.dir(res.data))
.catch(err => console.error(err));
}
And the resolver for singleUpload:
singleUpload(parent, args) {
    return args.file.then(file => {
        const { createReadStream, filename, mimetype, encoding } = file;
        const stream = createReadStream();
        const pathName = join(__dirname, `../../testUploads/${filename}`);
        stream.pipe(createWriteStream(pathName));
        return {
            filename,
            mimetype,
            encoding,
        };
    });
}
From other errors/debugging along the way, my best guess is that the upload mutation is only seeing the file path as an ordinary String and not parsing it as an Upload scalar -- and that I should be looking at using the fs module to send more in the way of file object data/stream? I've tried a few things using fs methods, but node/back-end is still pretty new to me and I'm not really sure if I'm even barking up the right tree for how the Upload scalar is constructed.
Of course I'm happy to post any more config or error details that would help -- and thanks in advance to everyone who can help me make sense of this or improve the code above!
(oh, and the intended use-case for calling this from a server will be for bulk uploading records to populate a new db collection; besides just trying to learn more about back-end/node/axios/graphql basics...)
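For what it's worth, the guess above looks right: axios.post with a JSON payload sends the path as a plain string, whereas graphql-upload (which Apollo Server uses) expects a multipart/form-data request shaped per the GraphQL multipart request spec: an `operations` field, a `map` field, then the file parts. As a sketch of that wire format using only builtins (a Buffer stands in for the real PNG bytes; in practice the form-data package plus fs.createReadStream assembles this for you):

```javascript
// Sketch of the wire format expected by graphql-upload (the GraphQL
// multipart request spec). A Buffer stands in for the real PNG bytes.
const boundary = '----upload-sketch';
const operations = JSON.stringify({
    query: 'mutation singleUpload($file: Upload!) { singleUpload(file: $file) { filename mimetype encoding } }',
    variables: { file: null } // null placeholder; "map" points the file part here
});
const map = JSON.stringify({ 0: ['variables.file'] });

const textPart = (name, value) =>
    `--${boundary}\r\nContent-Disposition: form-data; name="${name}"\r\n\r\n${value}\r\n`;

const body = Buffer.concat([
    Buffer.from(textPart('operations', operations)),
    Buffer.from(textPart('map', map)),
    Buffer.from(`--${boundary}\r\nContent-Disposition: form-data; name="0"; filename="testPenThumbnail.png"\r\nContent-Type: image/png\r\n\r\n`),
    Buffer.from('stand-in for the PNG bytes'), // in practice: fs.readFileSync / a read stream
    Buffer.from(`\r\n--${boundary}--\r\n`)
]);
// POST `body` to http://localhost:4002/graphql with the header:
// 'Content-Type': `multipart/form-data; boundary=${boundary}`
```

The key point is that the server receives the file's bytes as a multipart file part, not its path as a string, which is why the Upload scalar never resolves in the axios/JSON version.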

Uploading File via API Using NodeJS 'fetch'

I am using an existing API call to send a file to our cloud provider via Node.js. I have seen several different methods of doing this online, but figured I would stick with "fetch", as most of my other API calls use it as well. Presently, I keep getting a 500 Internal Server Error and am not sure why. My best guess is that I am not sending the file properly, or that one of my pieces of form data is not resolving correctly. See the code below:
const fetch = require("node-fetch");
const formData = require("form-data");
const fs = require("fs");

var filePath = "PATH TO MY FILE ON SERVER WITH FILE NAME";
var accessToken = "Bearer <ACCESS TOKEN>";
var url = '<API URL TO CLOUD PROVIDER>';
var headers = {
    'Content-Type': 'multipart/form-data',
    'Accept': 'application/json',
    'Authorization': accessToken
};

const form = new formData();
const buffer = fs.readFileSync(filePath);
const apiName = "MY_FILE_NAME";

form.append("Content-Type", "application/octect-stream");
form.append("file", filePath);
console.log(form);

fetch(url, { method: 'POST', headers: headers, body: form })
    .then(response => response.json())
    .then(data => {
        console.log(data)
    })
    .catch(err => {
        console.log(err)
    });
This is my first time attempting something like this, so I am next to certain I am missing something. Any help getting me in the right direction is appreciated.
So the issue was exactly what I mentioned above: the code was not uploading the file I specified. I finally figured out why, and below is the modified code, which will grab the file and upload it to our cloud service provider:
const fetch = require("node-fetch");
const formData = require("form-data");
const fs = require("fs");

var apiName = process.env['API_PATH'];
var accessToken = "Bearer" + " " + process.env['BEARER_TOKEN'];
var url = process.env['apiEnv'] + "/" + "archive";
var headers = {
    'Accept': 'application/json',
    'Authorization': accessToken,
};

const form = new formData();
const buffer = fs.readFileSync(apiName);

const uploadAPI = function uploadAPI() {
    form.append("Content-Type", "application/octet-stream");
    form.append('file', buffer);
    fetch(url, { method: 'POST', headers: headers, body: form })
        .then(data => {
            console.log(data)
        })
        .catch(err => {
            console.log(err)
        });
};

uploadAPI();
Being new to JavaScript/Node.js, I wasn't really sure what the "buffer" variable did. After finally figuring it out, I realized I was adding too many body form params to the request, so the file was not being picked up and sent to the provider. All the code above uses custom variables, but if for whatever reason someone wants to use it, simply replace the custom variables with your own. Thanks again for any and all assistance.
import fs from 'fs'
import FormData from 'form-data'
import axios from 'axios'

const fileStream = fs.createReadStream('./file.zip');
const form = new FormData();
form.append('key', fileStream, 'file.zip');

const response = await axios.post(url, form, {
    headers: {
        ...form.getHeaders(),
    },
});

Using rejectUnauthorized with node-fetch in node.js

I currently use request to make HTTP requests in Node.js. I had at some point encountered an issue where I was getting errors that indicated UNABLE_TO_GET_ISSUER_CERT_LOCALLY. To get around that, I set rejectUnauthorized. My working code with request looks like this:
var url = 'someurl';
var options = {
    url: url,
    port: 443,
    // proxy: process.env.HTTPS_PROXY, -- no need to do this, as request honors env vars
    headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko',
        'Accept-Language': 'en-us',
        'Content-Language': 'en-us'
    },
    timeout: 0,
    encoding: null,
    rejectUnauthorized: false // added this to prevent the UNABLE_TO_GET_ISSUER_CERT_LOCALLY error
};

request(options, function (err, resp, body) {
    if (err) reject(err);
    else resolve(body.toString());
});
I thought I would try switching to the fetch API using async/await and am now trying to use node-fetch to do the same thing. However, when I do, I am back to the UNABLE_TO_GET_ISSUER_CERT_LOCALLY errors. I read that I needed to use a proxy agent and tried the proxy-agent module, but I am still not having any luck.
Based on the post https://github.com/TooTallNate/node-https-proxy-agent/issues/11, I thought the following would work:
var options = {
    headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko',
        'Accept-Language': 'en-us',
        'Content-Language': 'en-us'
    },
    timeout: 0,
    encoding: null
};

var proxyOptions = nodeurl.parse(process.env.HTTPS_PROXY);
proxyOptions.rejectUnauthorized = false;
options.agent = new ProxyAgent(proxyOptions);

const resp = await fetch('someurl', options);
return await resp.text();
but I still get the same error. So far, the only way I've been able to get around this with node-fetch is to set NODE_TLS_REJECT_UNAUTHORIZED=0 in my environment, which I don't really want to do. Can someone show me how to get rejectUnauthorized to work with node-fetch (presumably using an agent, but honestly I don't care how, as long as it can be specified as part of the request)?
This is how I got it to work using rejectUnauthorized and the Fetch API in a Node.js app.
Keep in mind that using rejectUnauthorized is dangerous, as it opens you up to potential security risks by circumventing certificate validation.
const fetch = require("node-fetch");
const https = require('https');

const httpsAgent = new https.Agent({
    rejectUnauthorized: false,
});

async function getData() {
    const resp = await fetch(
        "https://myexampleapi.com/endpoint",
        {
            agent: httpsAgent,
        },
    )
    const data = await resp.json()
    return data
}
Use proxy
You should know that the latest versions of node-https-proxy-agent have a problem and don't work with fetch; use version 3.x or lower and it will work. Alternatively, you can use the node-tunnel module (https://www.npmjs.com/package/tunnel), or the wrapper module proxy-http-agent (https://www.npmjs.com/package/proxy-http-agent), which is based on node-tunnel. It provides automatic detection of the proxy protocol (one method for all cases) plus more options, and both modules support http and https.
You can see the usage and a good example of proxy building and setup in this module and repo (check the tests):
https://www.npmjs.com/package/net-proxy
https://github.com/Glitnirian/node-net-proxy#readme
For example:
import { ProxyServer } from 'net-proxy';
import { getProxyHttpAgent } from 'proxy-http-agent';

// ...

// __________ setting up the proxy
const proxy = new ProxyServer({
    port: proxyPort
});

proxy.server.on('data', (data: any) => { // accessing the server instance
    console.log(data);
});

await proxy.awaitStartedListening(); // wait for the server to start

// After the server has started:

// ______________ making the call through the proxy to a server over http
let proxyUrl = `http://localhost:${proxyPort}`; // the proxy protocol is automatically detected
let agent = getProxyHttpAgent({
    proxy: proxyUrl,
    endServerProtocol: 'http:' // the end server protocol (http://localhost:${localApiServerPort} for example)
});

const response = await fetch(`http://localhost:${localApiServerPort}`, {
    method: 'GET',
    agent
});

// ___________________ making a call through the proxy to a server over https
agent = getProxyHttpAgent({
    proxy: proxyUrl, // proxy as a url string! We can use an object as well (as the tunnel module requires)
    rejectUnauthorized: false // <==== here it goes
});

const response2 = await fetch(`https://localhost:${localApiHttpsServerPort}`, {
    method: 'GET',
    agent
});
You can see more examples and details in the docs here:
https://www.npmjs.com/package/proxy-http-agent
You can also use node-tunnel directly; the package above is just a simple wrapper that makes it easier to use.
Add rejectUnauthorized
For those who don't already know: as per this thread
https://github.com/node-fetch/node-fetch/issues/15
we use https.Agent to pass the rejectUnauthorized parameter:
const agent = new https.Agent({
    key: fs.readFileSync(`${CERT_PATH}.key`),
    cert: fs.readFileSync(`${CERT_PATH}.crt`),
    rejectUnauthorized: false
})
A complete example
import https from "https";

const agent = new https.Agent({
    rejectUnauthorized: false
});

fetch(myUrl, { agent });
With fetch you can also use an environment variable, as follows:
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";
This way it is set globally rather than per call, which may be more appropriate if you use a constant proxy for all calls, as when sitting behind a company proxy.
Why
By default, node-fetch, like most HTTP request clients, enforces security and requires a valid SSL certificate when using https.
To disable this behavior, you need to disable that check somehow, and the mechanism differs between libraries.
For fetch, the above is how it's done. With the underlying https.request:
const https = require('https');

const options = {
    hostname: 'encrypted.google.com',
    port: 443,
    path: '/',
    method: 'GET',
    rejectUnauthorized: false /// <<<== here
};

const req = https.request(options, (res) => {
    console.log('statusCode:', res.statusCode);
    console.log('headers:', res.headers);
    res.on('data', (d) => {
        process.stdout.write(d);
    });
});

req.on('error', (e) => {
    console.error(e);
});

req.end();
Check this:
https://nodejs.org/api/https.html#https_https_request_url_options_callback
rejectUnauthorized is also part of the tls.connect options, which you can check here:
https://nodejs.org/api/tls.html#tls_tls_connect_options_callback
