How to pipe an HTTPS GET response into an object - JavaScript

I refactored my code so that instead of requiring an extra node_module, I could just use plain HTTPS GET requests. The problem is that when I try to pipe /releases/ (which is basically a raw JSON file) to disk and then require it back, issues occur, such as SyntaxError: Unexpected end of JSON input: when I console.log() the so-called JSON array, it isn't terminated with ] or }. So I tried to pipe the response into an array instead, but now I get the error: dest.on is not a function.
Code:
https.get({ hostname: `api.github.com`, path: `/repos/${username}/${reponame}/releases`, headers: { 'User-Agent': 'a user agent' } }, async (response) => {
  var file = new Array()
  response.pipe(file) // issue occurs at response.pipe ???
  response.on('end', async function () {
    var releases = JSON.parse(fs.readFileSync('./releases.json', 'utf8'))
    console.log(releases)
  })
})
The JSON file that I access from Github looks like: https://api.github.com/repos/davidmerfield/randomColor/releases (random repository)
But, my file (releases.json) looks like this
Edit: I did extensive testing. I used the same JSON file my pkg outputted, read it with fs and so on, and everything seems fine. So the issue is most likely with https / response

I found out how to read the HTTPS response into an object instead of piping it into a file, thanks to this post. I accumulated the chunks into a string and parsed that string into a JSON array.
https.get({ hostname: `api.github.com`, path: `/repos/${username}/${reponame}/releases`, headers: { 'User-Agent': 'agent' } }, async response => {
  var str = ''
  response.on('data', (data) => {
    str += data
  })
  response.on('end', async function () {
    var releases = JSON.parse(str)
    // and so on...
  })
})

You can require JSON files. So, if you need this file, you can do something like:
const releases = require('./releases.json');
You do not need to read it with fs, unless you really want to.

TypeError: dest.on is not a function
This error is thrown when you try to pipe to an object that is not a Writable stream. Check here.
In this case, an Array is not a Writable stream. You can create a writable stream using fs.createWriteStream() and pipe the response to it.
https.get(
  { hostname: `api.github.com`, path: `/repos/${username}/${reponame}/releases`, headers: { "User-Agent": "a user agent" } },
  async response => {
    const writableStreamFile = fs.createWriteStream("./releases.json");
    response.pipe(writableStreamFile);
    // Wait for the write stream's "finish" event rather than the response's
    // "end" event, so the file is fully flushed before reading it back.
    writableStreamFile.on("finish", function () {
      var releases = JSON.parse(fs.readFileSync("./releases.json", "utf8"));
      console.log(releases);
    });
  }
);

Related

Force plain text encoding with node https get request [duplicate]

This question already has answers here:
Encoding issue with requesting JSON from StackOverflow API
Problem
Related to Get UTF-8 html content with Node's http.get - but that answer isn't working for me.
I'm trying to call the Stack Overflow API questions endpoint:
https://api.stackexchange.com/2.3/questions?site=stackoverflow&filter=total
Which should return the following JSON response:
{"total":21951385}
Example Code
I'm using the https Node module to submit a GET request like this:
const getRequest = (url: string) => new Promise((resolve, reject) => {
const options: RequestOptions = {
headers: {
'Accept': 'text/*',
'Accept-Encoding':'identity',
'Accept-Charset' : 'utf8',
}
}
const req = get(url, options, (res) => {
res.setEncoding('utf8');
let responseBody = '';
res.on('data', (chunk) => responseBody += chunk);
res.on('end', () => resolve(responseBody));
});
req.on('error', (err) => reject(err));
req.end();
})
And then invoking it like this:
const questionsUrl = 'https://api.stackexchange.com/2.3/questions?&site=stackoverflow&filter=total'
const resp = await getRequest(questionsUrl)
console.log(resp)
However, I get the response:
▼�
�V*�/I�Q�22�454�0�♣��♥‼���↕
What I've Tried
I've tried doing several variations of the following:
- I'm calling setEncoding('utf8') on the stream
- I've set the Accept header to text/*, which "provides a text MIME type, but without a subtype"
- I've set the Accept-Encoding header to identity, which "indicates the identity function (that is, without modification or compression)"
This code also works just fine with pretty much any other API server, for example using the following url:
https://jsonplaceholder.typicode.com/todos/1
But the Stack Overflow API works everywhere else I've tried it, so there must be a way to instruct Node how to make this request.
My suggestion is to use an http library that supports both promises and gzip built in. My current favorite is got(). http.get() is like the least featured http request library anywhere. You really don't have to write all this yourself. Here's what your entire code would look like with the got() library:
const got = require('got');
function getRequest(url) {
  return got(url).json();
}
This library handles all these things you need for you automatically:
- Promises
- JSON conversion
- Gzip decoding
- 2xx status detection (other status codes such as 404 become a promise rejection, which your code does not handle)
And, it has many, many other useful features for other general use. The days of coding manually with http.get() should be long over. No need to rewrite code that has already been written and well-tested for you.
FYI, there's a list of very capable http libraries here: https://github.com/request/request/issues/3143. You can pick the one that has the API you like the best.
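If adding a dependency isn't an option: assuming Node 18+, the global fetch() also decompresses gzip responses transparently and parses JSON, so the same one-liner works with no library at all. A sketch (the non-2xx check mimics got's rejection behavior):

```javascript
// Assumes Node 18+ where fetch is a global; gzip decoding is automatic.
async function getJson(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`); // reject non-2xx, like got does
  return res.json();
}

// Usage (network call, shown for illustration):
// const data = await getJson('https://api.stackexchange.com/2.3/questions?site=stackoverflow&filter=total');
// console.log(data.total);
```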
Response Header - Content-Encoding - Gzip
As jfriend00 pointed out, it looks like the server isn't respecting the Accept-Encoding value being passed and is returning a gzipped response nonetheless.
Unzipping Response
According to the answer on How do I ungzip (decompress) a NodeJS request's module gzip response body?, you can unzip like this:
import { get } from 'https'
import { createGunzip } from 'zlib'
const getRequest = (url: string) => new Promise((resolve, reject) => {
  const req = get(url, (res) => {
    const buffer: string[] = [];
    if (!res.headers['content-encoding']?.includes('gzip')) {
      console.log('utf8')
      res.on('data', (chunk) => buffer.push(chunk.toString()));
      res.on('end', () => resolve(buffer.join("")))
    } else {
      console.log('gzip')
      const gunzip = createGunzip();
      res.pipe(gunzip);
      gunzip.on('data', (data) => buffer.push(data.toString()))
      gunzip.on("end", () => resolve(buffer.join("")))
      gunzip.on("error", (e) => reject(e))
    }
  });
  req.on('error', (err) => reject(err));
  req.end();
})

Append a query param to a GET request?

I'm trying to make a simple API that calls another API that will return some information. The thing is, in order to connect to the second API, I need to attach query parameters to it.
So what I've tried to do so far is to use an axios.get in order to fetch the API. If I didn't need to add queries on top of that, then this would be really simple but I'm having a really hard time trying to figure out how to attach queries on top of my request.
I've created an object that holds the query I pulled from my end, and then I used JSON.stringify to turn that object into JSON. Then, from my understanding of Axios, you can attach params by passing a second argument after the URL.
On line 6, I wasn't sure if variables would carry over, but I definitely can't have the tag variable turn into the literal string "tag", so that's why I left it with the curly brackets and backticks. If that's wrong, please correct me on how to do it properly.
The var tag is the name of the query parameter that I extracted from my end. That tag is what needs to be passed along in the Axios GET request.
app.get('/api/posts', async (req, res) => {
  try {
    const url = 'https://myurl.com/blah/blah';
    let tag = req.query.tag;
    objParam = {
      tag: `${tag}`
    };
    jsonParam = JSON.stringify(objParam);
    let response = await axios.get(url, jsonParam);
    res.json(response);
  } catch (err) {
    res.send(err);
  }
});
response is SUPPOSED to equal a JSON file that I'm making the request to.
What I'm actually getting is a Error 400, which makes me think that somehow, the URL that Axios is getting along with the params aren't lining up. (Is there a way to check where the Axios request is going to? If I could see what the actual url that axios is firing off too, then it could help me fix my problem)
Ideally, this is the flow that I want to achieve. Something is wrong with it but I'm not quite sure where the error is.
-> I make a request to MY api, using the query "science" for example
-> Through my API, Axios makes a GET request to:
https://myurl.com/blah/blah?tag=science
-> I get a response with the JSON from the GET request
-> my API displays the JSON file
After looking at Axios' README, it looks like the second argument needs the key params. You can try:
app.get('/api/posts', async (req, res, next) => {
  try {
    const url = 'https://myurl.com/blah/blah';
    const options = {
      params: { tag: req.query.tag }
    };
    const response = await axios.get(url, options);
    res.json(response.data);
  } catch (err) {
    // Be sure to call next() if you aren't handling the error.
    next(err);
  }
});
If the above method does not work, you can look into query-string.
const querystring = require('query-string');

app.get('/api/posts', async (req, res, next) => {
  try {
    const url = 'https://myurl.com/blah/blah?' +
      querystring.stringify({ tag: req.query.tag }); // req.query, not req.params
    const response = await axios.get(url);
    res.json(response.data);
  } catch (err) {
    next(err);
  }
});
Responding to your comment, yes, you can combine multiple Axios responses. For example, if I am expecting an object literal to be my response.data, I can do:
const response1 = await axios.get(url1)
const response2 = await axios.get(url2)
const response3 = await axios.get(url3)
const combined = [
  { ...response1.data },
  { ...response2.data },
  { ...response3.data }
]
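The three awaits above run sequentially; Promise.all issues the requests concurrently. A small sketch of that pattern, where getJson stands in for any url-to-data function (e.g. (u) => axios.get(u).then((r) => r.data)):

```javascript
// Fetch several URLs concurrently and collect their data in order.
// getJson is a caller-supplied placeholder, not part of the original answer.
async function fetchAll(urls, getJson) {
  const results = await Promise.all(urls.map(getJson));
  return results; // same order as urls, regardless of completion order
}
```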

Return XML in Firebase Cloud Function

I am trying to set up a cloud function that returns XML. I am able to create and log the XML, but it crashes with the following error when I try to return it.
TypeError: Converting circular structure to JSON
at Object.stringify (native)
at stringify (/var/tmp/worker/node_modules/express/lib/response.js:1119:12)
at ServerResponse.json (/var/tmp/worker/node_modules/express/lib/response.js:260:14)
at ServerResponse.send (/var/tmp/worker/node_modules/express/lib/response.js:158:21)
at cors (/user_code/index.js:663:21)
at cors (/user_code/node_modules/cors/lib/index.js:188:7)
at /user_code/node_modules/cors/lib/index.js:224:17
at originCallback (/user_code/node_modules/cors/lib/index.js:214:15)
at /user_code/node_modules/cors/lib/index.js:219:13
at optionsCallback (/user_code/node_modules/cors/lib/index.js:199:9)
My Function
exports.sendXMLResponeSample = functions.https.onRequest((request, response) => {
  cors(request, response, () => {
    // import xmlbuilder
    const builder = require('xmlbuilder');
    // create my object to convert to xml
    var myFeedObject = {
      "somekey": "some value",
      "age": 59,
      "eye color": "brown"
    }
    // convert myFeedObject to xml
    const feed = builder.create(myFeedObject, { encoding: 'utf-8' })
    console.log("feed.end({ pretty: true }) = (below)");
    console.log(feed.end({ pretty: true }));
    // return xml
    return response.send(200, feed) // <<< error occurs here
  })
})
I believe the error suggests that the Firebase Cloud Function expects me to return a JSON object in the response rather than XML, but I am unsure how to tell it to expect XML in the response.
Does anyone know how to return XML from a Firebase Cloud Function?
EDIT: The object is converted to XML without any issue. The error occurs only when the XML is returned.
You can use the .contentType(type: string) on the response object that the cloud function returns to the caller.
Like so:
res.status(200)
.contentType('text/xml; charset=utf8')
.send(xmlString);
You may install the object-to-xml library, and then set the Content-Type response header to text/xml, something like res.header('Content-Type', 'text/xml').
This is what I'm doing.
const xmlString =
  '<?xml version="1.0" encoding="UTF-8"?><Response><Message><Body>This is the response</Body></Message></Response>';
res
  .set("Content-Type", "text/xml; charset=utf8")
  .status(200)
  .send(xmlString);
Works for me. I'm sure there is a better way to convert your XML to a string.
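To tie the answers back to the original error: the circular-structure exception comes from passing the xmlbuilder object itself to response.send(), which makes Express try to JSON.stringify it. Serializing first avoids it; feed.end({ pretty: true }) already returns the plain XML string. A dependency-free sketch of the idea with a hypothetical minimal serializer (illustration only; keys with spaces, like "eye color", would need escaping in real use):

```javascript
// Hypothetical minimal object-to-XML serializer -- in the real function,
// feed.end({ pretty: true }) already produces this string via xmlbuilder.
function objectToXml(obj, root = 'Response') {
  const inner = Object.entries(obj)
    .map(([key, value]) => `<${key}>${value}</${key}>`)
    .join('');
  return `<?xml version="1.0" encoding="UTF-8"?><${root}>${inner}</${root}>`;
}

// In the handler, send the string with an XML content type:
// response.status(200).contentType('text/xml; charset=utf8').send(objectToXml(myFeedObject));
```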

NodeJS - Request file and zip it

I am currently in the process of creating a REST API for my personal website. I'd like to include some downloads and I would like to offer the possibility of selecting multiple ones and download those as a zip file.
My first approach was pretty easy: an array of URLs, a request for each of them, zip the results, send them to the user, delete the files. However, I think this approach is too dirty, considering there are things like streams around, which seem quite fitting for this task.
Now, I tried around and am currently struggling with the basic concept of working with streams and events throughout different scopes.
The following worked:
const r = request(url, options);
r.on('response', function (res) {
  res.pipe(fs.createWriteStream('./file.jpg'));
});
From my understanding, r is an incoming stream in this scenario; I listen for the response event on it, and as soon as it occurs, I pipe it to a write stream that writes to the file system.
My first step was to refactor this so it fits my case more but I already failed here:
async function downloadFile(url) {
  return request({ method: 'GET', uri: url });
}
Now I wanted to use a function which calls "downloadFile()" with different urls and save all those files to the disk using createWriteStream() again:
const urls = ['https://download1', 'https://download2', 'https://download3'];
urls.forEach(element => {
  downloadFile(element).then(data => {
    data.pipe(fs.createWriteStream('file.jpg'));
  });
});
Using the debugger I found out that the "response" event is nonexistent on the data object -- maybe that's already the issue? Moreover, I figured out that data.body contains the bytes of my downloaded document (a PDF in this case), so I wonder if I could just stream this to some other place?
After reading some Stack Overflow threads I found the following module: archiver
Reading this thread: Dynamically create and stream zip to client
#dankohn suggested an approach like this:
archive
.append(fs.createReadStream(file1), { name: 'file1.txt' })
.append(fs.createReadStream(file2), { name: 'file2.txt' });
Making me assume I need to be capable of extracting a stream from my data object to proceed.
Am I on the wrong track here or am I getting something fundamentally wrong?
Using archiver seems to be a valid approach, however it would be advisable to use streams when feeding large data from the web into the zip archive. Otherwise, the whole archive data would need to be held in memory.
archiver does not support adding files from streams, but zip-stream does. For reading a stream from the web, request comes in handy.
Example
// npm install -s express zip-stream request
const request = require('request');
const ZipStream = require('zip-stream');
const express = require('express');
const app = express();
app.get('/archive.zip', (req, res) => {
  var zip = new ZipStream()
  zip.pipe(res);
  var stream = request('https://loremflickr.com/640/480')
  zip.entry(stream, { name: 'picture.jpg' }, err => {
    if (err)
      throw err;
  })
  zip.finalize()
});
app.listen(3000)
Update: Example for using multiple files
Adding an example which processes the next file in the callback function of zip.entry() recursively.
app.get('/archive.zip', (req, res) => {
  var zip = new ZipStream()
  zip.pipe(res);
  var queue = [
    { name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'two.jpg', url: 'https://loremflickr.com/640/480' },
    { name: 'three.jpg', url: 'https://loremflickr.com/640/480' }
  ]
  function addNextFile() {
    var elem = queue.shift()
    var stream = request(elem.url)
    zip.entry(stream, { name: elem.name }, err => {
      if (err)
        throw err;
      if (queue.length > 0)
        addNextFile()
      else
        zip.finalize()
    })
  }
  addNextFile()
})
Using Async/Await
You can encapsulate it into a promise to use async/await like:
await new Promise((resolve, reject) => {
  zip.entry(stream, { name: elem.name }, err => {
    if (err) reject(err)
    else resolve()
  })
})
zip.finalize()
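Putting the fragment above together, the whole queue can be processed with async/await instead of recursion. A sketch under stated assumptions: zipEntry promisifies zip.entry(), and openStream stands in for request(url), so the focus here is the control flow rather than the download itself:

```javascript
// Promisify zip-stream's callback-style entry().
function zipEntry(zip, stream, opts) {
  return new Promise((resolve, reject) => {
    zip.entry(stream, opts, (err) => (err ? reject(err) : resolve()));
  });
}

// Add every queued file, one at a time (zip-stream requires serial entries),
// then finalize the archive.
async function addAll(zip, queue, openStream) {
  for (const elem of queue) {
    await zipEntry(zip, openStream(elem.url), { name: elem.name });
  }
  zip.finalize();
}
```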

Node.Js Slack App Https request 'Uncaught Exception: Converting circular structure to JSON'

I am trying to create a Slack app, using stdlib, that can take a simple text argument and then use that information to make an HTTPS request to a URL and get a JSON response back. However, when I try to make this request and output the string to Slack as a text response, I get the following error:
Critical Error: TypeError: Converting circular structure to JSON
at JSON.stringify (<anonymous>)
at Domain.criticalExecution.run ([eval]:86:45)
at Domain.run (domain.js:242:14)
at callback ([eval]:66:23)
at func.index.apply.params.concat ([eval]:199:51)
at module.exports (/var/task/functions/commands/price.js:23:3)
at Domain.functionExecution.run ([eval]:197:22)
at Domain.run (domain.js:242:14)
at process.nextTick ([eval]:196:27)
at _combinedTickCallback (internal/process/next_tick.js:131:7)
Now, I can't see anything in the response JSON that would make it cyclical; it is just a text response and I don't add to it, so this confuses me.
Below is the source code:
module.exports = (user, channel, text = '', command = {}, botToken = null, callback) => {
  callback(null, {
    response_type: 'in_channel',
    text: getString()
  });
};

function getString() {
  return getNasaData('DEMO_KEY');
}

function getNasaData(key = 'DEMO_KEY', callback) {
  return https.get({
    host: 'api.nasa.gov',
    path: '/planetary/apod?api_key=${key}'
  }, (response) => {
    let body = '';
    response.on('data', (data) => {
      body += data;
    });
    response.on('end', () => {
      let parsed = JSON.parse(body);
      callback({
        copyright: parsed.copyright,
        date: parsed.date
      });
    });
  });
};
I've looked around for solutions and it's unclear what would cause this exception other than a cyclical reference. Perhaps it has to do with the callbacks I'm using in the two functions?
Thank you for any help.
