Storing request body as global variable - javascript

I have a project that uses the request library in JavaScript run by Node. I am having trouble storing the body of a request in a global variable so I can access it in other functions. Is there a way to save the response in a global variable? Thanks.
var request = require('request');
var globalBody = "";
var options = {
    url: 'http://www.google.com/',
    headers: {
        'User-Agent': 'Mozilla/5.0 (iPhone; CPU iPhone OS 6_1_4 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10B350 Safari/8536.25'
    }
};
request(options, function (error, response, body) {
    if (error != null) {
        console.log('error:', error);
    }
    if (response.statusCode != 200) {
        console.log('statusCode:', response && response.statusCode);
    } else {
        globalBody = body;
    }
});
console.log(globalBody);
The last line, console.log(globalBody), prints "", but I want it to display the body of the request. Is there any way to do this?

use window.localStorage to store globally

console.log(globalBody) gets called before the request callback runs: globalBody does get updated, but only after you have already tried to print it. Try the following code; it will work.
var request = require('request');
var globalBody = "";
var options = {
    url: 'http://www.google.com/',
    headers: {
        'User-Agent': 'Mozilla/5.0 (iPhone; CPU iPhone OS 6_1_4 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10B350 Safari/8536.25'
    }
};
request(options, function (error, response, body) {
    if (error != null) {
        console.log('error:', error);
    }
    if (response.statusCode != 200) {
        console.log('statusCode:', response && response.statusCode);
    } else {
        globalBody = body;
        console.log(globalBody);
    }
});
You need to call any other functions that use the global variable after the asynchronous request has completed successfully, i.e. from inside the callback.
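For example, a minimal sketch (useBody here is a hypothetical stand-in for whatever function needs the response) that hands the body on to the next step only once it is available:
var request = require('request');
var globalBody = "";

// Hypothetical consumer of the response body.
function useBody(body) {
    console.log(body.length + ' characters received');
}

request({ url: 'http://www.google.com/' }, function (error, response, body) {
    if (error) {
        return console.log('error:', error);
    }
    if (response.statusCode != 200) {
        return console.log('statusCode:', response.statusCode);
    }
    globalBody = body;   // the global is populated only now...
    useBody(globalBody); // ...so call anything that depends on it from here
});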

Related

React Native fetch not getting the same content as Postman

I'm having a little problem with my request for getting the HTML from https://readnovelfull.com/beauty-and-the-beast-wolf-hubby-xoxo/chapter-1-i-would-not-be-responsible.html as an example.
I can get all the HTML from the other URLs, e.g. novel detail, latest updated, etc.,
but not when I'm getting the detail for the chapters.
I tested those URLs in Postman and also on https://codebeautify.org/source-code-viewer, and there is no problem getting the content of the chapter, which exists under the div #chr-content.
So I am a bit lost now; what am I doing wrong?
Here are my fetch calls, which work on other novel sites.
static async getHtml(
    url: string
): Promise<HTMLDivElement> {
    console.log(`Sending html request to ${url}`);
    var container = parse('<div>test</div>') as any;
    try {
        let headers = new Headers({
            Accept: '*/*',
            'User-Agent':
                'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.141 Safari/537.36'
        });
        var data = await fetch(url, {
            method: 'GET',
            headers: headers,
        });
        if (!data.ok) {
            const message = `An error has occurred: ${data.status}`;
            console.log(message);
        } else {
            var html = await data.text();
            console.log('Data is ok. proceed to parse it');
            container = parse('<div>' + html + '</div>') as any;
        }
    } catch (e) {
        console.log(e);
    }
    return container as HTMLDivElement;
}
I should mention that I am not getting any error whatsoever; it's just that the HTML I am getting is not the same as what Postman and the other sites are getting.
Update
OK, so I did some research on the site and this is what I came up with.
The site needs an X-CSRF-TOKEN, and I was able to extract and find these values:
const csrf = 'x09Q6KGqJOJJx2iHwNQUa_mYfG4neV9EOOMsUBKTItKfNjSc0thQzwf2HvCR7SQCqfIpC2ogPj18jG4dQPgVtQ==';
const id = 774791;
which I need to send in a request to https://readnovelfull.com/ajax/increase-chapter-views with the values above, and this sends back true/false.
Now I tried to include the CSRF token on my fetch call afterwards, but it's still the same as before: no data.
Any idea if I am still doing something wrong?
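For reference, here is roughly the kind of call I mean for that ajax step; the X-CSRF-TOKEN header and the chapterId field name are my guesses based on the values above, not something confirmed from the site:
const csrf = 'x09Q6KGqJOJJx2iHwNQUa_mYfG4neV9EOOMsUBKTItKfNjSc0thQzwf2HvCR7SQCqfIpC2ogPj18jG4dQPgVtQ==';
const id = 774791;

// Guessed header/field names; the endpoint is supposed to answer true/false.
fetch('https://readnovelfull.com/ajax/increase-chapter-views', {
    method: 'POST',
    headers: {
        'X-CSRF-TOKEN': csrf,
        'Content-Type': 'application/x-www-form-urlencoded'
    },
    body: 'chapterId=' + id
})
    .then(res => res.text())
    .then(text => console.log('increase-chapter-views answered:', text));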
Looks like you have an issue with CORS. To make sure, just try sending the request through a CORS proxy. One way you can quickly do that is to add a prefix to the URL:
https://cors-anywhere.herokuapp.com/https://readnovelfull.com/beauty-and-the-beast-wolf-hubby-xoxo/chapter-1-i-would-not-be-responsible.html
NOTE: Using this CORS proxy in production is not recommended, because it's not secure.
If you receive data after that, it means you are facing CORS, and you need to figure out how to solve it in your specific case.
Reproducible example:
const parse = (str) => str;
const getHtml = async (url) => {
    console.log(`Sending html request to ${url}`);
    var container = parse('<div>No content =(</div>');
    try {
        let headers = new Headers({
            Accept: '*/*',
            'User-Agent':
                'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.141 Safari/537.36'
        });
        var data = await fetch(url, {
            method: 'GET',
            headers: headers,
        });
        if (!data.ok) {
            const message = `An error has occurred: ${data.status}`;
            console.log(message);
        } else {
            var html = await data.text();
            console.log('Data is ok. proceed to parse it');
            container = parse('<div>' + html + '</div>');
        }
    } catch (e) {
        console.log(e);
    }
    return container;
};
getHtml('https://cors-anywhere.herokuapp.com/https://readnovelfull.com/beauty-and-the-beast-wolf-hubby-xoxo/chapter-1-i-would-not-be-responsible.html').then(htmlContent => document.querySelector('div').innerHTML = htmlContent);
<div>loading...</div>
If it doesn't help, please provide a reproducible RN example, but I believe there is no difference between RN and web environments in that case.

How to send GET request without downloading response content using node-requests?

I'm currently learning Node and I'm looking for an HTTP library that would allow me to send GET requests without downloading the server's response content (body).
I need to send a very large number of HTTP requests every minute, but I do not need to read their content (and I also want to save bandwidth). I can't use HEAD for this purpose.
Is there any way to avoid downloading the response body using node-request, or is there perhaps another library that could be used?
My sample code using node-request:
const request = require('request');

const options = {
    url: "https://google.com",
    headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.100 Safari/537.36'
    }
};

// How to avoid downloading the whole response?
function callback(err, response, body) {
    console.log(response.request.uri.host + ' - ' + response.statusCode);
}

request(options, callback);
By the HTTP standard, GET fetches the resource content; you cannot avoid downloading (receiving) the response, but you can ignore it, which is basically what you are doing.
request(options, (err, response, body) => {
    // just return from here, no need to process anything
});
EDIT1:
To read only some bytes of the response, you can use http.get and consume the data via the 'data' event. From the docs:
http.get('http://nodejs.org/dist/index.json', (res) => {
    res.setEncoding('utf8');
    let rawData = '';
    res.on('data', (chunk) => { rawData += chunk; });
    res.on('end', () => {
        // this is when the response will end
    });
}).on('error', (e) => {
    console.error(`Got error: ${e.message}`);
});
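If you want to go further and actually avoid downloading the body, one option (a sketch, not from the original docs example) is to destroy the response stream as soon as the headers arrive; the rest of the body is then never transferred:
const http = require('http');

http.get('http://nodejs.org/dist/index.json', (res) => {
    console.log('statusCode:', res.statusCode); // headers are already available here
    res.destroy(); // stop reading: the remaining body is not downloaded
}).on('error', (e) => {
    console.error(`Got error: ${e.message}`);
});
Note that destroying the response also closes the underlying socket, so you lose keep-alive reuse for that connection.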

Socket hangup error using Request or Needle in NodeJS

I am getting a socket hang up error when attempting to connect to an external web site; an example of a failing site is given below. The code works for other sites, and I can successfully access the test site if I rewrite the code in Python. I am unable to make changes to the external site, so I am looking to enhance the Node.js script to handle the connection. Any help appreciated.
let request = require("request-promise");
let needle = require("needle");

// Check if we can connect to the site
let arrTestCases = ["https://compassandstars.com"];
for (x in arrTestCases) {
    chkcon(arrTestCases[x]);
}

async function chkcon(req) {
    let resultJSON = {};
    let data;
    try {
        console.log("Validating blog " + req);
        // Attempt using request-promise, fails with the same error
        // let getURL = await request({
        //     url: req,
        //     headers: {
        //         'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36'
        //     }
        //     // headers: {
        //     //     'User-Agent': 'ouruseragent',
        //     //     'Connection': 'keep-alive'
        //     // }
        // })
        needle('get', req)
            .then(function (response) {
                resultJSON = { ValidURL: true };
                console.log("URL Validated successfully", response.body);
            })
            .catch(function (err) {
                console.log(err);
            });
    } catch (e) {
        console.log("Bad Blog URL ", e.message);
    } finally {
        console.log("Result: " + JSON.stringify(resultJSON), req);
    }
}
The error response:
Result: {} https://compassandstars.com
{ Error: socket hang up
    at TLSSocket.onConnectEnd (_tls_wrap.js:1073:19)
    at Object.onceWrapper (events.js:219:13)
    at TLSSocket.emit (events.js:132:15)
    at endReadableNT (_stream_readable.js:1101:12)
    at process._tickCallback (internal/process/next_tick.js:152:19)
  code: 'ECONNRESET',
  path: null,
  host: 'compassandstars.com',
  port: 443,
  localAddress: undefined }
I can recreate the issue using Node.js 8.10 on AWS Lambda and locally on my machine using Node.js 9.6.1.
My research indicates it may be an error finding a compatible cipher to make the SSL/TLS connection, but I'm unable to find out how to force request to change the request it makes to handle this.
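For reference, request does accept TLS options for the underlying https.Agent via agentOptions; this is a sketch of the kind of thing I mean, where the protocol and cipher values are only examples, not a confirmed fix for this site:
let request = require("request-promise");

request({
    url: "https://compassandstars.com",
    headers: {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36'
    },
    agentOptions: {
        // Example TLS settings only; the site's actual requirements are unknown.
        secureProtocol: 'TLSv1_2_method',
        ciphers: 'ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384'
    }
}).then(function (body) {
    console.log("Connected, received " + body.length + " characters");
}).catch(function (err) {
    console.log("Still failing:", err.message);
});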

How to get host name in http response in node.js?

I am using the http module of Node.js to make requests. I have a bunch of URLs in a database. I am fetching these URLs from the database and making requests in a loop. But when a response comes back, I want to know which host it belongs to, because I want to update something in the database based on that response. Since I can't tell which site a given response is for, I am unable to update the record for that site.
Code is something like this:
for (site = 0; site < no_of_sites; site++) {
    options = {
        hostname: sites[site].name,
        port: 80,
        method: 'GET',
        headers: {
            'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; rv:11.0) Gecko/20100101 Firefox/11.0'
        }
    };
    var req = http.request(options, function (res) {
        console.log('HEADERS: ' + JSON.stringify(res.headers));
        if (res.statusCode == 200) {
            // Update record;
        }
    });
}
We can get the host from this object:
console.log(this._header.match(/Host\:(.*)/g));
Option one: use res.req
var req = http.request(options, function (res) {
    console.log(res.req._headers.host);
});
Option two: use a closure
for (site = 0; site < no_of_sites; site++) {
    (function () {
        var options = {
            // ...
        };
        var req = http.request(options, function (res) {
            // options available here
            console.log(options);
        });
    }());
}
Option three:
It seems this is the same as res.req in the http.request() callback, but I'm not completely sure.
The answer is
console.log(res.socket._httpMessage._headers.host);
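As a side note beyond the options above: in modern Node.js you can avoid the problem entirely by declaring the loop variable with let (or capturing the hostname in a const), so each callback closes over its own value; sites and no_of_sites below are the same variables as in the question:
const http = require('http');

for (let site = 0; site < no_of_sites; site++) {
    const hostname = sites[site].name; // captured per iteration thanks to let/const
    const req = http.request({ hostname: hostname, port: 80, method: 'GET' }, function (res) {
        console.log(hostname + ' -> ' + res.statusCode);
        if (res.statusCode == 200) {
            // update the record for this hostname
        }
    });
    req.on('error', function (e) {
        console.log(hostname + ' failed: ' + e.message);
    });
    req.end();
}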

Node.js: Remotely Submitting Forms

I'm currently working on a sort of web proxy for Node.js, but I am having trouble submitting forms: on most sites I am able to submit a form successfully, but on some other sites I am not so fortunate. I can't pinpoint whether I am doing anything wrong.
Is there a better way of doing this?
Also, how would I be able to handle multipart forms using the Express.js bodyParser?
At the moment, this is what I have in the way of form processing:
function proxy(req, res, request) {
    var sess = req.session;
    var onUrl_Parse = function (url) {
        var Uri = new URI.URI(url); // Parses the incoming url
        var options = {
            uri: url,
            method: req.method
        };
        options.headers = { "User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0", "Cookie": req.session.cook };
        if (req.body) { // If x-www-form-urlencoded is posted.
            var options = {
                uri: url,
                method: req.method,
                body: req.rawBody
            };
            options.headers = { "User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0", "Cookie": req.session.cook, "Content-Type": "application/x-www-form-urlencoded" };
        }
        onRequestOptions(options, url);
    },
    onRequestOptions = function (options, url) {
        request(options, function (error, response, body) {
            if (!error) {
                if (response.headers['set-cookie'])
                    req.session.cook = response.headers['set-cookie'];
                Proxy_Parser.Parser(body, url, async, onParse); // Parses returned html, returns displayable content
            }
        });
    },
    onParse = function (HTML_BODY) {
        if (HTML_BODY == "")
            res.end();
        res.write(HTML_BODY);
        res.end();
        console.log("DONEEEEE");
    };
    Url_Parser.Url(req, URI, onUrl_Parse);
}
I am not sure exactly what you are trying to accomplish, but https://github.com/felixge/node-formidable is recommended anyway for handling multipart form data.
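A minimal sketch of the classic formidable IncomingForm API (the route and response handling here are illustrative, not taken from the question):
var http = require('http');
var formidable = require('formidable');

http.createServer(function (req, res) {
    if (req.method === 'POST') {
        // Parse the multipart form instead of relying on the Express bodyParser.
        var form = new formidable.IncomingForm();
        form.parse(req, function (err, fields, files) {
            if (err) {
                res.writeHead(500);
                return res.end('parse error');
            }
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify({ fields: fields, files: Object.keys(files) }));
        });
        return;
    }
    res.end('POST a multipart form to this endpoint');
}).listen(8080);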
I would start with something like node-http-proxy. All the hard work is done for you and you can just define the routes you want to proxy and put in some handlers for the custom response info.
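For example, a minimal node-http-proxy sketch (the target URL and port are placeholders, not from the question):
var http = require('http');
var httpProxy = require('http-proxy');

// Create one proxy instance and forward every incoming request to the target.
var proxy = httpProxy.createProxyServer({});

http.createServer(function (req, res) {
    // Custom per-request handling (headers, cookies, routing) can go here.
    proxy.web(req, res, { target: 'http://example.com' }, function (err) {
        res.writeHead(502);
        res.end('proxy error: ' + err.message);
    });
}).listen(8000);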
