Node JS HTTP Proxy hanging up - javascript

I have an http-proxy to proxy any website and inject a custom JS file before serving the HTML back to the client. Whenever I try to access the proxied website, it hangs up, or the browser seems to load indefinitely. But when I check the HTML source, I can see that I successfully managed to inject my custom JavaScript file. Here is the code:
const cheerio = require('cheerio');
const http = require('http');
const httpProxy = require('http-proxy');
const { ungzip } = require('node-gzip');
_initProxy(host: string) {
  let proxy = httpProxy.createProxyServer({});
  let option = {
    target: host,
    selfHandleResponse: true
  };

  proxy.on('proxyRes', function (proxyRes, req, res) {
    let body = [];
    proxyRes.on('data', function (chunk) {
      body.push(chunk);
    });
    proxyRes.on('end', async function () {
      let buffer = Buffer.concat(body);
      if (proxyRes.headers['content-encoding'] === 'gzip') {
        try {
          let $ = null;
          const decompressed = await ungzip(buffer);
          const scriptTag = '<script src="my-customjs.js"></script>';
          $ = await cheerio.load(decompressed.toString());
          await $('body').append(scriptTag);
          res.end($.html());
        } catch (e) {
          console.log(e);
        }
      }
    });
  });

  let server = http.createServer(function (req, res) {
    proxy.web(req, res, option, function (e) {
      console.log(e);
    });
  });

  console.log("listening on port 5051");
  server.listen(5051);
}
Can someone please tell me if I am doing anything wrong? It looks like node-http-proxy is dying a lot, and I can't rely on it much since the proxy can work sometimes and die on the next run, depending on how many times I have run the server.

Your code looked fine, so I was curious and tried it.
Although you do log a few errors, you don't handle several cases:
The server returns a response with no body (cheerio will generate an empty HTML body when this happens)
The server returns a response that is not gzipped (your code never calls res.end() in that branch, so the request hangs and the response is silently discarded)
I made a few modifications to your code.
Change initial options
let proxy = httpProxy.createProxyServer({
  secure: false,
  changeOrigin: true
});
secure: false stops the proxy from verifying TLS certificates.
changeOrigin: true sends the correct Host header to the target.
Remove the if statement and replace it with a ternary
const isCompressed = proxyRes.headers['content-encoding'] === 'gzip';
const decompressed = isCompressed ? await ungzip(buffer) : buffer;
You can also remove the two awaits on the cheerio calls; cheerio is not async and doesn't return an awaitable.
Final code
Here's the final code, which works. You mentioned that "it looks like node-http-proxy is dying a lot [...] depending on how many times I ran the server." I experienced no such stability issues, so if that is happening, your problem may lie elsewhere (bad RAM?).
const cheerio = require('cheerio');
const http = require('http');
const httpProxy = require('http-proxy');
const { ungzip } = require('node-gzip');

const host = 'https://github.com';

let proxy = httpProxy.createProxyServer({
  secure: false,
  changeOrigin: true
});

let option = {
  target: host,
  selfHandleResponse: true
};

proxy.on('proxyRes', function (proxyRes, req, res) {
  console.log(`Proxy response with status code: ${proxyRes.statusCode} to url ${req.url}`);
  if (proxyRes.statusCode == 301) {
    throw new Error('You should probably do something here, I think there may be an httpProxy option to handle redirects');
  }
  let body = [];
  proxyRes.on('data', function (chunk) {
    body.push(chunk);
  });
  proxyRes.on('end', async function () {
    let buffer = Buffer.concat(body);
    try {
      let $ = null;
      const isCompressed = proxyRes.headers['content-encoding'] === 'gzip';
      const decompressed = isCompressed ? await ungzip(buffer) : buffer;
      const scriptTag = '<script src="my-customjs.js"></script>';
      $ = cheerio.load(decompressed.toString());
      $('body').append(scriptTag);
      res.end($.html());
    } catch (e) {
      console.log(e);
    }
  });
});

let server = http.createServer(function (req, res) {
  proxy.web(req, res, option, function (e) {
    console.log(e);
  });
});

console.log("listening on port 5051");
server.listen(5051);
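For the 301 branch above, http-proxy does have a followRedirects option, but another approach (a sketch I have not tested, not something the proxy does for you) is to relay the redirect back to the browser in place of the throw, near the top of the proxyRes handler:

// forward the upstream redirect to the client instead of throwing;
// rewriting proxyRes.headers.location to stay on the proxy is left out here
if (proxyRes.statusCode >= 301 && proxyRes.statusCode <= 308 && proxyRes.headers.location) {
  res.writeHead(proxyRes.statusCode, { location: proxyRes.headers.location });
  res.end();
  return;
}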

I ended up writing a small Python server using CherryPy and proxied the web app with mitmproxy. Everything is now working smoothly. Maybe I was doing it wrong with node-http-proxy, but I also became sceptical about using it in a production environment.

Related

Why is my Remote Server not appropriately routing?

I am having trouble getting my VPS server set up appropriately. I copied all the files recursively from my local server to the remote server, but post & get requests to the server are not functioning appropriately (404 not found error). The routing works perfectly on my localhost server, but unfortunately it doesn't on the remote.
Server Code:
// getting datastores
const datastore = require('nedb');
const customerRecordsdb = new datastore("CustomerRecords.db");
customerRecordsdb.loadDatabase();

// importing express
const port = 3000;
const express = require('express');
const application = express();
const path = require('path')
const newPath = path.join(__dirname + '../../..')

application.listen(port, () => console.log("Listening at " + port));
application.use(express.json( {limit: '1mb'} ));

application.get('/RetrieveCustomerInformation', (request, response) => {
  console.log("SUCCESS!");
  customerRecordsdb.find({}, (error, data) => {
    if (error) {
      response.end();
      return;
    }
    response.json(data);
  })
})
Client Code:
// loading data
let numberRecords = 0;

async function loadClientRecords() {
  const response = await fetch('/RetrieveCustomerInformation');
  const data = await response.json();
  for (let i = 0; i < data.length; i++) {
    numberRecords++;
    insertNewRecord(data[i]);
  }
}

loadClientRecords()
  .then(response => {
    // was successful
  })
  .catch(error => {
    //console.log(error);
  })
VPS page output (https://i.stack.imgur.com/ghoIa.png): the server is not outputting anything.
Localhost page output (https://i.stack.imgur.com/I6mqE.png): the server outputs "SUCCESS!" on every refresh.
As previously mentioned, I simply copied the directory over to the remote server.
Any help to what the problem could be will be greatly appreciated!

Firefox can’t establish a connection to the server at wss://localhost:8000/

I am using Node.js to run the server; there is no log file.
This is my server.js
const https = require('https');
const fs = require('fs');
const ws = require('ws');

const options = {
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem')
};

const wss = new ws.Server({noServer: true});

function accept(req, res) {
  // all incoming requests must be websockets
  if (!req.headers.upgrade || req.headers.upgrade.toLowerCase() != 'websocket') {
    res.end();
    return;
  }
  // can be Connection: keep-alive, Upgrade
  if (!req.headers.connection.match(/\bupgrade\b/i)) {
    res.end();
    return;
  }
  wss.handleUpgrade(req, req.socket, Buffer.alloc(0), onConnect);
}

function onConnect(ws) {
  ws.on('message', function (message) {
    let name = message.match(/([\p{Alpha}\p{M}\p{Nd}\p{Pc}\p{Join_C}]+)$/gu) || "Guest";
    ws.send(`${name}!`);
    //setTimeout(() => ws.close(1000, "Bye!"), 5000);
  });
}

https.createServer(options, function (req, res) {
  res.writeHead(200);
  res.end("hello world\n");
}).listen(8000);
This is my code in react
componentDidMount() {
  var connection = new WebSocket('wss://localhost:8000/');

  connection.onopen = function(e) {
    connection.send("add people");
  };

  connection.onmessage = function(event) {
    // alert(`[message] Data received from server: ${event.data}`);
    console.log("output ", event.data);
  };
}
When I try to connect to the WebSocket from my JSX file, it gives me this error: Firefox can’t establish a connection to the server at wss://localhost:8000/.
Your implementation needs some changes. In the backend server, you forgot to call the onConnect function, so your ws.on handler will never be called.
Also, you imported ws and created a WebSocket server wss, but you added the event listener in the wrong place; you should add the listener on your WebSocket server instance (wss):
// rest of the code ...
const wss = new ws.Server({noServer: true});

wss.on('connection', function (socket) {
  // do something here ...
});

// rest of the code ...
https.createServer(options, () => {
  // do something here ...
})
There are some examples of how to create the WebSocket server along with the HTTP server on ws npm page.
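As a rough sketch of that wiring (following the external HTTPS server example from the ws README and reusing the options, wss, and onConnect definitions from the question; not tested against your setup), the HTTPS server hands upgrade requests over to the WebSocket server:

const server = https.createServer(options, function (req, res) {
  // ordinary HTTPS requests still get a normal response
  res.writeHead(200);
  res.end("hello world\n");
});

// WebSocket clients arrive as HTTP 'upgrade' requests, not normal requests,
// so hand them to the ws server explicitly
server.on('upgrade', function (req, socket, head) {
  wss.handleUpgrade(req, socket, head, onConnect);
});

server.listen(8000);

Also note that with a self-signed certificate, Firefox may refuse the wss://localhost connection with this same error until the certificate is trusted in the browser.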

async.waterfall not acting synchronously

I'm trying to write a header containing an MD5 hash token using crypto and then return it back as a response. For some reason, it isn't actually running synchronously. I know JS is an asynchronous language, and that's really the only part I'm struggling with right now. Any help would be appreciated.
This is what I have so far:
const crypto = require('crypto');
const bodyParser = require('body-parser');
const formidable = require('formidable');
const async = require('async')

app.post('/pushurl/auth', (req, res) =>
  var data = req.body.form1data1 + '§' + req.body.form1data2
  async.waterfall([
    function(callback) {
      var token = crypto.createHash('md5').update(data).digest("hex");
      callback(null, token);
    },
    function(token, callback) {
      res.writeHead(301,
        {Location: '/dashboard?token=' + token}
      );
      callback(null)
    },
    function(callback) {
      res.end();
      callback(null)
    }
  ]);
  }
});
Output:
Uncaught Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
<node_internals>/internal/errors.js:256
No debugger available, can not send 'variables'
Process exited with code 1
JavaScript is an asynchronous language, yes, but it can also do synchronous tasks very well. In your case, you don't need any async at all, except if you're dealing with promises.
If you write your code like in the example below, it will just execute from top to bottom.
The error (probably) occurred because you forgot to add an opening curly brace to your app.post callback, which results in the data variable being returned immediately because of an implied return: () => (...) is an implied return, while () => {...} is an explicit function body.
const crypto = require('crypto');
const bodyParser = require('body-parser');
const formidable = require('formidable');

app.post('/pushurl/auth', (req, res) => {
  const data = req.body.form1data1 + '§' + req.body.form1data2;
  const token = crypto.createHash('md5').update(data).digest("hex");

  res.writeHead(301, {
    Location: '/dashboard?token=' + token
  });
  res.end();
});

Node proxy server modify response after query in database

I am having trouble getting my Node.js proxy server to modify (write) the response.
I want to achieve auto-login for one site via the node proxy server. To do that I have to query a database first, and only then can I modify the response, but it seems the response has already ended before my write runs, and I get Error: write after end.
Below is my implementation so far.
var express = require('express');
var proxy = require('http-proxy-middleware');

var options = {
  target: 'http://example.com/', // target host
  changeOrigin: true,
  onProxyReq: function onProxyReq(proxyReq, req, res) {
    var _write = res.write;
    var body = "";
    proxyReq.on('data', function(data) {
      data = data.toString('utf-8');
      body += data;
    });
    res.write = function (data) {
      try {
        //I have database query here instead of setTimeout
        setTimeout(function(){
          /* Modified response here and write */
          _write.call(res, data); //can't write because req already end
        }, 3000);
      } catch (err) {
        console.log('err', err);
      }
    }
  }
}

// create the proxy (without context)
var exampleProxy = proxy(options);

// mount `exampleProxy` in web server
var app = express();
app.use('/', exampleProxy);
app.listen(8080);
Can anyone guide me on how to achieve this?
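In case it is useful, here is a rough, untested sketch of one common way to do this with http-proxy-middleware, mirroring the selfHandleResponse + proxyRes buffering approach from the answer at the top of this page; the database query is stood in for by setTimeout and the injected script is only a placeholder:

var express = require('express');
var proxy = require('http-proxy-middleware');

var proxyOptions = {
  target: 'http://example.com/',
  changeOrigin: true,
  selfHandleResponse: true, // http-proxy will not end the response; we do it ourselves
  onProxyRes: function (proxyRes, req, res) {
    var body = [];
    proxyRes.on('data', function (chunk) {
      body.push(chunk);
    });
    proxyRes.on('end', function () {
      var original = Buffer.concat(body).toString('utf-8');
      // run the database query here instead of setTimeout
      setTimeout(function () {
        // modify the buffered body and send it once, after the query has finished;
        // copy any headers you need from proxyRes.headers before ending
        res.end(original.replace('</body>', '<script>/* auto-login */</script></body>'));
      }, 3000);
    });
  }
};

var app = express();
app.use('/', proxy(proxyOptions));
app.listen(8080);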

Streaming JSON with Node.js

I have been given a task: I'm trying to make routes/endpoints using just Node to better demonstrate what is happening under the covers with Express, but I have to use streams.
I have two routes, GET /songs and GET /refresh-songs, with the following requirements:
We need the following two endpoints:
  GET /songs
  GET /refresh-songs
All endpoints should return JSON with the correct headers
/songs should...
  stream data from a src/data/songs.json file
  not crash the server if that file or directory doesn't exist
  bonus points for triggering refresh code in this case
  bonus points for compressing the response
/refresh-songs should...
  return immediately
  it should not hold the response while songs are being refreshed
  return a 202 status code with a status JSON response
  continue getting songs from iTunes
frontend should have
  UI/button to trigger this endpoint
This is what I have so far in my server.js file where my endpoints will live.
const { createServer } = require('http');
const { parse: parseUrl } = require('url');
const { createGzip } = require('zlib');
const { songs, refreshSongs } = require('./songs');
const fs = require('fs');

const stream = fs.createReadStream('./src/data/songs.jso')

const PORT = 4000;

const server = createServer(({ headers, method, url }, res) => {
  const baseResHeaders = {
    // CORS stuff is gone. :(
    'content-type' : 'application/json'
  };

  // Routing ¯\_(ツ)_ /¯
  //
  var path = url.parseUrl(url).pathname;

  function onRequest(request, response) {
    response.writeHead(200, 'Content-Type': baseResHeaders['content-type']);
    stream.on('data', function(err, data) {
      if (err) {
        response.writeHead(404);
        response.write('File not found');
        refreshSongs();
      } else {
        response.write(createGzip(data))
      }
      response.end()
    })
  }

  switch (path) {
    case '/songs':
      onRequest()
      break;
    case '/refresh-songs':
      onRequest()
      break;
  }

server.on('listening', () => {
  console.log(`Server running at http://localhost:${PORT}/`);
});
I am wondering if I am wiring up the onRequest method I created correctly, and whether the switch statement will correctly intercept those URLs.
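Not a full answer, but a rough sketch of the streaming part (assuming the intended file is src/data/songs.json; names like streamSongs are mine): the read stream is usually created per request and piped through a gzip transform, with the 'error' event covering the missing-file case so the server doesn't crash:

const { createServer } = require('http');
const { createGzip } = require('zlib');
const fs = require('fs');

function streamSongs(response) {
  // create the stream per request so a missing file only affects this response
  const file = fs.createReadStream('./src/data/songs.json');

  file.on('error', () => {
    // file or directory doesn't exist: reply with JSON instead of crashing
    response.writeHead(404, { 'content-type': 'application/json' });
    response.end(JSON.stringify({ error: 'songs.json not found' }));
  });

  file.once('open', () => {
    response.writeHead(200, {
      'content-type': 'application/json',
      'content-encoding': 'gzip'
    });
    file.pipe(createGzip()).pipe(response); // gzip is a transform stream, so pipe through it
  });
}

createServer((req, res) => {
  if (req.url === '/songs') return streamSongs(res);
  res.writeHead(404, { 'content-type': 'application/json' });
  res.end('{}');
}).listen(4000);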
