Node.js HTTP server cannot handle proxy-enabled requests from Amazon Load Balancer - javascript

I have a Node.js HTTP server which was working fine until I enabled the Proxy Protocol on the Amazon Load Balancer (which uses the TCP protocol) to get the client's IP.
I wonder how a Node.js TCP server handles it perfectly, but an HTTP server cannot:
var net = require('net');
var proxy_protocol = require('node-proxy-protocol');

net.createServer(function(socket) {
    proxy_protocol.parse(socket, function(error, obj) {
        console.log(obj); // returns the required client info
    });
}).listen(8080); // port chosen for illustration
But why does the HTTP server fail to do the same if I replace "net" with "http"? This does not work:
var http = require('http');
var proxy_protocol = require('node-proxy-protocol');

http.createServer(function(req, res) {
    proxy_protocol.parse(req, function(error, obj) {
        console.log(obj); // returns nothing
    });
}).listen(8080); // port chosen for illustration
although I know that HTTP (usually) operates over TCP, so it should work for both.
Basically, I think my HTTP server is not able to handle the TCP request coming from the Load Balancer.
Please let me know where I am going wrong.

My Node.js HTTP server was not handling the TCP traffic from the Load Balancer, so I switched the Load Balancer's protocol to HTTP, and now my HTTP server works well.
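For reference, when the Load Balancer listener is switched to HTTP it adds an X-Forwarded-For header, so the client IP can still be recovered. A minimal sketch of reading it (the port here is only for illustration):

var http = require('http');

http.createServer(function (req, res) {
    // With the Load Balancer in HTTP mode, the original client is the
    // left-most entry of the X-Forwarded-For header.
    var forwarded = req.headers['x-forwarded-for'];
    var clientIp = forwarded ? forwarded.split(',')[0].trim()
                             : req.socket.remoteAddress;
    console.log('client IP: ' + clientIp);
    res.end('ok');
}).listen(8080); // port is illustrative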

Related

Socket.IO attempting to connect through https:// instead of wss:// and getting a CORS error

I'm switching from JavaScript's vanilla WebSocket API to Socket.IO for real-time data about cryptocurrency prices. While using the regular WebSocket I had no problem connecting to Kraken and getting the data I need. However, when trying to connect with Socket.IO, I get a CORS error.
Access to XMLHttpRequest at 'https://ws.kraken.com/socket.io/?EIO=3&transport=polling&t=Mxg8_5_' from origin has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
In the Chrome dev tools network tab, I'm getting an Invalid request response from Kraken. I assume Socket.IO is trying to send some sort of preflight request when trying to establish a websocket connection and failing due to Kraken's CORS policy for HTTP requests. Is there a way to completely bypass this XMLHttpRequest attempt and immediately try a websocket connection, seeing as the regular WebSocket API has no issues establishing this connection and doesn't seem to send a preflight request? Here are both the vanilla and the Socket.IO sockets:
// vanilla websocket
const vanillaWS = new WebSocket('wss://ws.kraken.com');

vanillaWS.onopen = () => {
    console.log('vanilla websocket opened');
};

vanillaWS.onmessage = (message) => {
    console.log(message.data);
};

// socket.io websocket
const ioSocket = io('wss://ws.kraken.com');

ioSocket.on('connect', () => {
    console.log('socket.io socket opened');
});

ioSocket.on('message', (message) => {
    console.log(message.data);
});
As you can see, these should be functionally very similar, but while the first one works as expected, the second one is throwing the error.
From the documentation:
What Socket.IO is not
Socket.IO is NOT a WebSocket implementation. Although Socket.IO indeed
uses WebSocket as a transport when possible, it adds some metadata to
each packet: the packet type, the namespace and the packet id when a
message acknowledgement is needed. That is why a WebSocket client will
not be able to successfully connect to a Socket.IO server, and a
Socket.IO client will not be able to connect to a WebSocket server
either. Please see the protocol specification
here.
So if the endpoint you're trying to use isn't running a Socket.IO server, this isn't going to work.
That said, if it is, you can force the use of websockets using the transports parameter:
const ioSocket = io(endpoint, {
    transports: ['websocket'] // forces websockets only
});
Bottom Line: Socket.IO is not a replacement for a WebSockets connection. Socket.IO uses WebSockets to accomplish its goal: "Socket.IO is a library that enables real-time, bidirectional and event-based communication between the browser and the server".
You're getting the CORS error because Socket.IO attempts a pure HTTP-based long-polling connection first, and that's what fails. You should manually set your client to attempt websocket first:
var options = {
    allowUpgrades: true,
    transports: ['websocket', 'polling'],
};

var sock = io(server, options);

sock.on('connect', () => {
    console.log('socket.io socket opened');
});

sock.on('message', (message) => {
    console.log(message.data);
});
From the socket.io docs:
With websocket transport only
By default, a long-polling connection is established first, then
upgraded to “better” transports (like WebSocket). If you like to live
dangerously, this part can be skipped:
const socket = io({ transports: ['websocket'] });

// on reconnection, reset the transports option, as the WebSocket
// connection may have failed (caused by proxy, firewall, browser, ...)
socket.on('reconnect_attempt', () => {
    socket.io.opts.transports = ['polling', 'websocket'];
});

REST API based TCP client using Node.js

I have tried to create a TCP client with a REST API using Node.js, and used the net module to establish the TCP connection to send/receive data. The main idea is to use this REST API from the browser to load test TCP connections.
In my case there are 2 steps involved while load testing TCP:
1) send an initial TCP request which has a token for authentication.
2) then send another TCP request with some data.
The issue is when I try to send the 2nd TCP request after authentication: I get an "invalid session" response.
Please suggest whether I can send a TCP request for authentication and use the same session/connection while making subsequent requests.
I am new to Node.js. My apologies if I have not provided enough details or done something invalid.
Initially I used the Packet Sender application and enabled its persistent TCP connection option. It worked as expected, but it is for a single user and I can't use that tool for load testing. In that tool, with persistent TCP enabled, I can see the local port is fixed and does not change across multiple requests, but with my Node.js code the local port changes on every new request.
I have also used the TCP Sampler in JMeter with the reuse connection option, but it does not work when I send the 2nd request after authentication.
var Net = require('net');
var express = require("express");
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json());

app.post('/api/push', function (req, res) {
    var reqBody = req.body.reqBody;
    // Don't shadow the Express `req` object here, otherwise req.body.port below breaks.
    var payload = JSON.stringify(reqBody);

    const client = new Net.Socket({
        allowHalfOpen: true
    });

    client.connect({
        port: req.body.port,
        host: req.body.host
    }, function () {
        client.write(payload);
    });

    client.on('data', function (chunk) {
        res.write(chunk.toString());
        // Tried to use the client connection information, but it didn't work; not sure if I missed something.
        console.log(JSON.stringify(client));
        // Tried commenting out the client.end() below, but no luck.
        client.end();
    });

    client.on('end', function () {
        res.end();
    });

    client.on('error', function (err) {
        console.log("Error: " + err.message);
        res.write(err.message);
        client.end();
    });
});

app.listen(1234, () => {
    console.log("Server running on port 1234");
});
1) send a REST API request with the TCP server host/port and the request body for authentication.
2) send another REST API request to use the same TCP connection and send data, but that failed for me.
Inspect the behavior, get the cookie details and preserve them in an HTTP Cookie Manager to reuse the same session for the second request. Just adding an HTTP Cookie Manager might also work. Please check.
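If the goal is to reuse the authenticated TCP connection across REST calls from the Node.js side, one possible approach (a sketch only; the /api/connect and /api/send routes and the naive session id are invented for illustration) is to keep the socket in a module-level map keyed by a session id that is handed back to the caller:

var Net = require('net');
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json());

// Authenticated sockets kept alive between REST calls, keyed by a session id.
var sessions = {};

// Step 1: open the TCP connection and authenticate, then hand back a session id.
app.post('/api/connect', function (req, res) {
    var client = new Net.Socket();
    client.connect({ port: req.body.port, host: req.body.host }, function () {
        client.write(JSON.stringify(req.body.reqBody)); // authentication payload
    });
    client.once('data', function (chunk) {
        var sessionId = Date.now().toString(36);        // naive id, fine for a sketch
        sessions[sessionId] = client;
        res.json({ sessionId: sessionId, response: chunk.toString() });
    });
    client.on('error', function (err) {
        if (!res.headersSent) res.status(500).send(err.message);
    });
});

// Step 2: reuse the same socket (and therefore the same local port) for later data.
app.post('/api/send', function (req, res) {
    var client = sessions[req.body.sessionId];
    if (!client) return res.status(404).send('unknown session');
    client.once('data', function (chunk) {
        res.send(chunk.toString());
    });
    client.write(JSON.stringify(req.body.reqBody));
});

app.listen(1234);

The second call carries the sessionId, so the same socket is reused instead of a fresh connection per request.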

How to handle tcp/ip raw requests and http requests on the same server

I am working on a GPS tracking system and have built a server on Node.js.
This is what the file looks like, for reference:
const net = require('net');
const lora_packet = require('lora-packet');
const dataParser = require('./dataParser');

const clients = [];

net.createServer(function(socket) {
    socket.name = socket.remoteAddress + ":" + socket.remotePort;
    clients.push(socket);

    socket.on('data', function(data) {
        console.log("Buffer sent by terminal : ");
        console.log(data);
        const packet = lora_packet.fromWire(data, 'hex');
        const str = packet._packet.PHYPayload.toString('hex');
        dataParser.parse_data(str, socket);
    });

    socket.on('end', function() {
        clients.splice(clients.indexOf(socket), 1);
        //broadcast(socket.name + " has left the cartel.\n");
    });

    // Confirm to the sender, broadcast the message to everyone else.
    function broadcast(message, sender) {
        clients.forEach(function(client) {
            if (client === sender) {
                client.write(message + "\nsent\n");
                return;
            }
            client.write(message);
        });
        process.stdout.write(message);
    }
}).listen(8080);

console.log("cartel is running on the port 8080\n");
This server file handles only requests from the hardware and processes raw TCP/IP requests.
I want the server to handle HTTP requests as well, and to incorporate a routing feature for client-side browser applications.
1) Is there any way that HTTP requests can also be handled by the same server, or should I open another port and deploy an Express Node.js app on that?
2) If I use the same 8080 port for HTTP, how can the routing be achieved?
3) If I use different ports for HTTP and raw TCP/IP, what would be the best way for communication between the two servers? The communication between the TCP/IP server and the HTTP server should happen via a socket (sending data dynamically).
From the HTTP server, data has to be sent dynamically to the browser via a socket to update the live location.
So is the flow right?
Hardware (<---->) TCP/IP server (<--->) HTTP server (<--->) Browser
If more information is needed to answer the question, I'll provide it!
Thank you
It's very complicated to try to speak multiple protocols on the same port. It requires some sort of scheme at the beginning of each connection to sample the incoming data and identify which protocol it is and then shunt that connection off to the right code to handle that protocol. I wouldn't suggest it.
It is way, way easier to just open a second server on a different port for an Express server to field your http requests. Very simple. You can do it right in the same app. Because both servers can be in the same app, you can just directly read from one connection and write to the other. There's no need for interprocess communication.
Is there any way that HTTP requests can also be handled by the same server, or should I open another port and deploy an Express Node.js app on that?
Open another port. No need to write another app unless you have a specific reason to use two processes. You can put both the plain TCP server and the Express server in the same node.js app.
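A minimal sketch of that layout, keeping the TCP side on port 8080 as in the question and giving the HTTP side its own port (3000 here is arbitrary):

const net = require('net');
const express = require('express');

// Plain TCP server for the hardware, unchanged from the question.
const tcpServer = net.createServer(socket => {
    socket.on('data', data => {
        // parse the LoRa packet here, then share the result with the HTTP side
    });
});
tcpServer.listen(8080);

// Express server for the browser on its own port.
const app = express();
app.get('/location', (req, res) => {
    res.json({ /* last known position kept in a shared variable */ });
});
app.listen(3000);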
If I use the same 8080 port for HTTP, how can the routing be achieved?
It's not easy. I wouldn't suggest using the same port for multiple protocols.
If I use different ports for HTTP and raw TCP/IP, what would be the best way for communication between the two servers? The communication between the TCP/IP server and the HTTP server should happen via a socket (sending data dynamically).
You can put both servers in the same node.js app and then you can just read/write directly from one to the other with the same code. No need for interprocess communication.
From the HTTP server, data has to be sent dynamically to the browser via a socket to update the live location.
Sending data dynamically to a browser usually means you want the browser to hold something like a webSocket or socket.io connection to your server so you can then send data to the browser at any time over the existing connection. Otherwise, you would have to "wait" for the browser to request data and then respond with the data when it asks.
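As a rough sketch of that last point, assuming socket.io is used on the browser side, data arriving from the hardware on the raw TCP socket can be pushed straight out to connected browsers:

const net = require('net');
const http = require('http');
const express = require('express');

const app = express();
const httpServer = http.createServer(app);
const io = require('socket.io')(httpServer); // socket.io rides on the HTTP server

// When the hardware sends data on the raw TCP side, push it straight
// to every connected browser over socket.io.
net.createServer(socket => {
    socket.on('data', data => {
        io.emit('location', data.toString('hex'));
    });
}).listen(8080);

httpServer.listen(3000); // the browser connects to this port with the socket.io client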

Connect client to Node.js server with HTTP(S)

Users 'gather' data on their local PC, and they need to be able to upload it to the server.
I setup a simple node.js server like this:
var https = require('https');
var fs = require('fs');
var path = require('path');

var options = {
    key: fs.readFileSync('key.pem'),
    cert: fs.readFileSync('cert.pem')
};

var server = https.createServer(options, function (request, response) {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    response.end("Server is running");
});

server.listen(1337); // port matches the client request below
Now I want to connect a client to it using an HTTP request. I tried jQuery/XMLHttpRequest but I get Cross-Origin Resource Sharing errors (I get why, but I don't think I really want to disable this protection). I think it's possible to use sockets to establish the connection, but I'm not sure that would be a good choice. I'd rather work with HTTP requests.
var xmlhttp, text;
xmlhttp = new XMLHttpRequest();
xmlhttp.open('GET', 'http://localhost/file.txt', true);
xmlhttp.send();
//JQuery get
$.get("http://127.0.0.1:1337")
Anything missing? Feel free to ask.
Change http:// to https:// in your second set of code. Same-origin includes scheme/protocol. (And your server code appears to only be spinning up an https listener anyway. :) ).
Just a hint: if your client application page, which is talking to your server, is not served from the same domain where your server component is running, then you will get such errors. This error means your page is trying to access resources from another server, not from the server it originated from. All browsers have this restriction. You can configure your server to allow access from other domains (the domain where your client app is hosted), or host both the client app and the server on the same domain.
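If you go with allowing access from other domains, a minimal sketch of adding CORS headers to the https server from the question (the wildcard origin is only for illustration; restrict it in practice):

var https = require('https');
var fs = require('fs');

var options = {
    key: fs.readFileSync('key.pem'),
    cert: fs.readFileSync('cert.pem')
};

https.createServer(options, function (request, response) {
    // Let a page served from another origin call this server.
    response.setHeader('Access-Control-Allow-Origin', '*'); // restrict this in practice
    response.setHeader('Access-Control-Allow-Methods', 'GET, POST');
    response.setHeader('Access-Control-Allow-Headers', 'Content-Type');

    if (request.method === 'OPTIONS') { // answer any preflight request early
        response.writeHead(204);
        return response.end();
    }

    response.writeHead(200, { 'Content-Type': 'text/plain' });
    response.end('Server is running');
}).listen(1337);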

Localhost refuses connection from Javascript

I am making a single-page web app that takes information from a MySQL database and displays it on screen using AngularJS and Node.js. I have set up a Node.js server that serves a JSON file from the database. It works fine if I connect to it like this: 192.168.1.57:8888, which is my home server IP, or by remote-connecting to my server and using localhost:8888. Doing this downloads the JSON file to my computer.
However, if I use JavaScript to get the JSON file from the server, it gives me this error:
GET http://localhost:8888/ net::ERR_CONNECTION_REFUSED
I have tried connecting to the server with both AngularJS and jQuery and they both give the same error. I've also tried 127.0.0.1 and that doesn't work either. What can I do to fix it, or is there a better alternative?
This is the server code in Node.js:
var http = require("http");
mysql = require("mysql");
var connection = mysql.createConnection({
user: "test",
password: "test",
database: "test"
});
http.createServer(function (request, response) {
request.on('end', function () {
connection.query('SELECT * FROM feedback;', function (error, rows, fields) {
response.writeHead(200, {
'Content-Type': 'x-application/json'
});
response.end(JSON.stringify(rows));
});
});
}).listen(8888);
This is the client side in AngularJS:
(function() {
    var app = angular.module('question', []);

    app.controller("ServerController", ['$http', function($http) {
        $http.get("http://localhost:8888").success(function(data, status, headers, config) {
            console.log("success");
        }).error(function(data, status, headers, config) {
            console.log("error");
        });
    }]);
})();
Taking a guess at it, you're running into the Same Origin Policy, which requires that the page making the ajax request be in the same origin as the resource it's requesting (or that the server serving the resource supports CORS and allows the page's origin).
Two different ways this might be hitting you (I can't tell which, the question doesn't give enough information):
If the browser code is running in a page loaded from the file system, not via HTTP, then it's a cross-origin request because the protocols don't match (file: vs. http:).
If the page has not been loaded from localhost:8888, but instead from some other port on localhost, then the origins don't match because the port numbers are different.
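One way to sidestep both cases is to have the same Node server hand out the page, so the page and the Ajax call share an origin. A rough sketch, assuming the page is saved as index.html next to the server and the Angular code requests a relative URL such as /data:

var http = require('http');
var fs = require('fs');

http.createServer(function (request, response) {
    if (request.url === '/data') {
        // JSON endpoint; the Angular code would call $http.get('/data')
        response.writeHead(200, { 'Content-Type': 'application/json' });
        response.end(JSON.stringify([{ id: 1, feedback: 'placeholder row' }]));
    } else {
        // Serve the page itself, so page and data share http://<host>:8888
        response.writeHead(200, { 'Content-Type': 'text/html' });
        fs.createReadStream('index.html').pipe(response);
    }
}).listen(8888);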
