REST API based TCP client using Node.js

I have tried to create a TCP client exposed as a REST API using Node.js, using the net module to establish the TCP connection and send/receive data. The main idea is to call this REST API from a browser to load test TCP connections.
In my case there are two steps involved while load testing TCP:
1) Send an initial TCP request that carries a token for authentication.
2) Then send another TCP request with some data.
The issue is that when I send the second TCP request after authentication, I get an "invalid session" response.
Please suggest how I can send a TCP request for authentication and then reuse the same session/connection for subsequent requests.
I am new to Node.js. My apologies if I have not provided enough details or have done something invalid.
Initially I used the Packet Sender application with its persistent TCP connection option enabled. It worked as expected, but it is a single-user tool and cannot be used for load testing. In that tool, with persistent TCP enabled, I can see that the local port stays fixed across multiple requests, whereas with my Node.js code the local port changes on every new request.
I have also tried the TCP Sampler in JMeter with the Re-use connection option, but it also fails when I send the second request after authentication.
var Net = require('net');
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json());

app.post('/api/push', function (req, res) {
    // Keep the TCP payload in its own variable; do not shadow the Express "req" object.
    var payload = JSON.stringify(req.body.reqBody);

    // A new socket is created on every REST call, so every call opens a new TCP connection.
    const client = new Net.Socket({
        allowHalfOpen: true
    });

    client.connect({
        port: req.body.port,
        host: req.body.host
    }, function () {
        client.write(payload);
    });

    client.on('data', function (chunk) {
        res.write(chunk.toString());
        // Tried to use the client connection information, but it didn't work; not sure if I missed something.
        console.log(client.localAddress + ':' + client.localPort);
        // Tried commenting out the client.end() below, but no luck.
        client.end();
    });

    client.on('end', function () {
        res.end();
    });

    client.on('error', function (err) {
        console.log('Error: ' + err.message);
        res.write(err.message);
        client.end();
    });
});

app.listen(1234, () => {
    console.log('Server running on port 1234');
});
1) Send a REST API call with the TCP server host/port and the request body for authentication.
2) Send another REST API call that should reuse the same TCP connection to send data, but this second step fails for me.

Inspect the behaviour, capture the cookie details, and preserve them in the HTTP Cookie Manager so the same session is reused for the second request. Simply adding an HTTP Cookie Manager might also be enough; please check.
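On the Node.js side, the code above opens a brand-new socket for every REST call, which is why the local port keeps changing and the TCP server sees a new session each time. A minimal sketch of one way to keep the connection alive across calls, assuming a hypothetical client-supplied connectionId field in the request body, is to cache the socket between requests:

var Net = require('net');
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json());

// Open sockets cached between REST calls, keyed by a client-supplied connectionId (hypothetical field).
var connections = {};

app.post('/api/push', function (req, res) {
    var id = req.body.connectionId;
    var client = connections[id];

    if (!client) {
        // First call for this id: open the TCP connection and keep it for later calls.
        client = Net.connect({ port: req.body.port, host: req.body.host });
        client.on('error', function () { delete connections[id]; });
        client.on('close', function () { delete connections[id]; });
        connections[id] = client;
    }

    // Answer this REST call with the next chunk the TCP server sends back.
    client.once('data', function (chunk) {
        res.end(chunk.toString());
    });

    // Writes made before the connection is established are queued by Node.
    client.write(JSON.stringify(req.body.reqBody));
});

app.listen(1234);

This is only a sketch: it assumes one request/response pair in flight per connectionId at a time and does no timeout or cleanup handling, but because the socket is reused, the authentication performed on the first call stays valid for the subsequent ones.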

Related

Socket.IO attempting to connect through https:// instead of wss:// and getting a CORS error

I'm switching from JavaScript's vanilla WebSocket API to Socket.IO for real-time data about cryptocurrency prices. While using the regular WebSocket I had no problem connecting to Kraken and getting the data I need. However, when trying to connect with Socket.IO, I get a CORS error.
Access to XMLHttpRequest at 'https://ws.kraken.com/socket.io/?EIO=3&transport=polling&t=Mxg8_5_' from origin has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
In the Chrome dev tools network tab, I'm getting an Invalid request response from Kraken. I assume Socket.IO is trying to send some sort of preflight request when trying to establish a websocket connection and failing due to Kraken's CORS policy for HTTP requests. Is there a way to completely bypass this XMLHttpRequest attempt and immediately try a websocket connection, seeing as the regular WebSocket API has no issues establishing this connection and doesn't seem to send a preflight request? Here are both the vanilla and the Socket.IO sockets:
// vanilla websocket
const vanillaWS = new WebSocket('wss://ws.kraken.com');

vanillaWS.onopen = () => {
    console.log('vanilla websocket opened');
};

vanillaWS.onmessage = (message) => {
    console.log(message.data);
};

// socket.io websocket
const ioSocket = io('wss://ws.kraken.com');

ioSocket.on('connect', () => {
    console.log('socket.io socket opened');
});

ioSocket.on('message', (message) => {
    console.log(message.data);
});
As you can see, these should be functionally very similar, but while the first one works as expected, the second one is throwing the error.
From the documentation:
What Socket.IO is not
Socket.IO is NOT a WebSocket implementation. Although Socket.IO indeed
uses WebSocket as a transport when possible, it adds some metadata to
each packet: the packet type, the namespace and the packet id when a
message acknowledgement is needed. That is why a WebSocket client will
not be able to successfully connect to a Socket.IO server, and a
Socket.IO client will not be able to connect to a WebSocket server
either. Please see the protocol specification
here.
So if the endpoint you're trying to use isn't running a Socket.IO server, this isn't going to work.
That said, if it is, you can force the use of websockets using the transports parameter:
const ioSocket = io(endpoint, {
    transports: ['websocket'] // forces websockets only
});
Bottom Line: Socket.IO is not a replacement for a WebSockets connection. Socket.IO uses WebSockets to accomplish its goal: "Socket.IO is a library that enables real-time, bidirectional and event-based communication between the browser and the server".
You're getting the CORS error because Socket.IO attempts a pure HTTP-based long-polling connection first, and that is what fails. You can manually set your client to attempt the websocket transport first:
var options = {
    allowUpgrades: true,
    transports: ['websocket', 'polling'],
};
var sock = io(server, options);

sock.on('connect', () => {
    console.log('socket.io socket opened');
});

sock.on('message', (message) => {
    console.log(message.data);
});
From the socket.io docs:
With websocket transport only
By default, a long-polling connection is established first, then
upgraded to “better” transports (like WebSocket). If you like to live
dangerously, this part can be skipped:
const socket = io({ transports: ['websocket'] });

// on reconnection, reset the transports option, as the WebSocket
// connection may have failed (caused by proxy, firewall, browser, ...)
socket.on('reconnect_attempt', () => {
    socket.io.opts.transports = ['polling', 'websocket'];
});

How to handle tcp/ip raw requests and http requests on the same server

I am working on a GPS tracking system and have built a server in Node.js.
This is what the file looks like, for reference.
const net = require('net');
const lora_packet = require('lora-packet');
const dataParser = require('./dataParser');

const clients = [];

net.createServer(function(socket) {
    socket.name = socket.remoteAddress + ":" + socket.remotePort;
    clients.push(socket);

    socket.on('data', function(data) {
        console.log("Buffer sent by terminal : ");
        console.log(data);
        const packet = lora_packet.fromWire(data, 'hex');
        const str = packet._packet.PHYPayload.toString('hex');
        dataParser.parse_data(str, socket);
    });

    socket.on('end', function() {
        clients.splice(clients.indexOf(socket), 1);
        //broadcast(socket.name + "has left the cartel.\n");
    });

    function broadcast(message, sender) {
        clients.forEach(function(client) {
            if (client === sender) {
                client.write(message + "\nsent\n");
                return;
            }
            client.write(message);
        });
        process.stdout.write(message);
    }
}).listen(8080);

console.log("cartel is running on the port 8080\n");
This server file only handles requests from the hardware and processes raw TCP/IP data.
I want the server to also handle HTTP requests and to incorporate routing for client-side browser applications.
1) Is there any way that HTTP requests can also be handled by the same server, or should I open another port and deploy an Express Node.js app on that?
2) If I use the same port 8080 for HTTP, how can the routing be achieved?
3) If I use different ports for HTTP and raw TCP/IP, what would be the best way for the two servers to communicate? The communication between the TCP/IP server and the HTTP server should happen via a socket (sending data dynamically).
From the HTTP server, data has to be pushed dynamically over a socket to the browser to update the live location.
So is the flow right?
Hardware <---> TCP/IP server <---> HTTP server <---> Browser
If more information is needed to solve the query, I'll provide it!
Thank you
It's very complicated to try to speak multiple protocols on the same port. It requires some sort of scheme at the beginning of each connection to sample the incoming data and identify which protocol it is and then shunt that connection off to the right code to handle that protocol. I wouldn't suggest it.
It is way, way easier to just open a second server on a different port for an Express server to field your http requests. Very simple. You can do it right in the same app. Because both servers can be in the same app, you can just directly read from one connection and write to the other. There's no need for interprocess communication.
Is there any way that http requests can also be handled by the same server or should I open another port and deploy an express node js app on that?
Open another port. No need to write another app unless you have a specific reason to use two processes. You can put both the plain TCP server and the Express server in the same node.js app.
If I use the same 8080 port for http, how can the routing be achieved?
It's not easy, and I would not suggest using the same port for multiple protocols.
If I use different ports for http and raw tcp/ip, what would be the best way for communication between the two server. The communication between tcp/ip server and http server should happen via socket(sending data dynamically).
You can put both servers in the same node.js app and then you can just read/write directly from one to the other with the same code. No need for interprocess communication.
From http server using socket, data has to be sent dynamically to browser to update live location
Sending data dynamically to a browser usually means you want the browser to hold something like a webSocket or socket.io connection to your server so you can then send data to the browser at any time over the existing connection. Otherwise, you would have to "wait" for the browser to request data and then respond with the data when it asks.
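To make that concrete, here is a minimal sketch of the suggested layout, with the plain TCP server for the hardware and an Express plus socket.io server for the browser, all in one Node.js app. The port numbers, the /status route and the 'location' event name are placeholders rather than anything from the original project:

const net = require('net');
const http = require('http');
const express = require('express');

const app = express();
const httpServer = http.createServer(app);
const io = require('socket.io')(httpServer);

// Normal HTTP routing for the browser-facing application.
app.get('/status', (req, res) => {
    res.json({ ok: true });
});

// Raw TCP server for the hardware, on its own port.
net.createServer((socket) => {
    socket.on('data', (data) => {
        // Parse the tracker payload here, then push it straight to every connected browser.
        io.emit('location', data.toString('hex'));
    });
}).listen(8080);

// HTTP + socket.io for the browser, on a second port.
httpServer.listen(3000);

Because both servers live in the same process, the TCP 'data' handler can call io.emit() directly, so no interprocess communication is needed.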

Electron NodeJS server to server communication using POST

I am working on an assignment for school, and I have decided to make a chat application using Electron and NodeJS. All of the GUI is programmed, except for the server-side of things. My plan was to have two servers, where each would act as its own client AND server, only communicating with each other to send messages.
How would I get each server to communicate using POST requests? Does anybody know any fully functioning npm modules that can be used for this?
You need to use socket.io in server A and socket.io-client in server B, like this:
server A
// Load requirements
var http = require('http'),
    io = require('socket.io');

// Create server & socket
var server = http.createServer(function(req, res) {
    // Send HTML headers and message
    res.writeHead(404, {'Content-Type': 'text/html'});
    res.end('<h1>404</h1>');
});
server.listen(8080);
io = io.listen(server);

// Add a connect listener
io.sockets.on('connection', function(socket) {
    console.log('Client connected.');

    // Disconnect listener
    socket.on('disconnect', function() {
        console.log('Client disconnected.');
    });
});
server B
// Connect to server
var io = require('socket.io-client');
var socket = io.connect('http://localhost:8080', {reconnect: true});

// Add a connect listener (the 'connect' event passes no arguments)
socket.on('connect', function() {
    console.log('Connected!');
});
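If the assignment really does require the two servers to talk over plain HTTP POST rather than socket.io, a minimal sketch using Express and Node's built-in http module is below; the /message route and the port numbers are placeholders:

// Each instance runs both sides: an Express route that receives messages via POST,
// and a helper that POSTs outgoing messages to the peer instance.
const express = require('express');
const http = require('http');

const app = express();
app.use(express.json());

// Incoming chat messages from the peer server.
app.post('/message', (req, res) => {
    console.log('Received:', req.body.text);
    res.sendStatus(200);
});

app.listen(3001); // placeholder: this instance's port

// Send a chat message to the peer server with a plain HTTP POST.
function sendToPeer(text) {
    const body = JSON.stringify({ text: text });
    const req = http.request({
        host: 'localhost',
        port: 3002, // placeholder: the other instance's port
        path: '/message',
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(body)
        }
    });
    req.end(body);
}

sendToPeer('hello from 3001');

Each Electron app would run this same pair, pointed at the other instance's port.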
This can be done with React.js; there are quite a few examples on GitHub.
Take a look at this examples:
https://github.com/ncuillery/react-chat-project
https://github.com/keithyong/chat-room
It's nice to see someone using Electron; I've just finished my first project with it, and I'm amazed.
As #Arcath has stated, you must use socket.io; it handles the communication between the frontend and the backend. Whenever someone sends a chat message, React.js handles that message and emits a socket message, which the server receives. The server then adds the message to the database.

Node.js HTTP server cannot handle proxy-enabled requests from an Amazon Load Balancer

I have a Node.js HTTP server which was working fine until I enabled proxy protocol on the Amazon Load Balancer (which is set to the TCP protocol) to get the client's IP.
I wonder how a Node.js TCP server handles this perfectly while an HTTP server cannot:
var net = require('net');
var proxy_protocol = require('node-proxy-protocol');
net.createServer(function(socket) {
    proxy_protocol.parse(socket, function(error, obj) {
        console.log(obj); // returns required client's info
    });
});
But why does the HTTP server fail to do the same if I replace "net" with "http"? This does not work:
var http = require('http');
var proxy_protocol = require('node-proxy-protocol');
http.createServer(function(req, res) {
    proxy_protocol.parse(req, function(error, obj) {
        console.log(obj); // returns nothing
    });
});
I know that HTTP (usually) operates over TCP, so it seems like it should work for both.
Basically, I think my HTTP server is not able to handle the proxied TCP data from the Load Balancer.
Please let me know where I am going wrong.
My Node.js HTTP server was not handling the proxy-protocol data prepended by the Load Balancer, so I switched the Load Balancer's protocol to HTTP, and now my HTTP server works well.
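The underlying reason the HTTP server fails is that, with proxy protocol enabled, the load balancer prepends a plain-text "PROXY ..." line to every connection, which Node's HTTP parser rejects before any request handler runs. If switching the listener to HTTP had not been an option, a rough sketch of a workaround (not production code, and the clientIp property below is just a made-up name) is to accept the raw connection yourself, strip the PROXY line, and then hand the socket to the HTTP server:

const net = require('net');
const http = require('http');

const httpServer = http.createServer((req, res) => {
    // clientIp was attached below from the PROXY header (hypothetical property name).
    res.end('client ip: ' + (req.socket.clientIp || req.socket.remoteAddress));
});

net.createServer((socket) => {
    socket.once('data', (first) => {
        let rest = first;
        if (first.toString('ascii', 0, 6) === 'PROXY ') {
            const eol = first.indexOf('\r\n');
            const fields = first.toString('ascii', 0, eol).split(' ');
            socket.clientIp = fields[2]; // source address field of the PROXY line
            rest = first.slice(eol + 2); // whatever follows is the real HTTP request
        }
        socket.unshift(rest); // push the HTTP bytes back onto the stream
        httpServer.emit('connection', socket); // let the HTTP parser take over the socket
    });
}).listen(8080);

This only illustrates the idea; real code would have to handle the PROXY line arriving split across packets, IPv6 and unknown entries, and error cases, which is what a parsing module like the node-proxy-protocol one above is for on the raw net server.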

Express4 + Socket.io 0.9: different clients get the same non-broadcast message

This is a web site using authentication via passport.js.
Two different users connect from different browsers and they request info about their username. The server gets the information and send them back using socket.io.
Everything works like a charm, but if the two clients load the page at the same time, the information for one of them goes to both browsers; it looks like the server is writing to the same socket.
Server Side:
server.js:
var express = require('express');
var app = express();
var http = require('http').createServer(app).listen(8000),
    io = require('socket.io').listen(http);
socket.js:
module.exports = function(app, io) {
    ...
    io.sockets.on('connection', function(socket) {
        ...
        // Build the information about the user and send it back
        var userData = userInfo();
        socket.emit('userInfo', userData);
        ...
    });
};
Client side (javascript file included in index.ejs):
var socket = io.connect('http://URL:8000');
...
socket.emit("all", {data}); // Hi, I need information about me.
...
socket.on('userInfo', function (data) {
    // do some stuff...
});
Server debug in console gets info about the two sockets:
debug - client authorized
info - handshake authorized Cq71N34XLyAJBTIbHCZQ
debug - setting request GET /socket.io/1/websocket/Cq71N34XLyAJBTIbHCZQ
debug - set heartbeat interval for client Cq71N34XLyAJBTIbHCZQ
debug - client authorized for
debug - websocket writing 1::
...
debug - client authorized
info - handshake authorized UF6lOwOFzgjrWY54HCZP
debug - setting request GET /socket.io/1/websocket/UF6lOwOFzgjrWY54HCZP
debug - set heartbeat interval for client UF6lOwOFzgjrWY54HCZP
debug - client authorized for
debug - websocket writing 1::
I've been rewriting different parts of the app, but I can't figure out why the server answers with the same info to different sockets.
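For what it's worth, a common cause of this symptom is that the user data is built from state shared between connections (a module-level variable or a cached "current user") rather than from each socket's own session. A hedged sketch of socket.js that keeps the lookup strictly per connection, where userInfoFor() is a hypothetical helper deriving the user only from the given socket's handshake/session:

module.exports = function(app, io) {
    io.sockets.on('connection', function(socket) {
        socket.on('all', function() {
            // Build the data inside the handler, from this socket only, so two clients
            // connecting at the same time can never share a closure or module-level variable.
            var userData = userInfoFor(socket); // hypothetical helper
            socket.emit('userInfo', userData);
        });
    });
};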
