Closing single node server connection closes them all - javascript

I rarely ask questions on Stack Overflow, but this one is beyond me. I guess I'm missing something basic, as I'm pretty new to Node servers.
Our application is pretty basic. The server is supposed to receive a handful of text lines (data), merge and parse them, and once the connection is closed (data sending is over) send the data to the API.
var net = require('net');
var fs = require('fs');
const axios = require('axios');

const server = new net.Server();
server.listen(PORT, IP); // PORT and IP are defined elsewhere

server.on("connection", client => {
    client.write("Hello\n");
    console.log('connected');

    let received = "";

    client.on("data", data => {
        received += data; // Buffer chunks are implicitly converted to string here
        console.log("Partial data is: " + data);
    });

    client.on("close", () => {
        fs.appendFile('log.txt', received, function (err) {});
        received = received.replace(/(?:\r\n|\r|\n)/g, "||");
        axios.post(APIADDRESS, {data: received});
        console.log('Full data is: ' + JSON.stringify({data: received}));
    });
});
To send the data I'm simply running netcat (nc ipaddress port); that's not a problem. It connects fine and the status message is received.
The thing is: once I open two or more connections from two DIFFERENT SSH servers, something weird happens. I can send line after line just fine, and the server reports the "Partial data" debug output for both of them without problems.
However, once I close one of the connections (Ctrl+C), they BOTH close.
In the end, only the data from the manually closed connection is received. The other one, from a separate nc on a separate SSH server, never reaches the client.on("close") handler, it seems. It's just terminated for no reason.
Any ideas? I don't even know where to start.
//EDIT
Just tested it from my PC and an SSH mobile app, using separate SSH servers. As soon as Ctrl+C is pressed on any device, the connection closes for all clients.
//Forgot to mention I'm running pm2 to keep the server up. Once I started the script by hand, ignoring pm2, it works fine. Weird. It is happening because of PM2.

I would guess that you have PuTTY configured to 'Share SSH connections if possible'. Per the PuTTY documentation, when doing so:
When this mode is in use, the first PuTTY that connected to a given server becomes the ‘upstream’, which means that it is the one managing the real SSH connection. All subsequent PuTTYs which reuse the connection are referred to as ‘downstreams’: they do not connect to the real server at all, but instead connect to the upstream PuTTY via local inter-process communication methods.
So, if you Ctrl+C the PuTTY session that is managing the actual shared connection, they both lose their connection.
You could presumably disable this connection-sharing feature at either the client or the server end, since both must have it enabled for sharing to occur.

To anyone coming here in the future.
If you are using pm2 with --watch enabled and the text log file is in the same folder as your main server script... that's the reason it drops all connections when a single client disconnects: pm2 detects that the log file has changed and restarts the whole server.
I'm not facepalming, that's not even funny.
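A minimal sketch of the fix, assuming a pm2 ecosystem file (the app name and paths here are illustrative; only ignore_watch matters):

// ecosystem.config.js
module.exports = {
    apps: [{
        name: 'tcp-server',
        script: './server.js',
        watch: true,
        // keep pm2 from restarting (and killing every open socket)
        // each time the server appends to its own log file
        ignore_watch: ['log.txt', 'node_modules']
    }]
};

Alternatively, move the log out of the watched folder, or pass --ignore-watch="log.txt" on the command line.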

Related

Try to connect to a server with Google Assistant app

I need to send data out from my Google Assistant app to a database. In order to do this, I've created a server that takes the data, packages it, and then sends it out. I have the hostname and port, and it works in a normal JavaScript/Node.js program, but when I use it in my Google Assistant app nothing happens. I tried figuring out the problem, and it looks like the code just isn't connecting. The code I'm using to send data to the server is as follows:
function sendData(app) {
    var net = require('net');
    var message = {"test": 200};
    var thisMessage = JSON.stringify(message);
    var client = new net.Socket();

    client.connect(<port>, '<hostname>', function() {
        app.tell(JSON.stringify(client.address()));
        console.log('Connected');
        client.write(thisMessage);
    });

    client.on('data', function(data) {
        console.log('Received: ' + data);
        client.destroy();
    });

    client.on('close', function() {
        console.log('Connection closed');
    });

    return 0;
}
(NOTE: Port and hostname left out for privacy purposes)
This completely skips over the app.tell call, leading me to believe the connection is never made. I know it works asynchronously with the server; however, I don't understand why it isn't connecting at all.
I have tried it both in the simulator and on my smartphone, with sandbox on and off. Is there a better way to connect? Note that the server I'm connecting to is Python-based.
The problem is likely that you're running it on Cloud Functions for Firebase, which limits outbound connections under the free "Spark" plan. With this plan, you can only connect to other Google services. This is usually a good way to start understanding how to handle Action requests, but it has limitations. To access endpoints outside of Google, you need to upgrade to either the "Flame" fixed-price plan or the "Blaze" pay-as-you-go plan.
You do not, however, need to run on Google's servers or use Node.js. All you need is a public HTTPS server with a valid SSL cert. If you are familiar with JSON, you can use any programming language to handle the request and response. If you are familiar with Node.js, you just need a Node.js server that can create Express request and response objects.
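As a minimal sketch of that last option, here is a self-hosted Express webhook (the route name and the response JSON shape are illustrative; the exact payload format depends on which Actions API version you target):

const express = require('express');
const app = express();
app.use(express.json());

app.post('/webhook', (req, res) => {
    // inspect the incoming Assistant request...
    console.log('intent payload:', req.body);
    // ...and answer with your own JSON response
    res.json({ speech: 'Hello from my own server', displayText: 'Hello!' });
});

app.listen(8080);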

Long response time on socket polling with Heroku

My client connects to the socket server using the socket.io 1.0+ library as:
$scope.socket = io.connect( "/gateway" );
On server side I launch express server and socket server attached as:
httpServer = http.createServer(app).listen(process.env.PORT, process.env.IP || "0.0.0.0", function() {
    io = require('socket.io')(httpServer).of("/gateway");
    io.on('connection', function(socket) {
        // socket events here
    });
});
Then the project is tested on Heroku. What bothers me a lot is this screen from the Chrome dev tools:
You can see there that two polling requests are constantly being performed. One gets a response in a couple of milliseconds; the other takes somewhere around 26 seconds. If I click on one of them, I can see that the real difference between them is the request method: the one that uses POST gets a quick response, while the one that uses GET remains in a pending state until it gets a response (or a timeout) after ~26 seconds.
In my development environment (c9.io) I do not see this behaviour, but in testing (a Heroku free node) I do.
Probably because of this I get some other weird behaviour only on Heroku; for example, on tab close I do not receive a disconnect event, while on c9 I do.
Has anybody faced the same problem? Is there a fix?
As it turns out, the long response was a normal open connection being mistaken for a slow request: when the polling transport is used, a connection is held open for about 25 seconds for client-server communication. This is normal, although you have to know about it; some logging modules/solutions may treat it as a long response and constantly warn you.
In short:
if the websocket transport is used, there will be one websocket connection open all the time;
if the polling transport is used, then every 25-26 seconds a new GET connection will be (re)established.
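A sketch of a possible workaround, if the long-poll noise bothers you or your logging: force the websocket transport on the client (this assumes websocket support end-to-end, which Heroku provides):

$scope.socket = io.connect("/gateway", { transports: ['websocket'] });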

Node.js Remote Start and Communication between Servers

I am new to Node.js and also pretty new to server communication.
I have tried to find previous answers, but they are usually concerned with communication between a server and a client.
I have a different case, so I need your considerate help.
Let's assume a scenario where we have three systems: localhost (i.e., a laptop) and two cloud servers. I want to write a JS app on localhost that slices an array of data into two blocks and sends them to the cloud servers (block #1 to server #1 and block #2 to server #2). On receiving them, the two remote servers start to work at the same time. Then they do the same computation and send their calculation results to each other whenever they have updated values.
In this scenario, I want to tackle the bolded sentences. I believe the "socket.io" module would be a proper approach to handle this (especially the remote start and communication), but I do not have any clear idea of how to design the code. In addition, understanding "socket.io" itself is a bit tricky. If you need further specification of the scenario, please comment.
Along with socket.io, check out a module named Faye (http://faye.jcoglan.com/node.html). I have been using it for a couple of years and really like it. Faye is a publish/subscribe communication scheme which would allow you to extend your scenario to as many clients as you need. To install Faye on your system, run the following command:
npm install -g faye
Now, here is your server code:
var faye = require('faye');
var Server = new faye.NodeAdapter({mount: '/FayeServer', timeout: 120});

// now fire the server up on port 5555
Server.listen(5555);

// subscribe to the channel /DataChannel
var Subscription = Server.getClient().subscribe('/DataChannel', function(dataObject) {
    console.log(dataObject);
});

Subscription.then(function() {
    console.log('Subscription is now active');
    // send a message with two numbers to any client listening on /DataChannel
    Server.getClient().publish('/DataChannel', {A: 5, B: 12});
});
Now, here is the client code:
var faye = require('faye');

// open a client connection to the server
var Client = new faye.Client('http://127.0.0.1:5555/FayeServer');

// now subscribe to the channel /DataChannel
Client.subscribe('/DataChannel', function(dataObject) {
    Client.publish('/DataChannel', {C: dataObject.A * dataObject.B});
});
There is a lot more that can be done, but with this basic framework you can stand up a server and N client programs that respond to messages from the server.
You will need to replace 127.0.0.1 with your specific URL and use port numbers and channel names applicable to your specific application.

Proper way to monitor/control a server remotely over http in realtime

On my client (a phone with a browser) I want to see stats for the server's CPU, RAM & HDD and gather info from various logs.
I'm using ajax polling.
On the client, every 5 sec (setInterval), I call a PHP file that:
scans a folder containing N logs
reads the last line of each log
converts that to JSON
Problems:
A new connection is opened every 5 sec.
Multiple AJAX calls.
Request headers (they are also data and so consume bandwidth).
Response headers (same).
PHP reads the files every 5 sec, even if nothing has changed.
The final JSON data is less than 5 KB, but I send it every 5 sec, and there are the headers and a new connection every time, so basically every 5 sec I have to send 5-10 KB to get 5 KB, which makes 10-20 KB.
That is 60 sec / 5 sec = 12 new connections per minute and about 15 MB per hour of traffic if I leave the app open.
Let's say I have 100 users that I let monitor/control my server; that would be around 1.5 GB of outgoing traffic in one hour.
Not to mention that the PHP server is reading multiple files 100 times every 5 sec.
I need something on the server that reads the last lines of those logs every 5 sec and maybe writes them to a file, and then I want to push this data to the client only if it has changed.
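(A sketch of the "only if changed" part in Node terms, for later reference: fs.watchFile stats a file on a timer and fires the callback only when its stats change; the path here is illustrative.)

var fs = require('fs');
// poll the file's stats every 5 seconds; act only when the content changed
fs.watchFile('/var/log/mylog.log', { interval: 5000 }, function (curr, prev) {
    if (curr.mtime > prev.mtime) {
        // the file changed since the last poll: read the new tail and push it
    }
});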
SSE (server sent events) with PHP
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
    echo "id: " . time() . "\ndata: " . ReadTheLogs() . "\n\n";
    ob_flush();
    flush();
    sleep(1);
}
In this case, after the connection is established with the first user,
the connection stays open (PHP is not made for that), and so I save some space (request headers, response headers). This works on my server, but most servers don't allow keeping a connection open for a long time.
Also, with multiple users I read the log multiple times (slowing down my old server).
And I can't control the server... I would need to use AJAX to send commands...
I need WebSockets!!!
node.js and websockets
Using node.js, from what I understand, I can do all this without consuming a lot
of resources and bandwidth. The connection stays open, so there are no unnecessary headers; I can receive and send data, and it handles multiple users very well.
And this is where I need your help.
The node.js server should update and store the log data in the background every 5 sec, if the files have been modified. OR should the operating system do that (iwatch, dnotify...)?
The data should be pushed only if it has changed.
The logs should be read only once every 5 sec, not triggered separately by each user.
This is the first example I have found... and modified:
var ws = require("nodejs-websocket");

var server = ws.createServer(function(conn) {
    var data = read(whereToStoreTheLogs);
    conn.sendText(data); // send the logs data to the user
                         // on first connection.
    setTimeout(checkLogs, 5000);
    /*
    Here I need to continuously check if the logs have changed,
    but if I use setInterval(checkLogs, 5000) or setTimeout,
    every user invokes a new timer, and so the server ends up
    with lots of timers.
    Can I do that in the background?
    */
    conn.on("text", function(str) {
        doStuff(str); // various commands to control the server.
    });
    conn.on("close", function(code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

var checkLogs = function() {
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        conn.sendText(data); // note: conn is not in scope here
    }
    setTimeout(checkLogs, 5000);
};
The above script would be the notification server, but I also need to find a solution for storing the info from those multiple logs somewhere, and doing that every time something changes, in the background.
How would you keep both the bandwidth and the server resources low?
How would you do it?
EDIT
Btw, is there a way to stream this data simultaneously to all the clients?
EDIT
About the logs: I also want to be able to scale the time between updates... I mean, if I read the logs of ffmpeg I need an update every second if possible... but when no conversion is active I only need the basic machine info every 5 min, maybe... and so on...
GOALS:
1. A performant way to read & store the log data somewhere (only if clients are connected... [MySQL, a file; is it possible to store this info in RAM (with node.js??)]).
2. A performant way to stream the data to the various clients (simultaneously).
3. Being able to send commands to the server (bidirectional).
4. Using web languages (JS, PHP...) and Linux commands (something that is easy to implement on multiple machines), free software if needed.
The best approach would be:
read the logs, based on current activity, into system memory and stream them simultaneously and continuously, over an already open connection, to the various clients with WebSockets.
I don't know anything that could be faster.
UPDATE
The node.js server is up and running, using the http://einaros.github.io/ws/ webSocketServer implementation, as it appears to be the fastest one.
I wrote, with the help of @HeadCode, the following code to handle the client situation properly & to keep the load as low as possible, checking various things inside the broadcast loop. The pushing & the client handling are now at a good point.
var
    wss = new (require('ws').Server)({port: 8080}),
    isBusy,
    logs,
    clients,
    i,
    checkLogs = function() {
        if (wss.clients && (clients = wss.clients.length)) {
            isBusy || (logs = readLogs() /*, isBusy = true */);
            if (logs) {
                i = 0;
                while (i < clients) {
                    wss.clients[i++].send(logs);
                }
            }
        }
    };

setInterval(checkLogs, 2000);
But at the moment I'm using a really bad way to parse the logs (nodejs -> httpRequest -> php)... lol. After some googling I found out that I could totally stream the output of Linux software directly to the nodejs app... I haven't checked, but maybe that would be the best way to do it. node.js also has a filesystem API where I could read the logs, and Linux has its own filesystem API.
The readLogs() function (it can be async) is still something I'm not happy with. The options are:
the nodejs filesystem API?
streaming Linux software output into nodejs (see the sketch below)
the Linux filesystem API
Keep in mind that I need to scan various folders for logs and then somehow parse the outputted data, and this every 2 seconds.
PS: I added isBusy to the server variables in case the log-reading system is async.
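A sketch of the "streaming Linux software output into nodejs" option (the log path is illustrative; wss is the server from the snippet above, and one spawned tail feeds every connected client, with no per-client timers):

var spawn = require('child_process').spawn;
var tail = spawn('tail', ['-f', '/var/log/mylog.log']);
tail.stdout.on('data', function (chunk) {
    // broadcast each batch of new log lines to every connected client
    wss.clients.forEach(function (client) {
        client.send(chunk.toString());
    });
});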
EDIT
The answer is not complete.
Missing:
A performant way to read, parse and store the logs somewhere (the Linux filesystem API, or the nodejs API, so that I store directly into system memory).
An explanation of whether it's possible to stream data directly to multiple users, or whether nodejs loops through the clients and so (I think) sends the data multiple times.
Btw, is it possible/worth it to close the node server if there are no clients and restart it on new connections from the Apache side? (E.g., if I connect to the Apache-hosted HTML file, a script launches the nodejs server again.) Doing so would further reduce any memory leaking, right?
EDIT
After some experimenting with websockets (some videos are in the comments) I learned some new stuff. The Raspberry Pi can use some CPU DMA channels to do high-frequency stuff like PWM... I need to understand somehow how that works.
When using sensors and the like, should I store everything inside RAM? Does nodejs already do that (in a variable inside the script)?
websocket remains the best choice, as it's now basically easily accessible from any device, simply using a browser.
I haven't used nodejs-websocket, but it looks like it will accept an http connection and do the upgrade as well as creating the server. If all you care about receiving is text/json then I suppose that would be fine, but it seems to me you might want to serve a web page along with it.
Here is a way to use express and socket.io to achieve what you're asking about:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.static(__dirname + '/'));

app.get('/', function(req, res) {
    res.sendFile(__dirname + '/index.html');
});

io.on('connection', function(socket) {
    // This is where we should emit the cached values of everything
    // that has been collected so far so this user doesn't have to
    // wait for a changed value on the monitored host to see
    // what is going on.
    // This code is based on something I wrote for myself so it's not
    // going to do anything for you as is. You'll have to implement
    // your own caching mechanism.
    for (var stat in cache) {
        if (cache.hasOwnProperty(stat)) {
            socket.emit('data', JSON.stringify(cache[stat]));
        }
    }
});

http.listen(3000, function() {
    console.log('listening on *:3000');
});

(function checkLogs() {
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        oldData = data;
        io.emit('data', data); // event name added so the client handler fires
    }
    setTimeout(checkLogs, 5000);
})();
Of course, the checkLogs function has to be fleshed out by you; I have only cut and pasted it in here for context. The call to the emit function of the io object will send the message out to all connected users, but the checkLogs function will only fire once (and then keep calling itself), not every time someone connects.
In your index.html page you can have something like this. It should be included in the html page at the bottom, just before the closing body tag.
<script src="/path/to/socket.io.js"></script>
<script>
    // Set up the websocket for receiving updates from the server
    var socket = io();
    socket.on('data', function(msg) {
        // Do something with your message here, such as using javascript
        // to display it in an appropriate spot on the page.
        document.getElementById("content").innerHTML = msg;
    });
</script>
By the way, check out the Nodejs documentation for a variety of built-in methods for checking system resources (https://nodejs.org/api/os.html).
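For instance, a quick sketch of a few of those built-ins:

var os = require('os');
console.log('load averages:', os.loadavg()); // 1, 5 and 15 minute load
console.log('free memory:', os.freemem());   // in bytes
console.log('total memory:', os.totalmem()); // in bytes
console.log('uptime:', os.uptime());         // in seconds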
Here's also a solution more in keeping with what it appears you want. Use this for your html page:
<!DOCTYPE HTML>
<html>
<head>
    <meta charset="utf-8">
    <title>WS example</title>
</head>
<body>
    <script>
        var connection;
        window.addEventListener("load", function () {
            connection = new WebSocket("ws://" + window.location.hostname + ":8001");
            connection.onopen = function () {
                console.log("Connection opened");
            };
            connection.onclose = function () {
                console.log("Connection closed");
            };
            connection.onerror = function () {
                console.error("Connection error");
            };
            connection.onmessage = function (event) {
                var div = document.createElement("div");
                div.textContent = event.data;
                document.body.appendChild(div);
            };
        });
    </script>
</body>
</html>
And use this as your web socket server code, recently tweaked to use the 'tail' module (as found in this post: How to do `tail -f logfile.txt`-like processing in node.js?), which you will have to install using npm. (Note: tail makes use of fs.watch, which is not guaranteed to work the same everywhere.)
var ws = require("nodejs-websocket");
var os = require('os');
var Tail = require('tail').Tail;

var tail = new Tail('./testlog.txt');

var server = ws.createServer(function (conn) {
    conn.on("text", function (str) {
        console.log("Received " + str);
    });
    conn.on("close", function (code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

setInterval(function () { checkLoad(); }, 5000);

function broadcast(mesg) {
    server.connections.forEach(function (conn) {
        conn.sendText(mesg);
    });
}

var load = '';
function checkLoad() {
    var new_load = os.loadavg().toString();
    if (new_load === load) {
        // nothing changed since the last check, so don't broadcast
        return;
    }
    load = new_load;
    broadcast(load);
}

tail.on("line", function (data) {
    broadcast(data);
});
Obviously this is very basic and you will have to change it for your needs.
I made a similar implementation recently using Munin. Munin is a wonderful server monitoring tool, open source too, which also provides a REST API. There are several plugins available for your needs: monitoring CPU, HDD and RAM usage of your server.
You need to build a push notification server. All clients who are listening will then get a push notification when new data is available. See this answer for more information: PHP - Push Notifications.
As for how you would update the data, I'd suggest using OS-based tools to trigger a PHP script (command line) that will generate and "push" the JSON file out to any client currently listening. Any new client logging on to "listen" will be served the current JSON available, until it's updated.
This way you're not subject to 100 users with 100 connections and however much bandwidth polling your server every 5 seconds; they only get updates when there's actually an update to know about.
How about a service that reads all the log info (via IPMI, Nagios or whatever) and creates the output files on some schedule? Then anyone who wants the data can just read this output rather than hammering the server logs. Essentially: one hit on the server logs, then everyone else just reads a web page.
This could be implemented pretty easily.
BTW: Nagios has a very nice free edition.
Answering just these bits of your question:
a performant way to stream the data to the various clients (simultaneously);
being able to send commands to the server (bidirectional);
using web languages (JS, PHP...) and Linux commands (something that is easy to implement on multiple machines), free software if needed.
I'll recommend the Bayeux protocol as made simple by the CometD project. There are implementations in a variety of languages and it's really easy to use in its simplest form.
Meteor is broadly similar. It's an application development framework rather than a family of libraries, but it solves the same problems.
Some suggestions:
Munin for charts
NetSNMP (used by Munin, but you can also use Bash and Cron to build traps that send SMS texts on alerts)
Pingdom for remote alerts about how well the server is responding to ping and HTTP checks. It can SMS text you or call a phone, as well as have call escalation rules.

Error while getting messages from sockets in JavaScript

Hi, I am having trouble creating socket communication from JavaScript code.
I always get an error when sending a message or closing the socket from the server.
My socket server code:
// Start listening for connections.
while (true)
{
    Console.WriteLine("Waiting for a connection...");
    // Program is suspended while waiting for an incoming connection.
    Socket handler = listener.Accept();
    data = null;

    // An incoming connection needs to be processed.
    while (true)
    {
        int bytesRec = handler.Receive(bytes);
        data += Encoding.ASCII.GetString(bytes, 0, bytesRec);
        break;
    }

    // Show the data on the console.
    Console.WriteLine("Text received : {0}", data);

    // Echo the data back to the client.
    byte[] msg = Encoding.ASCII.GetBytes(data);
    handler.Send(msg);
    handler.Shutdown(SocketShutdown.Both);
    handler.Close();
}
JavaScript code:
var connection = new WebSocket('ws://Myip:11000', ['soap', 'xmpp']);

// When the connection is open, send some data to the server
connection.onopen = function () {
    connection.send('Ping'); // Send the message 'Ping' to the server
    connection.send('your message');
};

// Log errors
connection.onerror = function (error) {
    console.log('WebSocket Error ' + error);
};

connection.onclose = function (msg) {
    console.log('WebSocket closed ' + msg);
};
It gets connected to the server socket, but always gets an error when closing or sending a message from the server.
If this is really your actual code:
handler.Send(msg);
handler.Shutdown(SocketShutdown.Both);
handler.Close();
…then it's pretty broken. First, you can't assume that Socket.Send() actually sends all the bytes you asked it to. You have to check the return value, and keep sending until you've actually sent all the data.
Second, the initiation of a graceful closure should use SocketShutdown.Send, not SocketShutdown.Both. Specifying "Both" means (among other things) that you're not going to wait for the other end to negotiate the graceful closure; you're just done and won't even receive any more data, in addition to being done sending.
And of course, the code is calling Close() before the other end has in fact acknowledged the graceful closure (by itself sending any remaining data it wanted to send and then shutting down with "Both").
Is all this the reason for your problem? I can't say for sure, since I have no way to test your actual code. But it's certainly a reasonable guess. If you tear down the connection without waiting after you try to send something, there's not any guarantee that the data will ever leave your machine, and in any case the other end could easily see the connection reset before it gets a chance to process any data that was sent to it.
There aren't a huge number of rules when it comes to socket programming, but what rules exist are there for a reason and are generally really important to follow. You should make sure your code is following all the rules.
(The rest of the code is also different from what I'd consider the right way to do things, but those problems aren't as fatal as the over-eager connection destruction is.)
I am afraid WebSocket does not work that way.
When the JavaScript code connects to the server, it sends an HTTP request as ASCII text. That request includes the HTTP header Sec-WebSocket-Protocol: soap, xmpp, since you are requiring those protocols in your WebSocket creation.
Since your server code does not reply with an appropriate HTTP response accepting the websocket connection, the connection fails. When you try to send data back, the client does not recognize it as an HTTP response and an error is thrown.
A websocket is not a regular socket connection; it won't work that way. It requires an HTTP negotiation, and there is a scheme for data framing. I recommend you go through this article that explains very well how it works: http://chimera.labs.oreilly.com/books/1230000000545/ch17.html
If you are interested in learning how to develop a server, take a look at this tutorial on MDN: https://developer.mozilla.org/en-US/docs/WebSockets/Writing_WebSocket_server I also have an open source WebSocket server in C# you can take a look at if you like.
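For completeness, a minimal sketch of a real websocket endpoint in Node using the 'ws' package, which handles the HTTP upgrade and framing for you. The port matches the question, and the echo mirrors what the C# code intended; you would also drop the ['soap', 'xmpp'] subprotocol list from the client, or negotiate it explicitly on the server.

var WebSocket = require('ws');
var wss = new WebSocket.Server({ port: 11000 });

wss.on('connection', function (socket) {
    socket.on('message', function (msg) {
        console.log('Text received: ' + msg);
        socket.send(msg); // echo the data back to the client
    });
});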
