Client (JRE) read server (node) variables directly? - javascript

I am trying to set up a server where clients can connect and essentially "raise their hand", which lights up for every client, but only one at a time. I currently just use the express module to send a POST response on button-click. The server takes it as JSON and writes it to a file. All the clients are constantly requesting this file to check the status to see if the channel is clear.
I suspect there is a more streamlined approach for this, but I do not know what modules or methods might be best. Can the server push variables to the clients in some way, instead of the clients constantly requesting a file? Then the client script can receive the variable and change the page elements accordingly?

Usually, this kind of task is done using WebSockets. With socket.io running alongside your Express server, you can reuse that connection to push updates instead of having the clients poll a file.
From the server, start emitting messages whenever the state changes:
socket.emit("hand", { userId: <string> });
From the client, listen to the new event and invoke whatever the appropriate behavior is:
socket.on("hand", (payload) => {
    // payload.userId contains user ID
});
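A minimal end-to-end sketch of that idea (hypothetical event names and state; it assumes socket.io is installed alongside Express and that only one hand can be up at a time):
// server.js -- a sketch, not the asker's actual setup
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

var handRaisedBy = null; // null means the channel is clear

io.on('connection', function (socket) {
    // Tell the newly connected client the current state immediately.
    socket.emit('hand', { userId: handRaisedBy });

    socket.on('raise-hand', function (data) {
        if (handRaisedBy === null) {                  // only one hand at a time
            handRaisedBy = data.userId;
            io.emit('hand', { userId: handRaisedBy }); // push to every client
        }
    });

    socket.on('lower-hand', function () {
        handRaisedBy = null;
        io.emit('hand', { userId: null });
    });
});

http.listen(3000);
Every connected browser receives the 'hand' event at the same moment, so there is no status file to poll.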

Related

Is it okay to create a net.Socket() 'data' event handler once for each Node.js http request/response?

I've got a Node.js web server communicating with a locally running Python TCP socket server (communicating via their respective socket modules net.Socket, socket).
Clients make HTTP post requests from the browser which get handled by a Node http.createServer function, with some of them sent to the Python server for heavy computations, the results of which are then sent back to Node and back to the browser for rendering.
The Python server is necessary instead of a Node child process as there are some large (immutable) objects required for the Python computations that take a while to initialise and are then shared across threads. It would be infeasible to create and destroy these objects for every browser request.
So my question is, using the Node callback paradigm, how do I capture the response object for each POST request in the net.Socket data event handler/s?
Note 0: Each request has a unique id that is sent to the Python server and returned.
This currently works* inside my http.createServer callback:
http.createServer((request, response) => {
    // route and parse incoming requests etc.

    // send POST data to Python
    python_socket.write(parsed_request_post_data);

    // Python works away diligently then emits a data event handled below
    python_socket.once('data', (data_from_python) => {
        // error and exception handling
        response.setHeader('Content-Type', 'application/json');
        response.end(data_from_python);
    });
}).listen(HTTPport);
*However, if I hammer the server with multiple requests, I sometimes get the same data returned in every response (even though Python handles each request separately). I worry that I am attaching multiple once('data') callbacks in the same event-loop turn, that only one of them persists, and that it is the one repeatedly sending the same Python data back to the browser. Though if that were the case, the response object would also be shared and I would get an error for trying to end an already-closed response, right? Yet each response seems to end fine.
Apologies for the rather long and vague question. I'm still learning and would really appreciate any advice or references I can study to help me understand what is going on. Also very open to trying different approaches (except changing web server - see note 2 below).
Note 1: I tried declaring a global data handler (note the on instead of once) for the net.Socket as follows, but couldn't figure out how to forward the returned data to the matching http response:
python_socket.on('data', (data_from_python) => {
    // error and exception handling
    // how do I get data_from_python out to each http response
    // then close it in a non-blocking way?
});
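For reference, one way to use the unique id from Note 0 (a sketch only, with hypothetical names: it assumes the Python server echoes the id back in each reply and that replies arrive as complete, newline-delimited JSON messages) is a single global 'data' handler plus a map of pending responses keyed by id:
var pending = {}; // request id -> http response still waiting for Python

python_socket.on('data', (data_from_python) => {
    // Assumes one complete JSON message per line; real code would also
    // have to buffer partial chunks split across 'data' events.
    data_from_python.toString().split('\n').forEach((line) => {
        if (!line) return;
        const msg = JSON.parse(line);
        const response = pending[msg.id];
        if (response) {
            delete pending[msg.id];
            response.setHeader('Content-Type', 'application/json');
            response.end(line);
        }
    });
});

http.createServer((request, response) => {
    // ...parse the POST data and generate the unique id as before...
    pending[unique_id] = response; // remember which response this id belongs to
    python_socket.write(parsed_request_post_data);
}).listen(HTTPport);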
Note 2: I'm not allowed to use a Python web server as the business wants to reuse this design in future to plug and play other services (R, Julia, C++, ...) into Node web servers.

Send message to socketcluster channel from outside worker

There is an external resource I am watching and whenever the data changes I would like to send a message to a specific channel depending on the data.
I found two ways to do this by bonking my head repeatedly against the SC docs:
by watching from the workers, which is not ideal because when there are 1000 clients online the external data source gets hammered with requests from each worker
from a client, which is not ideal because I have to subscribe to the channel, send my message, then unsubscribe
How can I do this from within server.js?
A good example I have of sending data when an external resource changes is to have that external resource (like a website) call into your SocketCluster server through the standard on/emit mechanism:
socket.on('documents/upload-file', function (data, respond) {
    documents.uploadFile(data, respond, socket);
});
I use documents.uploadFile to upload a file as a base64 string.
I then save the file and store the document record in the database.
The code then calls a function documents.getFileList which pulls all the document info from the db,
and then that data is published over a channel:
socket.exchange.publish('channels/documentList', updatedDocumentList);
The whole point of a worker is to handle external incoming events. Without knowing what your external resource is, I can't say specifically whether this is the best approach for you, but hopefully it's a good place to start.
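If the watching has to happen inside your SocketCluster code rather than in the external resource itself, a rough sketch (assumptions: this lives in the worker file, pollExternalResource() is a hypothetical helper, and only worker 0 polls so the resource is hit once instead of once per worker) could be:
// worker.js (sketch)
module.exports.run = function (worker) {
    var scServer = worker.scServer;

    // Let a single worker do the polling so 1000 connected clients
    // don't translate into 1000 polls of the external resource.
    if (worker.id === 0) {
        var lastValue = null;
        setInterval(function () {
            pollExternalResource(function (err, channelName, value) {
                if (err || value === lastValue) return;
                lastValue = value;
                // exchange.publish reaches every subscriber of the channel,
                // no matter which worker they are connected through.
                scServer.exchange.publish(channelName, value);
            });
        }, 5000);
    }
};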

Proper way to monitor/control a server remotely over http in realtime

On my client (a phone with a browser) I want to see the server's CPU, RAM & HDD stats and gather info from various logs.
I'm using ajax polling.
On the client, every 5 sec (setInterval) I call a PHP file that:
scans a folder containing N logs
reads the last line of each log
converts that to JSON
Problems:
Open new connection every 5 sec.
Multiple AJAX calls.
Request headers (they are also data and so consume bandwidth)
Response headers (^)
Use PHP to read files every 5 sec. even if nothing changed.
The final JSON data is less than 5 KB, but I send it every 5 sec, and there are headers and a new connection every time, so basically every 5 sec. I have to send 5-10 KB of overhead to get 5 KB of data, which makes 10-20 KB in total.
That's 60 sec / 5 sec = 12 new connections per minute and about 15 MB per hour of traffic if I leave the app open.
Let's say I have 100 users that I let monitor/control my server; that would be around 1.5 GB of outgoing traffic in one hour.
Not to mention that the PHP server is reading multiple files 100 times every 5 sec.
I need something on the server that reads the last lines of those logs every 5 sec and maybe writes them to a file; then I want to push that data to the clients only if it has changed.
SSE (server sent events) with PHP
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
while (true) {
    echo "id: " . time() . "\ndata: " . ReadTheLogs() . "\n\n";
    ob_flush();
    flush();
    sleep(1);
}
In this case, once the connection is established with a user it stays open (PHP is not really made for that), so I save some overhead (request headers, response headers). This works on my server, but most servers don't allow keeping a connection open for a long time.
Also, with multiple users I read the logs multiple times (slowing down my old server).
And I can't control the server... I would need to use ajax to send a command...
I need WebSockets!!!
node.js and websockets
Using node.js, from what I understand, I can do all this without consuming a lot of resources and bandwidth. The connection stays open, so there are no unnecessary headers, I can receive and send data, and it handles multiple users very well.
And this is where I need your help.
The node.js server should, in the background, update and store the log data every 5 sec if the files are modified. Or should the operating system do that (iwatch, dnotify...)?
The data should be pushed only if it changed.
The reading of the logs should happen only once every 5 sec, not be triggered by each user.
This is the first example I found... and modified:
var ws = require("nodejs-websocket");
var server = ws.createServer(function (conn) {
    var data = read(whereToStoreTheLogs);
    conn.sendText(data); // send the logs data to the user
                         // on first connection.
    setTimeout(checkLogs, 5000);
    /*
    Here I need to continuously check if the logs have changed,
    but if I use setInterval(checkLogs, 5000) or setTimeout
    every user invokes a new timer, so there are lots of timers on the server.
    Can I do that in the background?
    */
    conn.on("text", function (str) {
        doStuff(str); // various commands to control the server.
    });
    conn.on("close", function (code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

var checkLogs = function () {
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        conn.sendText(data);
    }
    setTimeout(checkLogs, 5000);
};
The above script would be the notification server, but I also need to find a solution to store the info from those multiple logs somewhere, and to do that every time something changes, in the background.
How would you keep both the bandwidth and the server resource usage low?
How would you do it?
EDIT
Btw, is there a way to stream this data simultaneously to all the clients?
EDIT
About the logs: I also want to be able to scale the interval between updates... I mean, if I read the logs of ffmpeg I need the update every second if possible... but when no conversion is active I only need the basic machine info every 5 min maybe... and so on...
GOALS:
1. A performant way to read & store the log data somewhere (only if clients are connected... [mysql, a file, or maybe in RAM (with node.js??)]).
2. A performant way to stream the data to the various clients (simultaneously).
3. Be able to send commands to the server (bidirectional).
4. Use web languages (js, php...) and linux commands (something that is easy to implement on multiple machines)... free software if needed.
The best approach would be:
read the logs, based on current activity, into system memory and stream them simultaneously and continuously, over an already open connection, to the various clients with WebSockets.
I don't know of anything that could be faster.
UPDATE
The node.js server is up and running, using the http://einaros.github.io/ws/ webSocketServer implementation, as it appears to be the fastest one.
With the help of #HeadCode I wrote the following code to handle the client situation properly and to keep the overhead as low as possible, checking various things inside the broadcast loop. Now the pushing & the client handling are at a good point.
var
    wss = new (require('ws').Server)({ port: 8080 }),
    isBusy,
    logs,
    clients,
    i,
    checkLogs = function () {
        if (wss.clients && (clients = wss.clients.length)) {
            isBusy || (logs = readLogs() /*, isBusy = true */);
            if (logs) {
                i = 0;
                while (i < clients) {
                    wss.clients[i++].send(logs);
                }
            }
        }
    };
setInterval(checkLogs, 2000);
But atm I'm using a really bad way to parse the logs (nodejs -> httpRequest -> php)... lol. After some googling I found out that I could stream the output of linux software directly to the nodejs app... I haven't checked yet, but maybe that would be the best way to do it. node.js also has a filesystem api where I could read the logs. linux has its own filesystem api.
The readLogs() function (it can be async) is still something I'm not happy with.
nodejs filesystem?
linuxSoftware->nodejs output implementation
linux filesystem api.
Keep in mind that I need to scan various folders for logs and then somehow parse the outputted data, and this every 2 seconds.
ps.: I added isBusy to the server variables in case the log-reading system is async.
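To illustrate the "linuxSoftware -> nodejs" option above, a minimal sketch (hypothetical log path; it assumes the tail command exists on the machine and uses node's built-in child_process module):
var spawn = require('child_process').spawn;

// Follow a log file by streaming the output of `tail -F` straight into node.
// New lines arrive as soon as the file grows, so nothing polls per client.
var tailProc = spawn('tail', ['-F', '/var/log/mylog.log']); // hypothetical path

tailProc.stdout.on('data', function (chunk) {
    var newLines = chunk.toString();
    // Parse the new lines here and hand them to the broadcast loop,
    // e.g. cache them in the variable that checkLogs() sends out.
});

tailProc.stderr.on('data', function (err) {
    console.error('tail error: ' + err);
});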
EDIT
The answer is not complete.
Missing:
A performant way to read, parse and store the logs somewhere (linux filesystem api, or nodejs api, so that I store directly into system memory).
An explanation of whether it's possible to stream data directly to multiple users.
Apparently nodejs loops through the clients and so (I think) sends the data multiple times.
Btw, is it possible/worth it to close the node server when there are no clients and restart it on new connections from the apache side (e.g. if I connect to the apache-hosted html file, a script launches the nodejs server again)? Doing so would further reduce memory leaks, right?
EDIT
After some experimenting with websockets (some videos are in the comments) I learned some new stuff. The Raspberry Pi can use some CPU DMA channels to do high-frequency stuff like PWM... I need to somehow understand how that works.
When using sensors and the like, I should store everything inside RAM; nodejs already does that?? (in a variable inside the script)
WebSockets remain the best choice, as they are now easily accessible from basically any device, simply by using a browser.
I haven't used nodejs-websocket, but it looks like it will accept an http connection and do the upgrade as well as creating the server. If all you care about receiving is text/json then I suppose that would be fine, but it seems to me you might want to serve a web page along with it.
Here is a way to use express and socket.io to achieve what you're asking about:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.static(__dirname + '/'));

app.get('/', function(req, res){
    res.sendfile('index.html');
});

io.on('connection', function(socket){
    // This is where we should emit the cached values of everything
    // that has been collected so far so this user doesn't have to
    // wait for a changed value on the monitored host to see
    // what is going on.
    // This code is based on something I wrote for myself so it's not
    // going to do anything for you as is. You'll have to implement
    // your own caching mechanism.
    for (var stat in cache) {
        if (cache.hasOwnProperty(stat)) {
            socket.emit('data', JSON.stringify(cache[stat]));
        }
    }
});

http.listen(3000, function(){
    console.log('listening on *:3000');
});
(function checkLogs(){
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        oldData = data;
        io.emit('data', data); // event name matches the client's 'data' listener
    }
    setTimeout(checkLogs, 5000);
})();
Of course, the checkLogs function has to be fleshed out by you. I have only cut and pasted it in here for context. The call to the emit function of the io object will send the message out to all connected users but the checkLogs function will only fire once (and then keep calling itself), not every time someone connects.
In your index.html page you can have something like this. It should be included in the html page at the bottom, just before the closing body tag.
<script src="/path/to/socket.io.js"></script>
<script>
    // Set up the websocket for receiving updates from the server
    var socket = io();
    socket.on('data', function(msg){
        // Do something with your message here, such as using javascript
        // to display it in an appropriate spot on the page.
        document.getElementById("content").innerHTML = msg;
    });
</script>
By the way, check out the Nodejs documentation for a variety of built-in methods for checking system resources (https://nodejs.org/api/os.html).
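For example, a small sketch using that built-in os module (which stats you collect, and how you send them out, is up to you):
var os = require('os');

// A few of the built-in system stats; these are all synchronous calls.
var stats = {
    loadavg:  os.loadavg(),   // 1, 5 and 15 minute load averages
    freemem:  os.freemem(),   // free RAM in bytes
    totalmem: os.totalmem(),  // total RAM in bytes
    uptime:   os.uptime()     // system uptime in seconds
};

// e.g. io.emit('data', JSON.stringify(stats));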
Here's also a solution more in keeping with what it appears you want. Use this for your html page:
<!DOCTYPE HTML>
<html>
<head>
    <meta charset="utf-8">
    <title>WS example</title>
</head>
<body>
    <script>
        var connection;
        window.addEventListener("load", function () {
            connection = new WebSocket("ws://" + window.location.hostname + ":8001")
            connection.onopen = function () {
                console.log("Connection opened")
            }
            connection.onclose = function () {
                console.log("Connection closed")
            }
            connection.onerror = function () {
                console.error("Connection error")
            }
            connection.onmessage = function (event) {
                var div = document.createElement("div")
                div.textContent = event.data
                document.body.appendChild(div)
            }
        });
    </script>
</body>
</html>
And use this as your web socket server code, recently tweaked to use the 'tail' module (as found in this post: How to do `tail -f logfile.txt`-like processing in node.js?), which you will have to install using npm (Note: tail makes use of fs.watch, which is not guaranteed to work the same everywhere):
var ws = require("nodejs-websocket");
var os = require('os');
var Tail = require('tail').Tail;
var tail = new Tail('./testlog.txt');

var server = ws.createServer(function (conn) {
    conn.on("text", function (str) {
        console.log("Received " + str);
    });
    conn.on("close", function (code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

setInterval(function () { checkLoad(); }, 5000);

function broadcast(mesg) {
    server.connections.forEach(function (conn) {
        conn.sendText(mesg);
    });
}

var load = '';
function checkLoad() {
    var new_load = os.loadavg().toString();
    if (new_load === load) { // skip the broadcast when nothing changed
        return;
    }
    load = new_load;
    broadcast(load);
}

tail.on("line", function (data) {
    broadcast(data);
});
Obviously this is very basic and you will have to change it for your needs.
I made a similar implementation recently using Munin. Munin is a wonderful server monitoring tool, open source too, which also provides a REST API. There are several plugins available for your needs: monitoring CPU, HDD and RAM usage of your server.
You need to build a push notification server. All clients who are listening will then get a push notification when new data is available. See this answer for more information: PHP - Push Notifications
As to how you would update the data, I'd suggest using OS-based tools to trigger a command-line PHP script that will generate and "push" the json file out to any client currently listening. Any new client logging on to "listen" will be served the current json available, until it's updated.
This way you're not subject to 100 users opening 100 connections and using however much bandwidth to poll your server every 5 seconds; they only get updates when there is actually something new.
How about a service that reads all the log info (via IPMI, Nagios or whatever) and creates the output files on some schedule? Then anyone that wants to connect can just read this output rather than hammering the server logs. Essentially have one hit on the server logs and everyone else just reads a web page.
This could be implemented pretty easily.
BTW: Nagios has a very nice free edition.
Answering just these bits of your question:
A performant way to stream the data to the various clients (simultaneously).
Be able to send commands to the server (bidirectional).
Use web languages (js, php...) and linux commands (something that is easy to implement on multiple machines)... free software if needed.
I'll recommend the Bayeux protocol as made simple by the CometD project. There are implementations in a variety of languages and it's really easy to use in its simplest form.
Meteor is broadly similar. It's an application development framework rather than a family of libraries, but it solves the same problems.
Some suggestions:
Munin for charts
NetSNMP (used by Munin, but you can also use Bash and Cron to build traps that send SMS texts on alerts)
Pingdom for remote alerts about how well the server is responding to ping and HTTP checks. It can SMS text you or call a phone, as well as have call escalation rules.

How does a client display the value of a server-side variable that is constantly changing?

There is a location on my homepage (i.e., in my index.html) where I'd like to display a server-side global variable called serverVariable. Note that serverVariable is constantly changing based on server-side code running in nodejs (i.e., serverVariable is a javascript object).
Unfortunately, I can't just place serverVariable in my index.html since it (i.e., the client) has no idea what it is. Is there a simple way to pass serverVariable to the client so that I can place it in my index.html and have it constantly update depending upon what is going on in the server-side nodejs?
Using socket.io you can establish a persistent connection between the client and server. With that connection you can notify the client of any of your variable's changes.
A simple example could be:
On the server side :
var io = require('socket.io')(http);

io.on('connection', function (socket) {
    // When a client connects for the first time, send it the value immediately
    socket.emit('new_value', serverVariable);
});

// Every time you change the value, send it to all connected clients using:
io.emit('new_value', serverVariable);
On the client side :
var socket = io();
socket.on('new_value', function(serverVariable) {
    // Display it the way you want!
});
This example should fit your case if serverVariable is a "simple" object, i.e. an object you can directly convert to JSON. It also assumes the nodejs server is the one serving the homepage; otherwise you need to adapt the first line.
See http://socket.io/docs/ for more documentation about socket.io.
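For context, a fuller sketch of the same idea (assumptions: an Express app serving index.html, and a timer standing in for whatever actually changes serverVariable on your server):
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

var serverVariable = { count: 0 }; // hypothetical object that keeps changing

app.get('/', function (req, res) {
    res.sendFile(__dirname + '/index.html');
});

io.on('connection', function (socket) {
    // New clients see the current value right away.
    socket.emit('new_value', serverVariable);
});

// Stand-in for whatever really updates serverVariable on your server:
setInterval(function () {
    serverVariable.count++;
    io.emit('new_value', serverVariable); // push the change to every client
}, 1000);

http.listen(3000);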

Private messaging through node.js

I'm making a multiplayer (2 player) browser game in JavaScript. Every move a player makes will be sent to a server and validated before being transmitted to the opponent. Since WebSockets isn't ready for prime time yet, I'm looking at long polling as a method of transmitting the data and node.js looks quite interesting! I've gone through some example code (chat examples, standard long polling examples and suchlike) but all the examples I've seen seem to broadcast everything to every client, something I'm hoping to avoid. For general server messages this is fine but I want two players to be able to square off in a lobby or so and go into "private messaging" mode.
So I'm wondering if there's a way to implement private messaging between two clients using nodejs as a validating bridge? Something like this:
ClientA->nodejs: REQUEST
nodejs: VALIDATE REQUEST
nodejs->ClientA: VALID
nodejs->ClientB: VALID REQUEST FROM ClientA
You need some way to keep track of which clients are in a lobby together. You can do this with a simple global array, like so: process.lobby[1] = Array(ClientASocket, ClientBSocket), or something similar (possibly with some additional data, like nicknames and such), where ClientXSocket is the socket object of each client that connects.
Now you can hook the lobby id (1 in this case) onto each client's socket object. A sort of session variable (without the hassle of session ids) if you will.
// I just made a hashtable to put all the data in,
// so that we don't clutter up the socket object too much.
socket.sessionData['lobby'] = 1;
What this also allows you to do is add an event hook on the socket object, so that when the client disconnects, the socket can remove itself from the lobby array immediately and message the remaining clients that this client has disconnected.
// removeByValue is a small custom Array helper, not a built-in method
socket.on('close', function (err) {
    process.lobby[socket.sessionData['lobby']].removeByValue(socket);
    // then notify the lobby that this client has disconnected.
});
I've used socket in place of the net.Stream or request.connection or whatever the thing is.
Remember in HTTP if you don't have keep-alive connections, this will make the TCP connection close, and so of course make the client unable to remain within a lobby. If you're using a plain TCP connection without HTTP on top (say within a Flash application or WebSockets), then you should be able to keep it open without having to worry about keep-alive. There are other ways to solve this problem than what I've shown here, but I hope I got you started at least. The key is keeping a persistent object for each client.
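A minimal sketch of the validate-and-relay step on top of that lobby structure (hypothetical message format and a hypothetical isValidMove() check; the key point is writing only to the opponent's socket in the same lobby instead of broadcasting):
socket.on('data', function (raw) {
    var msg = JSON.parse(raw.toString());        // e.g. { move: "e2e4" }
    var lobby = process.lobby[socket.sessionData['lobby']];

    if (!isValidMove(msg.move)) {                // hypothetical game-rule check
        socket.write(JSON.stringify({ error: 'invalid move' }));
        return;
    }

    socket.write(JSON.stringify({ ok: true }));  // nodejs -> ClientA: VALID

    // Relay the validated move only to the other client in this lobby.
    lobby.forEach(function (other) {
        if (other !== socket) {
            // nodejs -> ClientB: VALID REQUEST FROM ClientA
            other.write(JSON.stringify({ from: 'opponent', move: msg.move }));
        }
    });
});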
Disclaimer: I'm not a Node.js expert (I haven't even gotten around to installing it yet) but I have been reading up on it and I'm very familiar with browser js, so I'm hoping this is helpful somehow.
