Socket.IO and AngularJS create multiple connections, how do I stop them? - javascript

I'm developing a realtime Socket.IO app using AngularJS, Node.js, and Socket.IO (both the server and client side libraries). I'm using the module called angular-socket-io, but when I tell Angular to connect, every page refresh seems to leave more connections open with Socket.IO, and I'm the only user on the page right now.
In my server logs I keep seeing many socket IDs print out, and when I refresh the application page it takes a while to reconnect. Watching the console I see it communicating with the server a lot (probably the handshakes for all those socket connections), but after a good minute or two it finally stabilizes and starts receiving data again.
I think I'm doing this wrong, or is this normal? Does anyone have any good advice for using Socket.IO in Angular so that when the page refreshes it reconnects once and drops all previous connections, so that only one is maintained at all times?
Here are some code samples. Just to clarify, btford.socket-io prefixes all forwarded socket events with "socket:", so in my example it would be "socket:start".
myApp.js
angular.module('myApp', ['btford.socket-io', 'myControllers'])
  .config(['socketProvider', function(socketProvider) {
    var appSocket = io.connect('http://live.myapp.com');
    socketProvider.ioSocket(appSocket);
  }])
  .run(['socket', function(socket) {
    socket.forward('error');
    socket.forward('start'); // this is the event I'm trying to listen for
  }]);
myController
angular.module('myControllers').controller('startController', [
  '$scope',
  function($scope) {
    $scope.$on('socket:start', function(ev, data) {
      $scope.activeDrivers.push({
        name: data.user.firstName + " " + data.user.lastName
      });
      $scope.driversActiveTab.count = $scope.activeDrivers.length;
    });
  }
]);
So that's it, and I can't figure out why it keeps making so many connections to the server! Thanks for the help in advance!

You should generate an ID (e.g. a random string) in your Angular page. When the client first connects to the server, record this ID on the server and bind it to the socket. When the client disconnects, the server's 'disconnect' event will fire; listen for that event and clean up the socket.
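A minimal sketch of what that could look like on the Node side, in the Socket.IO 0.9-era API this question uses (the clients map and the 'register' event name are hypothetical, not part of Socket.IO itself):
var http = require('http').createServer();
var io = require('socket.io').listen(http);

// Hypothetical registry mapping our own client IDs to sockets.
var clients = {};

io.sockets.on('connection', function(socket) {
  // The client sends its generated ID right after connecting.
  socket.on('register', function(clientId) {
    // If this ID already has a socket (e.g. after a page refresh),
    // drop the stale connection before storing the new one.
    if (clients[clientId]) {
      clients[clientId].disconnect();
    }
    clients[clientId] = socket;
    socket.clientId = clientId;
  });

  socket.on('disconnect', function() {
    // Clean up the registry so dead sockets don't accumulate.
    if (socket.clientId) {
      delete clients[socket.clientId];
    }
  });
});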

Answering my own question. Per this page, if you have Node.js instances running across servers or on multiple cores, you must use the RedisStore to queue and properly handle socket requests. The strange behavior I described in my question was the browser attempting to connect to one of my 4 cores and missing responses from the other cores. I followed the instructions to enable Redis as the data store for Socket.IO and all of the problems went away.
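For reference, the Socket.IO 0.9-era configuration looked roughly like this (this is the pattern from the old Socket.IO wiki, applied to the io server object wherever it is created; newer Socket.IO versions use the socket.io-redis adapter instead):
var RedisStore = require('socket.io/lib/stores/redis'),
    redis = require('socket.io/node_modules/redis'),
    pub = redis.createClient(),
    sub = redis.createClient(),
    client = redis.createClient();

// All Node.js workers share socket state through Redis, so a client
// can be served by any core without losing its session.
io.set('store', new RedisStore({
  redisPub: pub,
  redisSub: sub,
  redisClient: client
}));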

Related

Socket io Client in ReactJS is getting multiple emits from the server

I am building a chat room with some extra features, and I have a socket io server, as well as a socket io client in ReactJS.
I have it so if someone pins a message or changes the settings of the chat room, it emits the changes to the server, the server saves those changes, and then emits them back out to everyone so everyone is synced.
The settings and pinned messages transfer and are communicated successfully. I have console.logs at almost every step of the transfer: the client logs a single request, and the server logs that it emits a single time, but the client logs that it received the emit multiple times, sometimes 2-6 times, sometimes as many as 60. I'm trying to really control the efficiency, but I have no idea what is causing this.
I'm not sure it matters, but another thing of note is that the client also connects to a native WebSocket server to get messages from another source.
What the client generally looks like is:
effect(() => {
  socket.emit('changeSetting', setting)
}, [setting])

socket.on('recieveSetting', (arg) => {
  if (arg != setting) {
    setSetting(arg);
  }
})
the server then looks like this:
socket.on('changeSetting', (arg) => {
  storedSetting = arg
  socket.emit('recieveSetting', storedSetting)
})
That's the general structure, so I don't think it's an issue with the code; it's more likely that ReactJS or connecting to the other WebSocket causes it to get multiple emits.

netty-socketio: client did not complete upgrade - closing transport

I have a socket server running with netty-socketio and a web app that connects to it using socket.io-client JS library.
The problem is that I'm losing a few connections (not all, let's say 20%).
For the lost connections: right after the connection is made by the client, the server logs client did not complete upgrade - closing transport and disconnects the client.
This happens on my production server (using nginx as a proxy) and also in my local environment (connecting directly to the netty-socketio server). It's pretty much random and I can't identify a pattern in it. For example, if I continuously keep refreshing the client app in the browser (at a 5-second interval), at some point this error will happen, and for the subsequent tries it will work normally again (until it happens another time).
This is the error on the netty-socketio lib: https://github.com/mrniko/netty-socketio/blob/master/src/main/java/com/corundumstudio/socketio/transport/WebSocketTransport.java#L196
but I could not figure out why it happens randomly (sometimes on the first try).
Any thoughts on this are really appreciated.
Thanks
After some research and tests I found out that when using netty-socketio as the server, you need to specify the transport method on the client side.
var socket = io('server-address', { transports: [ 'polling' ] });
// or
var socket = io('server-address', { transports: [ 'websocket' ] });
If you don't specify it, the connection will be established using polling as the transport method, and netty will automatically try to upgrade it to websocket. This is what was causing the connection failures.
After specifying the transport method I have had 0% connection failures so far.

Proper way to monitor/control a server remotely over http in realtime

On my client (a phone with a browser) I want to see the server's CPU, RAM & HDD stats and gather info from various logs.
I'm using ajax polling.
On the client, every 5 seconds (setInterval) I call a PHP file that:
scans a folder containing N logs
reads the last line of each log
converts that to JSON
Problems:
Opens a new connection every 5 seconds.
Multiple AJAX calls.
Request headers (they are also data and so consume bandwidth).
Response headers (same).
Uses PHP to read the files every 5 seconds even if nothing changed.
The final JSON data is less than 5 KB, but I send it every 5 seconds, and there are the headers and a new connection every time, so basically every 5 seconds I have to send 5-10 KB of overhead to get 5 KB of data, which is 10-15 KB in total.
That's 60 sec / 5 sec = 12 new connections per minute and about 15 MB of traffic per hour if I leave the app open.
Let's say I have 100 users that I let monitor/control my server; that would be around 1.5 GB of outgoing traffic in one hour.
Not to mention that the PHP server is reading multiple files 100 times every 5 seconds.
I need something on the server that reads the last lines of those logs every 5 seconds, maybe writes them to a file, and then pushes this data to the client only if it has changed.
SSE (server sent events) with PHP
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
while (true) {
    echo "id: ".time()."\ndata: ".ReadTheLogs()."\n\n";
    ob_flush();
    flush();
    sleep(1);
}
In this case, after the connection is established with the first user, it stays open (PHP is not made for that), and so I save some overhead (request headers, response headers). This works on my server, but most servers don't allow keeping the connection open for a long time.
Also, with multiple users I read the log multiple times (slowing down my old server).
And I can't control the server... I would need to use AJAX to send a command...
I need WebSockets!!!
node.js and websockets
Using node.js, from what I understand, I can do all this without consuming a lot of resources and bandwidth. The connection stays open, so there are no unnecessary headers; I can receive and send data, and it handles multiple users very well.
And this is where I need your help.
The node.js server should update and store the log data in the background every 5 seconds if the files have been modified. Or should the operating system do that (iwatch, dnotify...)?
The data should be pushed only if it has changed.
The reading of the logs should happen only once every 5 seconds, not be triggered by each user.
This is the first example I found... and modified:
var ws = require("nodejs-websocket");
var server = ws.createServer(function(conn) {
    var data = read(whereToStoreTheLogs);
    conn.sendText(data); // send the logs data to the user
                         // on first connection.
    setTimeout(checkLogs, 5000);
    /*
    Here I need to continuously check if the logs have changed,
    but if I use setInterval(checkLogs, 5000) or setTimeout,
    every user invokes a new timer, leaving lots of timers on
    the server. Can I do that in the background?
    */
    conn.on("text", function(str) {
        doStuff(str); // various commands to control the server.
    });
    conn.on("close", function(code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

var checkLogs = function() {
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        conn.sendText(data);
    }
    setTimeout(checkLogs, 5000);
};
The above script would be the notification server, but I also need to find a solution to store the info from those multiple logs somewhere, and to do that in the background every time something changes.
How would you keep both the bandwidth and the server resource usage low?
How would you do it?
EDIT
By the way, is there a way to stream this data simultaneously to all the clients?
EDIT
About the logs: I also want to be able to scale the time between updates... I mean, if I read the logs of ffmpeg I need the update every second if possible, but when no conversion is active I only need the basic machine info every 5 minutes maybe... and so on...
GOALS:
1. A performant way to read & store the log data somewhere, but only if clients are connected (MySQL, a file, or maybe directly in RAM; is that possible with node.js?).
2. A performant way to stream the data to the various clients (simultaneously).
3. The ability to send commands to the server (bidirectional).
4. Using web languages (JS, PHP...) and Linux commands, i.e. something that is easy to implement on multiple machines; free software if needed.
The best approach would be: read the logs into system memory, based on current activity, and stream them simultaneously and continuously, over an already-open connection, to the various clients with WebSockets.
I don't know anything that could be faster.
UPDATE
The node.js server is up and running, using the http://einaros.github.io/ws/ webSocketServer implementation, as it appears to be the fastest one.
With the help of #HeadCode I wrote the following code to handle the client situation properly and to keep the load as low as possible, checking various things inside the broadcast loop. The pushing & the client handling are now at a good point.
var wss = new (require('ws').Server)({ port: 8080 }),
    isBusy,
    logs,
    clients,
    i,
    checkLogs = function() {
        if (wss.clients && (clients = wss.clients.length)) {
            isBusy || (logs = readLogs() /*, isBusy = true */);
            if (logs) {
                i = 0;
                while (i < clients) {
                    wss.clients[i++].send(logs);
                }
            }
        }
    };

setInterval(checkLogs, 2000);
But at the moment I'm using a really bad way to parse the logs (nodejs -> httpRequest -> php)... lol. After some googling I found out that I could totally stream the output of Linux software directly to the node.js app. I haven't checked yet, but maybe that would be the best way to do it. node.js also has a filesystem API where I could read the logs, and Linux has its own filesystem API.
The readLogs() function (which can be async) is still something I'm not happy with. Options:
nodejs filesystem?
Linux software -> nodejs output implementation?
Linux filesystem API?
Keep in mind that I need to scan various folders for logs and then parse the outputted data somehow, and this every 2 seconds.
PS: I added isBusy to the server variables in case the log-reading system is async.
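If the idea of streaming Linux program output straight into node.js pans out, a minimal sketch could look like this (the log path is just an example; child_process.spawn and the ws server are standard APIs):
var spawn = require('child_process').spawn;
var WebSocketServer = require('ws').Server;

var wss = new WebSocketServer({ port: 8080 });

function broadcast(message) {
    // Same idea as the loop above: send to every connected client.
    wss.clients.forEach(function(client) {
        client.send(message);
    });
}

// Follow a log file with `tail -f` and push every chunk of new
// output straight to the clients, with no polling at all.
var tail = spawn('tail', ['-f', '/var/log/myapp.log']);
tail.stdout.on('data', function(chunk) {
    broadcast(chunk.toString());
});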
EDIT
The answer is not complete.
Missing:
A performant way to read, parse and store the logs somewhere (Linux filesystem API, or node.js API, so that I store directly into system memory).
An explanation of whether it's possible to stream data directly to multiple users.
Apparently node.js loops through the clients and so (I think) sends the data multiple times.
By the way, is it possible/worth it to shut down the node server when there are no clients and restart it on new connections from the Apache side (e.g. if I connect to the Apache-hosted HTML file, a script launches the node.js server again)? Doing so would further reduce the memory leaking, right?
EDIT
After some experimenting with websockets (some videos are in the comments) I learned some new stuff. The Raspberry Pi can use some CPU DMA channels to do high-frequency stuff like PWM; I need to somehow understand how that works.
When using sensors and the like, I should store everything inside RAM; does node.js already do that (in a variable inside the script)?
WebSockets remain the best choice, as they are now easily accessible from basically any device, simply using a browser.
I haven't used nodejs-websocket, but it looks like it will accept an http connection and do the upgrade as well as creating the server. If all you care about receiving is text/json then I suppose that would be fine, but it seems to me you might want to serve a web page along with it.
Here is a way to use express and socket.io to achieve what you're asking about:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.static(__dirname + '/'));

app.get('/', function(req, res){
    res.sendfile('index.html');
});

io.on('connection', function(socket){
    // This is where we should emit the cached values of everything
    // that has been collected so far so this user doesn't have to
    // wait for a changed value on the monitored host to see
    // what is going on.
    // This code is based on something I wrote for myself so it's not
    // going to do anything for you as is. You'll have to implement
    // your own caching mechanism.
    for (var stat in cache) {
        if (cache.hasOwnProperty(stat)) {
            socket.emit('data', JSON.stringify(cache[stat]));
        }
    }
});

http.listen(3000, function(){
    console.log('listening on *:3000');
});
(function checkLogs(){
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        io.emit('data', data); // emit under the 'data' event the client listens for
        oldData = data;
    }
    setTimeout(checkLogs, 5000);
})();
Of course, the checkLogs function has to be fleshed out by you. I have only cut and pasted it in here for context. The call to the emit function of the io object will send the message out to all connected users but the checkLogs function will only fire once (and then keep calling itself), not every time someone connects.
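As one possible way to flesh it out (the folder path and the last-line-per-file approach are assumptions for illustration, not part of the answer itself), read could synchronously grab the tail of each log in a folder:
var fs = require('fs');
var path = require('path');

// Hypothetical implementation: read every file in a log folder and
// return the last line of each, joined into one string.
function read(logDir) {
    return fs.readdirSync(logDir).map(function(name) {
        var text = fs.readFileSync(path.join(logDir, name), 'utf8');
        var lines = text.trim().split('\n');
        return name + ': ' + lines[lines.length - 1];
    }).join('\n');
}

var whereToStoreTheLogs = '/var/log/myapp'; // example path
var oldData = '';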
In your index.html page you can have something like this. It should be included in the html page at the bottom, just before the closing body tag.
<script src="/path/to/socket.io.js"></script>
<script>
  // Set up the websocket for receiving updates from the server
  var socket = io();
  socket.on('data', function(msg){
    // Do something with your message here, such as using javascript
    // to display it in an appropriate spot on the page.
    document.getElementById("content").innerHTML = msg;
  });
</script>
By the way, check out the Nodejs documentation for a variety of built-in methods for checking system resources (https://nodejs.org/api/os.html).
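For instance, the built-in os module already exposes most of the machine stats mentioned in the question; a quick sketch:
var os = require('os');

// CPU load averages over 1, 5, and 15 minutes (Unix only).
console.log('load:', os.loadavg());

// Memory usage in bytes.
console.log('free mem:', os.freemem(), 'of', os.totalmem());

// Per-core CPU information.
console.log('cores:', os.cpus().length);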
Here's also a solution more in keeping with what it appears you want. Use this for your html page:
<!DOCTYPE HTML>
<html>
<head>
  <meta charset="utf-8">
  <title>WS example</title>
</head>
<body>
<script>
  var connection;
  window.addEventListener("load", function () {
    connection = new WebSocket("ws://" + window.location.hostname + ":8001")
    connection.onopen = function () {
      console.log("Connection opened")
    }
    connection.onclose = function () {
      console.log("Connection closed")
    }
    connection.onerror = function () {
      console.error("Connection error")
    }
    connection.onmessage = function (event) {
      var div = document.createElement("div")
      div.textContent = event.data
      document.body.appendChild(div)
    }
  });
</script>
</body>
</html>
And use this as your web socket server code, recently tweaked to use the 'tail' module (as found in this post: How to do `tail -f logfile.txt`-like processing in node.js?), which you will have to install using npm (Note: tail makes use of fs.watch, which is not guaranteed to work the same everywhere):
var ws = require("nodejs-websocket");
var os = require('os');
var Tail = require('tail').Tail;

var tail = new Tail('./testlog.txt');

var server = ws.createServer(function (conn) {
    conn.on("text", function (str) {
        console.log("Received " + str);
    });
    conn.on("close", function (code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

setInterval(function(){ checkLoad(); }, 5000);

function broadcast(mesg) {
    server.connections.forEach(function (conn) {
        conn.sendText(mesg)
    })
}

var load = '';
function checkLoad(){
    var new_load = os.loadavg().toString();
    if (new_load === load){
        // Unchanged since the last check, so don't broadcast.
        return;
    }
    load = new_load;
    broadcast(load);
}

tail.on("line", function(data) {
    broadcast(data);
});
Obviously this is very basic and you will have to change it for your needs.
I made a similar implementation recently using Munin. Munin is a wonderful server monitoring tool, open source too, which also provides a REST API. There are several plugins available for your needs, monitoring the CPU, HDD and RAM usage of your server.
You need to build a push notification server. All clients who are listening will then get a push notification when new data is updated. See this answer for more information: PHP - Push Notifications
As to how you would update the data, I'd suggest using OS-based tools to trigger a PHP script (command line) that will generate and "push" the JSON file out to any client currently listening. Any new client logging on to "listen" will be served the current JSON available, until it's updated.
This way you're not subject to 100 users using 100 connections and however much bandwidth to poll your server every 5 seconds; they only get updated when they need to know there's an update.
How about a service that reads all the log info (via IPMI, Nagios or whatever) and creates the output files on some schedule? Then anyone who wants to connect can just read this output rather than hammering the server logs. Essentially you have one hit on the server logs and everyone else just reads a web page.
This could be implemented pretty easily.
BTW: Nagios has a very nice free edition.
Answering just these bits of your question:
A performant way to stream the data to the various clients (simultaneously).
The ability to send commands to the server (bidirectional).
Using web languages (JS, PHP...) and Linux commands, i.e. something that is easy to implement on multiple machines; free software if needed.
I'll recommend the Bayeux protocol as made simple by the CometD project. There are implementations in a variety of languages and it's really easy to use in its simplest form.
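For a sense of that "simplest form", a browser client might look roughly like this (the channel names and server URL are made up for illustration; the calls shown are the standard CometD JavaScript client API):
// Assumes the CometD JavaScript library has been loaded on the page.
var cometd = new org.cometd.CometD();
cometd.configure({ url: 'http://example.com/cometd' });

cometd.handshake(function(handshake) {
    if (handshake.successful) {
        // Receive server stats pushed on a channel.
        cometd.subscribe('/server/stats', function(message) {
            console.log('stats update:', message.data);
        });
        // Bidirectional: publish a command back to the server.
        cometd.publish('/server/commands', { action: 'reread-logs' });
    }
});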
Meteor is broadly similar. It's an application development framework rather than a family of libraries, but it solves the same problems.
Some suggestions:
Munin for charts
NetSNMP (used by Munin, but you can also use Bash and Cron to build traps that send SMS texts on alerts)
Pingdom for remote alerts about how well the server is responding to ping and HTTP checks. It can SMS text you or call a phone, as well as have call escalation rules.

node.js - how to push an address change to a client?

I am trying to set up a simple presentation using three computers synchronized by a central server, and I figured node would be the ideal tool.
I was wondering if there's any way to have all three computers connect to the server via the browser, and whether I could control the server to push changes to each.
For example:
Computer-1 visits 10.0.0.1?comp=1&slide=1
Computer-2 visits 10.0.0.1?comp=2&slide=1
Computer-3 visits 10.0.0.1?comp=3&slide=1
Then from the server commandline, I would like to be able to trigger a change so the clients will each be redirected accordingly like so:
Computer-1 visits 10.0.0.1?comp=1&slide=2
Computer-2 visits 10.0.0.1?comp=2&slide=2
Computer-3 visits 10.0.0.1?comp=3&slide=2
I'm new to node, so I'm not even sure if this is the ideal platform, but I was wondering what terminology I should be researching to be able to build something like this.
Thank you for your responses. I ended up looking into socket.io and managed to write this system in one evening! Node + socket.io + Express is a pretty amazing toolset, with the socket emit events.
Thank you for pointing me in the right direction; this is exactly the tool I was looking for.
Just in case it may help anyone, in my client/jade template, I have something like:
socket.on('slideUpdate', function (data) {
  // Do things with the data
});
and on the server app.js:
io.on('connection', function (socket) {
  socket.on('slideChange', function (data) {
    // logic for setting slide data
    io.sockets.emit('slideUpdate', { example: exampleData ... });
  });
});
where a slideChange event is triggered via a button on the client-side template.
For such a presentation I would use the simplest solution, i.e. not websockets, not server-sent events and not long-polling.
Just do a short poll: every client calls the server every 100 ms for updates, and the server responds with a status update (if there is one).
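A minimal sketch of such a short poll on the client (the /updates endpoint and the response shape are invented for illustration):
// Poll the server every 100 ms and act on any pending update.
setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/updates?comp=1', true);
    xhr.onload = function () {
        var update = JSON.parse(xhr.responseText);
        if (update.slide) {
            // Navigate to the slide the server told us about.
            window.location = '?comp=1&slide=' + update.slide;
        }
    };
    xhr.send();
}, 100);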

How do I send a channel message using channel.trigger with websocket-rails gem

I'm building a simple real-time chat app to learn how to use websockets with RoR and I don't think I'm understanding how channels work because they're not doing what I expect. I can successfully send a message to my Rails app using the dispatcher.trigger() method, and use my websocket controller to broadcast a message to all clients that subscribe to the channel. That all works fine. What does NOT work is using a channel (via the channel.trigger() method) to send a message to other clients. The websocket-rails wiki says...
Channel events currently happen outside of the Event Router flow. They
are meant for broadcasting events to a group of connected clients
simultaneously. If you wish to handle events with actions on the
server, trigger the event on the main dispatcher and specify which
controller action should handle it using the Event Router.
If I understand this correctly, I should be able to use the channel.trigger() method to broadcast a message to clients connected to the channel without the message being routed through my RoR app, and it should still reach the other connected clients. So here's my code...
var dispatcher = new WebSocketRails('localhost:3000/websocket');
var channel = dispatcher.subscribe('channel_name');

channel.bind('channel_message', function(data) {
  alert(data.message);
});

$("#send_message_button").click(function() {
  obj = {message: "test"};
  channel.trigger('channel_message', obj);
});
With the code listed above, I would expect that when I click the button, it sends a channel message using channel.trigger() and the channel_message binding should be executed on all clients, displaying an alert that reads "test". That doesn't happen. I'm using Chrome tools to inspect the websocket traffic and it shows the message being sent...
["channel_message",{"id":113458,"channel":'channel_name',"data":{"message":"test"},"token":"96fd4f51-6321-4309-941f-38110635f86f"}]
...but no message is received. My questions are...
Am I misunderstanding how channel-based websockets work with the websocket-rails gem?
If not, what am I doing wrong?
Thanks in advance for all your wisdom!
I was able to reproduce a working copy based on an off-the-shelf solution from the wiki along with your very own code.
I've packaged the whole thing here. The files you might be interested in are home_controller.rb, application.js and home/index.html.erb.
It seems your understanding of channel-based websockets is correct. About the code: make sure to load the websocket JavaScript files and to enclose your code inside a document.ready. I had the exact same problem you're having without the latter.
//= require websocket_rails/main
$(function() {
// your code here...
});
Let me know if it works. Best of luck!
