Node.js sockets: what happens if there is no FIN packet? - javascript

I have two node.js servers; one of them connects via a socket to the other and sends data regularly.
An example could be:
client.js:
var io = require('socket.io-client'); // connecting out as a client needs the socket.io-client package
var socket = io.connect(IP_SERVER + ':' + PORT);
socket.on('connect', function () {
    // pass a function to setInterval; calling emit directly would run it only once
    setInterval(function () {
        socket.emit('data', new Date());
    }, 60000);
});
server.js
var http = require('http'),
    app = http.createServer(),
    io = require('socket.io').listen(app);
app.listen(PORT);
io.sockets.on('connection', function (socket) {
    socket.on('data', function (m) {
        console.log(m);
    });
});
Now, while the two are communicating, I shut down server.js (Ctrl-C, for example). Does any event fire in client.js? Will client.js reconnect automatically once the server is restarted?
I've tried it myself, adding listeners for 'end', 'error' and 'close'. None of them fired, but once server.js was back online, it started getting data again... Can't I find out when this happens?
UPDATE: Well, it seems it does end up reconnecting, but the time it takes seems random; I can't find anything about this in the docs...

By default, the client will try to reconnect after losing a connection. There is a setting (reconnectionDelay) that tells socket.io how long to wait before attempting to reconnect, and another setting (reconnectionDelayMax) that specifies the maximum amount of time to wait between reconnection attempts. Each attempt increases the reconnection delay by the amount specified by reconnectionDelay, up to reconnectionDelayMax.
Full documentation is here
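As a rough sketch of how those options can be set on the client (a socket.io 1.x/2.x client is assumed, the delay values are only illustrative, and IP_SERVER/PORT are the placeholders from the question):

var io = require('socket.io-client');
var socket = io.connect(IP_SERVER + ':' + PORT, {
    reconnection: true,           // default: keep retrying after a lost connection
    reconnectionDelay: 1000,      // initial wait before the first retry
    reconnectionDelayMax: 10000   // upper bound on the wait between retries
});
socket.on('disconnect', function () {
    console.log('lost connection to the server');
});
socket.on('reconnect', function (attempt) {
    console.log('reconnected after ' + attempt + ' attempt(s)');
});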

Related

Socket.io - Multiple connections when there should only be one per browser window

I've noticed that when my app loads, it creates multiple socket connections.
This bug seems quite intermittent; I've been looking around for a solution, but nothing has worked so far.
Here's my server-side code:
var express = require('express');
var app = express();
var serv = require('http').Server(app);
var path = require('path');

app.get('/', function (req, res) {
    res.sendFile(__dirname + '/client/index.html');
});
app.use('/client', express.static(__dirname + '/client'));
app.use(express.static(path.join(__dirname, 'skin')));

serv.listen(8081);
console.log("Server started");

var io = require('socket.io')(serv, {});
var socketCount = 0;

io.sockets.on('connection', function (socket) {
    console.log('Socket ' + socketCount + ' connected');
    socketCount++;
    socket.emit('socketConnected', true);
    socket.on('disconnect', function () {
        console.log('Socket ' + socketCount + ' disconnected');
        socketCount--;
        delete SOCKET_LIST[socket.id];
        delete PLAYER_LIST[socket.id];
    });
...
When refreshing the page, the disconnect handler on the server fires, but only once, despite the fact that there are supposedly several sockets. I then get multiple sockets connecting again.
When putting a breakpoint on the frontend (listening for socketConnected), it triggers multiple times. Can anyone point out any flaws in the code above that may cause this issue?
Thanks.
New development after some further debugging:
I've noticed that when I emit socketConnected as seen above, I had assumed I would be able to grab some further data from the frontend.
When sending data from the frontend back to the server again, I run my calculations on the server with it and then run a second socket.emit('something .. from the server to the frontend. This takes several attempts before actually hitting a breakpoint on the frontend, and it is during these attempts to emit the data from the server that the several other connections are made.
However, if I wrap the second server emit in a setTimeout call, it works fine and maintains the current connection without creating several more. Has anyone else encountered this issue? If so, is there a solution that avoids using setTimeout?

Why isn't socket.io callback being triggered inside an express route

I've got the following setup (important bits only for brevity):
app.js
...
const app = express();
const server = require('http').Server(app);
const io = require('socket.io')(server);
server.listen(port, function() {
    console.log(`Server is listening on port: ${port}`);
});
io.on('connection', function (socket) {
    console.log('connection');
});
const routes = require('./routes/index')(io, passport);
app.use('/', routes);
index.js (server)
router.get('/game/:id', isAuthenticated, (req, res) => {
    if (req.id) {
        var game = Game.findOne({_id: req.id}, (err, obj) => {
            io.on('getGameInfo', (socket) => {
                io.emit('gameInfo', obj);
            });
            res.render('game', obj);
        });
    } else {
        // Id not valid, do something
    }
});
client:
const socket = io('http://localhost:3000');
socket.on('gameInfo', function(data) {
    console.log(data);
}.bind(this));
socket.on('connect', () => {
    socket.emit('getGameInfo');
});
So basically I want to emit a getGameInfo call once I know the client has connected, and the getGameInfo listener has been set up in the game route. But when I emit the getGameInfo from the client, the server callback isn't being hit. I'm not sure if I'm missing something obvious, or if this is a closure issue, or if I'm just having one of those days, or if I'm going about this entirely the wrong way.
There are multiple problems here. I'll start by showing the correct way to listen for an incoming socket.io message on the server:
io.on('connection', function (socket) {
    // here's where you have a new socket and you can listen for messages
    // on that socket
    console.log('connection');
    socket.on('gameInfo', (data) => {
        socket.emit('gameInfo', obj); // obj = whatever game info you looked up for this request
    });
});
Some of the issues:
On the server, you listen for messages via the socket object, not via the io object. So, you would typically add these event listeners in the io.on('connection', ...) handler because that's where you first see newly connected sockets.
You pretty much never want to add event listeners inside an Express route handler because that is called many times. In addition, at the moment the route handler is called, the browser has not yet received the page and will not yet be connected so even if this was an OK place to do stuff, the page is not yet connected anyway.
When you want to send a message back to just one connection, you send it with socket.emit(), not io.emit(). io.emit() broadcasts to all connected clients which I don't think is what you want.
I'd suggest you not overload the same message name for client and server to mean two different things, as this can lead to confusion when reading the code or if you ever share some code between client and server. Your client is really sending a "getGameInfo" message, and your server then responds with a "gameInfo" message that contains the game info.
If, in a route handler, you want to .emit() to the socket from that page which it looks like you are trying to do, then you have to do some work to create a link between the session of the current page and the socket for that page. There are a number of ways to do that. If you're using any session middleware, you can record the socket in the session at the point the socket connects. Then, from your express routes, you can get that socket from the session object at any time.
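To illustrate that last point, here is one minimal sketch (my own assumptions, not code from this answer: express-session as the session middleware and an in-memory Map from session id to socket) of how a route can reach the socket belonging to the page that is currently connected:

const express = require('express');
const session = require('express-session');
const app = express();
const server = require('http').Server(app);
const io = require('socket.io')(server);

// one session middleware instance, shared by Express and socket.io
const sessionMiddleware = session({
    secret: 'keyboard cat',   // placeholder secret
    resave: false,
    saveUninitialized: true
});
app.use(sessionMiddleware);
io.use((socket, next) => {
    // run the same middleware against the socket's underlying request
    sessionMiddleware(socket.request, socket.request.res || {}, next);
});

// remember which socket belongs to which session
const socketsBySession = new Map();

io.on('connection', (socket) => {
    const sid = socket.request.session.id;
    socketsBySession.set(sid, socket);
    socket.on('disconnect', () => socketsBySession.delete(sid));
});

// a route can now reach "its" socket, provided that page has already connected
app.get('/game/:id', (req, res) => {
    const pageSocket = socketsBySession.get(req.session.id);
    if (pageSocket) {
        pageSocket.emit('gameInfo', { id: req.params.id }); // illustrative payload
    }
    res.send('game page');
});

server.listen(3000);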

NodeJS + Redis + WebSocket memory management?

I have a NodeJS server that hosts a WebSocket server. The WebSocket server redistributes messages from Redis.
The full pipeline is: a Python script pushes some data into Redis, and NodeJS, acting as the WebSocket server, reads the newly inserted Redis data and sends it to the connected clients. My problem is that NodeJS keeps taking up more and more memory, and after a while it just bursts and stops.
I don't know what my problem is, since my code is pretty simple.
I don't need my WebSocket server to receive messages from the connected clients, since I only need to push data to them, but a lot of data.
var server = require('websocket').server,
    http = require('http');
var redis = require("redis"),
    client = redis.createClient();

var socket = new server({
    httpServer: http.createServer().listen(443),
    keepalive: false
});

client.subscribe("attack-map-production");

socket.on('request', function(request) {
    var connection = request.accept(null, request.origin);
    connection.on('message', function(message) {
        console.log(message);
        client.on("message", function(channel, message) {
            connection.send(message);
        });
    });
    connection.on('close', function(connection) {
        console.log('connection closed');
    });
});
I'm looking to make this work without eating all the memory on my server, and possibly to make it much faster, though I think it's fast enough.
Maybe NodeJS is not meant for this kind of work?
Any help is appreciated.
Thanks.
Update 2016-11-08
With the information provided below, I have "updated" my code. The problem is still there; I will continue to look around to find an answer... but I'm really not getting this.
var server = require('websocket').server,
    http = require('http');
var redis = require("redis"),
    client = redis.createClient();

var socket = new server({
    httpServer: http.createServer().listen(443),
    keepalive: false
});

client.subscribe("attack-map-production");

socket.on('request', function(request) {
    var connection = request.accept(null, request.origin);
    client.on("message", function(channel, message) {
        connection.send(message);
    });
    connection.on('close', function(connection) {
        console.log('connection closed');
    });
});
Update 2016-11-16
So here is my new code:
var io = require('socket.io').listen(443);
var sub = require('redis').createClient();

io.sockets.on('connection', function (sockets) {
    sockets.emit('message', {Hello: 'World!'});
    sub.subscribe('attack-map-production'); // could be any pattern
    sockets.on('disconnect', function() {
        sub.unsubscribe('attack-map-production');
    });
});

sub.on('message', function(channel, message) {
    io.sockets.json.send(message);
});
Even with this code, Node.js goes to 100% CPU and beyond, and it starts to run really slowly until everything just stops.
The complete flow of my data is that a Python script pushes data into Redis, and through my subscription the data is pushed back to the browser over a WebSocket with Socket.io.
It's that simple; how can this be slow? I just don't get it.
client = redis.createClient();
Take a look at this line: every time you invoke the variable client, you create an instance of the Redis client inside Node, and you never close it. So if you receive 10000 socket 'request' events, you will also have 10000 Redis instances.
You need to call client.quit() once the write or the read to Redis is done:
var server = require('websocket').server,
    http = require('http');
var redis = require("redis"),
    client = redis.createClient();

var socket = new server({
    httpServer: http.createServer().listen(443),
    keepalive: false
});

client.subscribe("attack-map-production");

socket.on('request', function(request) {
    var connection = request.accept(null, request.origin);
    client.on("message", function(channel, message) {
        connection.send(message);
    });
    client.quit(); // MISSING LINE
    connection.on('close', function(connection) {
        console.log('connection closed');
    });
});
I also noticed this piece of code:
httpServer: http.createServer().listen(443)
Port 443 is for HTTPS! So if you are using a secured connection, you need to use the https module, not http, like this:
var socket = new server({
    httpServer: https.createServer().listen(443),
    keepalive: false
});
Hope it helps!
Maybe NodeJS is not meant for this kind of work?
If Node is meant for anything, it's this: I/O streaming and reading/writing are the main advantages of Node's asynchronous model.
What kind of server are you running this on? On a too-small EC2 instance you can hit memory problems.
Otherwise it's a leak, and that's kind of hard to trace.
The code is small, though.
I would remove any console.log just in case.
connection.on('message', function(message) {
    console.log(message);
    client.on("message", function(channel, message) {
        connection.send(message);
    });
});
This part feels suspicious: two variables with the same name and an unused variable are asking for trouble, and I don't really see why you have to wait for a message from the connection before listening for Redis messages.
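For what it's worth, here is a sketch of that idea (my own, not from the answers above, and assuming the same websocket and node_redis APIs as the question): subscribe once, keep the open connections in a Set, and drop each connection from the Set when it closes, so no per-connection Redis listeners pile up:

var server = require('websocket').server,
    http = require('http');
var redis = require('redis'),
    sub = redis.createClient();

var connections = new Set(); // currently open WebSocket connections

var wsServer = new server({
    httpServer: http.createServer().listen(8080), // plain HTTP port, just for this sketch
    keepalive: false
});

sub.subscribe('attack-map-production');

// one Redis listener for the whole process instead of one per connection
sub.on('message', function (channel, message) {
    connections.forEach(function (connection) {
        connection.send(message);
    });
});

wsServer.on('request', function (request) {
    var connection = request.accept(null, request.origin);
    connections.add(connection);
    connection.on('close', function () {
        connections.delete(connection); // let the closed connection be garbage-collected
    });
});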

Laravel Broadcast with Redis not working/connecting?

So I'm trying to broadcast Laravel 5 events with the help of Redis. No, I don't want to use a service like Pusher, since it's not free (even if the free limit would be enough for me) and I want to keep control of the broadcast server.
So, what I've done so far: I've set up a Redis server (listening on port 6379 -> the default), and I've set up the following event:
class MyEventNameHere extends Event implements ShouldBroadcast
{
    use SerializesModels;

    public $data;

    /**
     * Create a new event instance.
     *
     * @return \App\Events\MyEventNameHere
     */
    public function __construct()
    {
        $this->data = [
            'power' => 10
        ];
    }

    /**
     * Get the channels the event should be broadcast on.
     *
     * @return array
     */
    public function broadcastOn()
    {
        return ['pmessage'];
    }
}
I registered a route to that event:
Route::get('test', function()
{
    event(new App\Events\MyEventNameHere());
    return "event fired";
});
I've created (more like copied :P) the node socket server:
var app = require('http').createServer(handler);
var io = require('socket.io')(app, {origins: '*:*'});
var Redis = require('ioredis');
var redis = new Redis();

app.listen(6379, function() {
    console.log('Server is running!');
});

function handler(req, res) {
    res.writeHead(200);
    res.end('');
}

io.on('connection', function(socket) {
    console.log(socket);
});

redis.psubscribe('*', function(err, count) {
});

redis.on('pmessage', function(subscribed, channel, message) {
    console.log(message);
    message = JSON.parse(message);
    io.emit(channel + ':' + message.event, message.data);
});
And I created the view to actually receive the broadcast (testview.blade.php):
@extends('layout')
@section('content')
<p id="power">0</p>
<script>
    var socket = io('http://localhost:6379');
    socket.on("pmessage:App\\Events\\MyEventNameHere", function(message) {
        console.log(message);
        $('#power').text(message.data);
    });
    console.log(socket.connected);
</script>
@endsection
I can launch the redis server without any problems.
I can launch the node socket.js server and I'm getting the response "Server running"
When I hit the route to the event I get the return "event fired" in my browser.
When I hit the route to the actual view
Route::get('test/view', function()
{
    return view('testview');
});
I can see the whole page (the layout is rendered), and the web console does not show any errors.
However, if I fire the event, the view doesn't change, which means the broadcast is not received, right?
Now I included an output for the console
console.log(socket.connected);
which should show me whether the client is connected to socket.io, right?
Well, the output says false. What am I doing wrong here?
Further information on my setup: I'm running the whole project on the PHP built-in server, the whole thing is running on Windows (if that matters at all), and my firewall is not blocking any of the ports.
EDIT 1:
I forgot to say that my Node server is not receiving the messages either... It only says "Server running", nothing else.
EDIT 2:
I used another socket.js:
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
var Redis = require('ioredis');
var redis = new Redis();

app.get('/', function(req, res) {
    res.sendFile(__dirname + '/index.html');
});

redis.subscribe('test-channel', function () {
    console.log('Redis: test-channel subscribed');
});

redis.on('message', function(channel, message) {
    console.log('Redis: Message on ' + channel + ' received!');
    console.log(message);
    message = JSON.parse(message);
    io.emit(channel, message.payload)
});

io.on('connection', function(socket) {
    console.log('a user connected');
    socket.on('disconnect', function() {
        console.log('user disconnected');
    });
});

http.listen(3000, function() {
    console.log('listening on *:3000');
});
And this time the console receives the messages.
So if the Node socket.io server receives the messages, then what's wrong with my client? Obviously the messages are being broadcast correctly; the only thing is that they are not being received by the client...
I can't say exactly what is wrong, and probably no one can, because your problem is too broad and environment-dependent. Using a Wireshark sniffer you can easily determine which part of the setup is not working correctly, and then look for a solution around the actual problem.
If your question is about how to do that, I would suggest not involving Node on the server side and using .NET or Java instead.
The problem with your code is that you are connecting your client socket to the Redis default port 6379 rather than the Node port, which is 3000.
So in your blade view change var socket = io('http://localhost:6379'); to var socket = io('http://localhost:3000');
Have you tried listening to the Laravel queue from the command line before firing the event?
php artisan queue:listen

Update all clients using Socket.io?

Is it possible to force all clients to update using socket.io? I've tried the following, but it doesn't seem to update other clients when a new client connects:
Serverside JavaScript:
I'm attempting to send a message to all clients containing the current number of connected users. It correctly sends the number of users... however, the client itself doesn't seem to update until the page has been refreshed. I want this to happen in realtime.
var clients = 0;
io.sockets.on('connection', function (socket) {
    ++clients;
    socket.emit('users_count', clients);
    socket.on('disconnect', function () {
        --clients;
    });
});
Clientside JavaScript:
var socket = io.connect('http://localhost');
socket.on('connect', function() {
    socket.on('users_count', function(data) {
        $('#client_count').text(data);
        console.log("Connection");
    });
});
It's not actually sending an update to the other clients at all; instead, it's just emitting to the client that just connected (which is why you see the update when you first load).
// socket is the *current* socket of the client that just connected
socket.emit('users_count', clients);
Instead, you want to emit to all sockets
io.sockets.emit('users_count', clients);
Alternatively, you can use the broadcast function, which sends a message to everyone except the socket that starts it:
socket.broadcast.emit('users_count', clients);
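As a sketch of the fixed server side using io.sockets.emit (a socket.io 1.x+ standalone server is assumed, the port is illustrative, and the extra emit on disconnect is my addition so the count also updates when someone leaves):

var io = require('socket.io')(3000);
var clients = 0;

io.sockets.on('connection', function (socket) {
    ++clients;
    // push the new count to every connected client, not just the one that connected
    io.sockets.emit('users_count', clients);

    socket.on('disconnect', function () {
        --clients;
        io.sockets.emit('users_count', clients); // keep the remaining clients in sync
    });
});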
I found that using socket.broadcast.emit() will only broadcast to the current "connection", whereas io.sockets.emit will broadcast to all the clients.
Here the server is listening on "two connections", which are actually two socket namespaces:
io.of('/namespace').on('connection', function (socket) {
    socket.broadcast.emit("hello");
});
io.of('/other namespace').on('connection', function (socket) { /*...*/ });
I have tried to use io.sockets.emit() in one namespace, but it was received by the client in the other namespace. However, socket.broadcast.emit() will only broadcast within the current socket's namespace.
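A small sketch of that namespace scoping (the namespace names are made up for illustration; socket.io 1.x+ is assumed):

var io = require('socket.io')(3000);

io.of('/chat').on('connection', function (socket) {
    // reaches everyone in /chat except the sender
    socket.broadcast.emit('hello', 'from another /chat client');
    // reaches everyone in /chat, including the sender; other namespaces are untouched
    io.of('/chat').emit('hello', 'to all of /chat');
});

io.of('/news').on('connection', function (socket) {
    // clients connected to /news never see the /chat messages above
});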
As of socket.io version 0.9, "emit" no longer worked for me, and I've been using "send"
Here's what I'm doing:
Server Side:
var num_of_clients = io.sockets.clients().length;
io.sockets.send(num_of_clients);
Client Side:
ws = io.connect...
ws.on('message', function (data) {
    var sampleAttributes = data.split(',');
    if (sampleAttributes[0] == "NumberOfClients") {
        console.log("number of connected clients = " + sampleAttributes[1]);
    }
});
You can follow this example for implementing your scenario.
You can have all clients join a common room for receiving updates.
Every socket can join the room like this:
currentSocket.join("client-presence"); // the room can have any name
Then you can have a clients key in your sockets which contains multiple clients' data (id and status), and if one client's status changes you can receive a change event on the socket like this:
socket.on('STATUS_CHANGE', () => emitClientsPresence(io, namespace, currentSocket)); // event name should be the same on client & server side for emitting and catching
and now you want all other clients to get updated, so you can do something like this:
const emitClientsPresence = (io, namespace, currentSocket) => {
    io.of(namespace)
        .to('client-presence')
        .emit('STATUS_CHANGE', { id: "client 1", status: "changed status" });
};
This will emit the STATUS_CHANGE event to all sockets that have joined the "client-presence" room, and you can then catch the same event on the client side and update the other clients' status.
According to this Broadcasting.
With nodejs server, you can use this:
io.emit('event_id', {your_property: 'your_property_field'});
Be sure to initialise websocket, for example:
var express = require('express');
var http = require('http');
var app = express();
var server = http.Server(app);
var io = require('socket.io')(server);

app.get('/', function (req, res) {
    res.send('Hello World!');
    io.emit('event_hello', {message: 'Hello Socket'});
});

server.listen(3000, function () {
    console.log('Example app listening on port 3000!');
});
In this case, when a user reaches your server, an "event_hello" event will be broadcast to all websocket clients with the JSON object {message: 'Hello Socket'}.
