Socket.io emitting duplicate events - javascript

I'm emitting games from a .json file and it all works fine. But when I restart my JS code, it emits the games again:
socket.on('getgames', function (data) {
    socket.emit('updategames', { games: games });
});
This is supposed to create a div with class="game-:gameid", and it works fine until either there's a connection error or I restart my JS code. Then it creates duplicate divs and emits twice; I don't want duplicates.

This problem occurs when you call:
socket.on('getgames', function (data) {
    socket.emit('updategames', { games: games });
});
multiple times. This causes socket.io to subscribe to your 'getgames' event multiple times.
My guess is that you call this code every time you restart your game. To fix it, either unsubscribe from the event before re-subscribing, or only ever call the subscription code once.

What happens when an onCreate event happens again while another Firebase function is already running?

Background
I have a Firebase Cloud Function which sometimes can take a while to complete (20-60 seconds).
It gets triggered by a write to Firebase, which means that it starts processing at the onCreate event.
Problem
I have been doing some scripted testing by creating a new record in Firebase every N seconds, but it seems that if N is less than 20 seconds, the next onCreate trigger just doesn't fire.
In other words I end up in a situation like this:
Firebase:
record1
record2
record3
record4
Results written by the triggered function to another node in Firebase:
result-from-record1
...
record2, record3 and record4 do not seem to trigger the function again.
Homework
I have re-checked Firebase documentation, but I cannot seem to find any information that explains this case.
There is some information about quotas for connected users, but it's only about connected users, not about the same triggers firing many times before the previously triggered function completes.
Questions
What is the default behavior of Firebase triggered functions in case it gets triggered while the previously triggered function is still running?
Is there any way to maybe cancel the running function if it gets triggered by a new onWrite?
Is there any queue of those triggered and running functions? (this queue doesn't seem to be the one)
What is the default behavior of Firebase triggered functions in case it gets triggered while the previously triggered function is still running?
There is no guarantee about how functions are invoked - they could happen in sequence on a single server instance, or they could run in parallel on multiple server instances. The order of invocation of functions is also not guaranteed.
Is there any way to maybe cancel the running function if it gets triggered by a new onWrite?
No.
Is there any queue of those triggered and running functions? (this queue doesn't seem to be the one)
There is no visible queue. Internally, Cloud Functions is using pubsub to manage the stream of events emitted by the database, but this is an implementation detail, and you have no direct control over how it works.
As for why your function doesn't seem to execute when you expect - there's not enough detail in your question to make a guess. Without seeing actual code, as well as the specific steps to take to reproduce the issue, it's not possible to say.
You might want to watch my video series on how Cloud Functions works in order to better understand its behavior.
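Because invocations can overlap or arrive out of order, it usually pays to make the triggered work idempotent, keyed on the triggering record's ID. A minimal plain-Node sketch of the idea (the `results` map and the `handleCreate` function are hypothetical stand-ins for your Firebase result node and trigger handler, not the Firebase API):

```javascript
// Hypothetical in-memory stand-in for the "results" node in Firebase.
const results = new Map();

// Keyed on the triggering record's ID, so running the handler twice
// (parallel instances, retries, redeliveries) cannot double-write.
function handleCreate(recordId, payload) {
    if (results.has(recordId)) {
        return results.get(recordId); // already processed — no-op
    }
    const result = { from: recordId, value: payload.toUpperCase() };
    results.set(recordId, result);
    return result;
}

handleCreate('record1', 'abc');
handleCreate('record1', 'abc'); // duplicate delivery is harmless
console.log(results.size); // 1
```

In real Cloud Functions code the "have I processed this?" check would be a read/transaction against the database rather than an in-memory map, but the shape of the guard is the same.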

Debugging why socket.on is not being hit

This issue is occurring on the client side of my application.
I am running into an issue with the socket.on event not being hit in my Node application. This usually happens when the site is loaded for the first time by a client. Normally the tabs contain data that is dynamically generated in JavaScript, but since the socket.on event is not being hit, this content does not load and the tabs are blank (the base Pug/Jade file still renders, just with no dynamic content). Refreshing the page usually fixes the issue and the sockets start receiving data again.
There is no direct error message; the socket.on event is simply never hit. The problem also tends to resolve itself once the page is reloaded, and it does not occur 100% of the time.
What makes this even more unusual is that when running my Node server in debug mode, it shows socket.io-parser encoding the data and socket.io-client writing the data, meaning the data is being emitted, just not picked up by socket.on on my client side.
Note that this has been simplified down:
io.sockets.on('connection', function (client) {
    sendExampleData();
});

function sendExampleData() {
    mysqlFunc.getOrderData(function (rows) {
        io.emit('example_data', rows);
    });
}

socket.on('example_data', function (rows) {
    console.log(rows);
});
I expect that on page load the data is emitted and the socket.on event receives it, ready to be processed. Currently, although not every time, socket.on is not being hit and the data is not grabbed.
There is nothing out of the ordinary while running the server in debug mode, and any ideas would be extremely helpful. If there are any other bits of information that would be useful in debugging this issue, please let me know.
Thanks in advance!
I finally managed to figure out what the issue was.
My socket.on event listeners were inside my $(document).ready and were not being registered in time on first page load. I'm assuming the listeners were not defined early enough, so by the time I socket.emit'd on connection, they had not actually been attached.
I moved all of the logic out of $(document).ready and moved my event listeners into a connect event (thanks for the tip, Marc).
socket.on('connect', function () {
    socket.on('data_1', funcOne);
    socket.on('data_2', funcTwo);
    socket.on('data_3', funcThree);
    socket.on('data_4', funcFour);
    socket.on('data_5', funcFive);
});
This fixed the issue and now socket.on is being hit every time on page load.
Thanks for the help guys and hopefully this can help someone also having this issue!

Keep Node Server running "watching" for things to occur

I think this question is due to a lack of understanding of Node, but I'm working on creating a motion sensor with the Raspberry Pi and Node.
I don't understand how to keep my Node server running. I can get it to work as intended using setInterval, but I don't think this is how I should be doing it.
Basically, I want to be able to start the program with node index.js and have it keep watching the GPIO pins the sensor is connected to in case something happens. If something happens, it does something, but keeps watching the sensor in case more happens.
What I have done to keep it running is similar to this:
var foo = require('require necessary things up here');

setInterval(function () {
    // code for detecting sensor stuff here
}, 1000);
This works, but I don't think it's the right way to do it.
If I do something like the code below, it just executes the function, logs to the console, and exits without watching for changes.
var foo = require('require necessary things up here');

function checkForSensorStuff() {
    // code for detecting sensor stuff here
    console.log('checking stuff');
}
How can I keep the server running so that it just continually watches for changes in a function without using setInterval?
A node process exits when it has nothing else left to do. You start a node process by running a startup script line by line. When that script finishes executing, if there's nothing else to do (no timers, no open sockets listening for incoming connections, etc...), then it shuts down because there's nothing left that could cause an event and cause some action on the server.
So, if you want your server to continue running, you have to give it some way for future events to occur. As you've discovered, a recurring timer is one way to do that. But there should be no reason to use a timer purely to keep your server running; instead, configure something in your server that will trigger events in the future. Only if you have nothing that will cause future events might you need setInterval() to regularly poll some status and decide whether there's something waiting to do.
If you're trying to monitor GPIO status on your Raspberry Pi in node.js, you can use the pigpio library and it will offer an event driven way of watching for GPIO changes. This should automatically keep your server running.
Another option for getting events upon GPIO changes is the onoff library.
I have a Raspberry Pi being used as a temperature controller that reads two GPIO temperature probes. I'm just using a setInterval() timer to poll the temperature readings every 10 seconds and that works just fine too.
I don't understand how to keep my node server running
You need to open a handle to a resource like a socket or a file. If there are no active handles open, node exits. In your example, setInterval creates a timer handle, which prevents node from exiting.
see also process._getActiveHandles()
https://github.com/myndzi/wtfnode
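The "active handles keep the process alive" rule can be seen directly with a timer handle: while it is referenced, node will not exit, and unref() tells node the handle should not count toward keeping the process running (hasRef() reports the current state):

```javascript
// A timer is one kind of active handle; while it is referenced,
// it keeps the Node process from exiting.
const handle = setInterval(function () { /* poll the sensor here */ }, 1000);
const refBefore = handle.hasRef();   // this handle keeps node alive

// unref() marks the handle as not counting toward keeping the process
// alive; with no other handles open, node would now exit on its own.
handle.unref();
const refAfter = handle.hasRef();

clearInterval(handle);               // remove the handle entirely
console.log(refBefore, refAfter); // true false
```

The GPIO watchers in pigpio and onoff keep the process alive the same way: the library holds an open handle on your behalf, so no keep-alive timer is needed.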

duplicates in socket.io w/ angular

Using Angular and socket.io, I am getting duplicate events on the client every time the server emits. Angular is only included once, there is only one socket.io connection, and there is only one listener per event on the client. Upon receiving an event on the server, data is logged, and this only ever happens once. Then the data is emitted, and the callback is called twice on the client, despite only being in scope once (to my knowledge).
client:
// inside a controller
var thing = 'foo';
socket.emit('sentUp', thing);
socket.on('sentDown', function (thing) {
    console.log(thing); // this happens twice
});
server:
/*
node & express stuff here
*/
io.on('connection', function (socket) {
    socket.on('sentUp', function (stuff) {
        console.log('this happened'); // x1
        socket.emit('sentDown', stuff);
    });
});
Most likely your controllers are being loaded more than once. You can easily check this by logging.
Move the socket code out of the controllers and into a service, where it will only be run once.
I have found in my own socket.io client code that some of the connect events can occur again each time the client reconnects. So, if the connection is lost for any reason and the client then automatically reconnects, the client may get another connect event.
If, like me, you're adding your event handlers in the 'connect' event, then you may be accidentally adding multiple event handlers for the same event, and thus you would see what looks like duplicate data. You don't show that part of your client code, so I don't know whether you're doing it that way, but this is an issue that hit me and it is a natural way to do things.
If that is what is happening to you, there are a couple possible work-arounds:
You can add your event handlers outside the connect event. You don't have to wait for connection to finish before adding event handlers. This way, you'd only ever do them once.
Before adding the event handlers you add upon connection, you can remove any previous event handlers that were installed upon connection to make sure you never get dups.

Meteor.js, realtime latency with complex publish/subscribe rules?

I'm playing with realtime whiteboards in Meteor. My first attempt worked very well: if you open 2 browsers and draw in one of them, the other one updates within a few milliseconds ( http://pen.meteor.com/stackoverflow ).
Now, my second project is an infinite realtime whiteboard. The main change is that all lines are now grouped by zones, and the viewer only subscribes to the lines in the visible zones. Now there is a delay of 5 seconds (!) between doing something in one browser and seeing it happen in the other ( http://carve.meteor.com/love ).
I've tried adding indexes in the Mongo database for the fields determining the zones.
I've tried updating the collection only for a full line (and not each time I push a new point, as in my first project).
I've tried adding a timeout so as not to subscribe too often when scrolling or zooming the board.
Nothing changes; there is always a 5-second delay.
I don't have this delay when working locally.
Here is the piece of code responsible for subscribing to the lines in the visible area:
subscribeTimeout = false;

Deps.autorun(function () {
    var vT = Session.get("visible_tiles");
    var board_key = Session.get("board_key");
    if (subscribeTimeout) Meteor.clearTimeout(subscribeTimeout);
    subscribeTimeout = Meteor.setTimeout(subscribeLines, 500);
});

function subscribeLines() {
    subscribeTimeout = false;
    var vT = Session.get("visible_tiles");
    console.log("SUBSCRIBE");
    Meteor.subscribe("board_lines", Session.get("board_key"), vT.left, vT.right, vT.top, vT.bottom, function () {
        console.log("subscribe board_lines " + Session.get("board_key"));
    });
}
I've been a SysAdmin for 15 years. Without running the code, it sounds like an imposed limitation of the meteor.com server. They probably put delays on resources so everyone gets a fair share. I'd publish to another server, like Heroku for an easy deploy, or manually to a server like Linode or my favorite, Joyent. Alternatively, you could try contacting meteor.com directly and asking if/how they limit resource usage.
Since the code runs fast/instantly locally, you should see sub-second response times from a good server over a good network.
