I have a React app that uses Socket.IO to send and receive all data. I created a global socket variable to be shared across all components...
export let gameSocket = null
export const update = (newSocket) => gameSocket = newSocket
I then set the socket in 'Home.jsx' and make a few calls...
update(io("ws://localhost:8888"))
socket = gameSocket
socket.on('...')
The problem arose when adding callbacks to these calls. The callbacks seem to be called a seemingly random (very large) number of times, increasing every time the socket is used. An example of this can be seen in these three handlers in 'Game.jsx'...
socket.on("question-update", (data) => {
  console.log("Calling question-update")
  const response = JSON.parse(data)
  setQuizData(response.data)
})

socket.on("point-update", (data) => {
  console.log("Calling point-update")
  const response = JSON.parse(data)
  setUsrData(response.data)
})

socket.on("personal-point-update", (data) => {
  console.log("Calling personal-point-update")
  const response = JSON.parse(data)
  setClientScore(response.data)
})
Whilst there is no evidence of the client spamming the server with requests, the console is flooded with messages, and the state is re-updated so many times that the app becomes unresponsive and crashes. Here's a screenshot of the console...
I don't know where I went wrong with my implementation and would love some advice, thanks.
Try to use socket.once("...", (data) => {}); instead.
The bug in your code is that each time the connection handler is called, Node registers another event listener for the event (say, 'Quest'). So the first time the handler is called, the number of event listeners for 'Quest' is one; the second time it is called, the number increases to two, and so on.

socket.once() ensures that the number of event listeners bound to a socket for the 'Quest' event is exactly one.
Make sure to keep all socket.on() calls within a useEffect, and remove the listeners in its cleanup function, so re-renders don't register duplicates:

useEffect(() => {
  socket.on('...', handler)
  return () => socket.off('...', handler)
}, [])
Related
I have an issue where calling socket.emit() right after instantiating the socket object does not do anything, e.g. calling the emit from the constructor when the React app loads.

But when I attach the same emit to a click event, like clicking a button, it works correctly. Is this because of some asynchronous process inside the socket.io module?
For example:
socket = io(<someurl>);
socket.emit('something') // this does not work
but calling the same thing inside a click event fires the emit.
Yes. The socket is not connected yet, even if just for a few ms. Try sending that initial emit when connected:
const socket = io(); // with your parameters

socket.on('connect', () => {
  socket.emit('something');
});
Here's a list of events.
When I look at tutorials/documentation about WebSockets I find code like this:
var ws = new WebSocket("ws://localhost:8765/dlt");
ws.onopen = () => {
  // do some very important stuff after connection has been established
  console.log("onopen");
}
But what about race conditions here? Are they somehow avoided in JavaScript?

For example, this code (which assigns onopen only after the connection may already have been opened) will fail:
var ws = new WebSocket("ws://localhost:8765/dlt");
setTimeout(() => {
  ws.onopen = () => {
    // do some very important stuff after connection has been established
    console.log("onopen"); // <== won't be called
  }
}, 100);
Can I be sure that the assignment has been done before the connection gets established?
(I tried to extend WebSocket with a custom onopen() method but this doesn't seem to work)
class MyWebSocket extends WebSocket {
  onopen() {
    console.log("onopen()");
    // do some very important stuff after connection has been established
  }
}
You should have a read about JavaScript's event loop: https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop#Event_loop
If you look at the section about Run-to-completion, you get this useful explanation:
Each message is processed completely before any other message is processed. This offers some nice properties when reasoning about your program, including the fact that whenever a function runs, it cannot be pre-empted and will run entirely before any other code runs (and can modify data the function manipulates). This differs from C, for instance, where if a function runs in a thread, it may be stopped at any point by the runtime system to run some other code in another thread.
So in your example, the assignment to ws.onopen must be completed before the websocket does anything asynchronous in nature. By putting your assignment inside setTimeout, you are moving it outside of the currently running context, and so it may not be executed before it is required by the websocket.
You can rest assured that the example is OK. The JavaScript event loop will finish the current task before taking on any other tasks. This means that 1) the WebSocket cannot open the connection (an async operation) before the onopen handler is assigned, and 2) the onopen event handler will be called during a following cycle.

Setting the timeout, on the other hand, complicates matters, because the events will be called in some order after the current task. This means that the WebSocket has a chance to open the connection before the handler has been set.
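Run-to-completion is easy to verify: callbacks queued with setTimeout or a promise cannot run until the current synchronous code has finished. A minimal sketch:

```javascript
const order = [];

// Both callbacks are queued; neither can preempt the running code.
setTimeout(() => order.push('timeout'), 0);
Promise.resolve().then(() => order.push('microtask'));

order.push('sync'); // the current run always completes first

// At this point the queued callbacks have NOT run yet:
console.log(order); // [ 'sync' ]
```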
I'm wondering if there is a way to manually cause a simple-peer Peer object to fire the signal event that it fires when an initiating peer is created.

More precisely, what I want to do is to create Peer objects with the initiator property set to false and then manually turn them into initiating peers.

Is this possible, or should I look elsewhere?
Update 1:
I tried to create a non-initiating peer and then set initiator to true via a function, but that did not trigger the signal event.
so basically this:
let p = new SimplePeer();

p.on('signal', function (data) {
  console.log('SIGNAL', JSON.stringify(data))
});

const turnOn = function () {
  p.initiator = true;
};

turnOn();
*This is not the actual code, just the parts that have to do with the question.
I have a ReactJS app with Firebase as the database/backend.

There are 1000 items that are synced with Firebase, and I get new data with
firebase.database().ref().child('path').on('value', snapshot => {...})
During normal usage, users change data in that list and it updates for everyone, and all is well. The ReactJS side is well optimised; it all works without lag.

The problem is when I run a cronjob updating all those items with some 3rd-party data: every user gets the on event fired by Firebase 1000 times, one after the other. The browser freezes, with RAM usage around 2-3GB (normal usage ~200MB).

There are 2 options I'm thinking about; maybe you could add something:

1. Make the cron update step by step, 1 item per second, so those 1000 on events fire over a 15-minute timespan, and just run the cron forever: when it's done updating the last item, start again.
2. Make an abstraction layer for the Firebase connection, and if the on event fires more than ~5 times per second, disconnect with off and re-connect with on after a couple of seconds, so I would be disconnected during batch updates.

I like #2 more, because then I can do whatever is needed with the cronjob, and it also solves any possible future issues with batch updates.
It seems like your browser process is getting overloaded by the number of items that are queued up by the batch processing.
The best solution is probably to limit the amount of data that the browser process pulls in from Firebase to something it can handle:
firebase.database().ref().child('path')
  .limitToFirst(10)
  .on('child_added', snapshot => {...})
This way your browser will never see more than 10 items.
You'll note that I switched from a value event to child_added, which will probably also help. If you have a queue, you should probably also handle child_removed to remove the item from the browser UI. So you'll end up with something like:
var query = firebase.database().ref().child('path').limitToFirst(10)

query.on('child_added', snapshot => {
  // add item to UI
})

query.on('child_removed', snapshot => {
  // remove item from UI
})
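The child_added / child_removed pair keeps a plain list in sync with the query window. A minimal sketch with an array standing in for the UI list (no Firebase needed; the handler bodies are what the snapshot callbacks above would do):

```javascript
// A plain array stands in for the UI list that the
// child_added / child_removed handlers keep in sync.
const ui = [];

// What .on('child_added', ...) would do:
const onChildAdded = (value) => { ui.push(value); };

// What .on('child_removed', ...) would do, fired e.g. when an
// item falls out of the limitToFirst(10) window:
const onChildRemoved = (value) => {
  const i = ui.indexOf(value);
  if (i !== -1) ui.splice(i, 1);
};

onChildAdded('item-a');
onChildAdded('item-b');
onChildRemoved('item-a');

console.log(ui); // [ 'item-b' ]
```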
I know that for most cases it would be possible to limit results with limitToFirst() as Frank van Puffelen suggested, but the problem is that I need all the data available in the UI for all kinds of selections and auto-completions, not to mention fast data filtering without an extra call to the server.
Current solution:
I listen for all incoming events from Firebase; if the buffer overloads, I just use goOffline and re-connect after some time. The client doesn't see any difference, because goOffline still fires local events normally and the whole UI keeps working. Read more here: https://firebase.google.com/docs/reference/js/firebase.database.Database#goOffline

I add an event listener for each child in Firebase. I'd like to have one global listener, but it doesn't work for me; there is a related question: How to listen for ALL firebase .on('value') events on root node?
let database = window.firebase.database();

// go offline if too many events per second come in from firebase
let events_buffer = 0;
let online = true;

const go_online = () => {
  if (events_buffer > 6) {
    setTimeout(() => {
      go_online();
    }, 3000);
  } else {
    online = true;
    database.goOnline();
  }
};

const handle_buffer = () => {
  if (events_buffer > 6) {
    if (online) {
      online = false;
      database.goOffline();
      setTimeout(() => {
        go_online();
      }, 5000);
    }
  }
};

setInterval(() => {
  events_buffer = events_buffer > 0 ? events_buffer - 1 : events_buffer;
}, 500);

// listen for all events. Connected: https://stackoverflow.com/questions/38368328/how-to-listen-for-all-firebase-onvalue-events-on-root-node
database.ref().child('clients').on('value', () => {
  events_buffer++;
  handle_buffer();
});

database.ref().child('users').on('value', () => {
  events_buffer++;
  handle_buffer();
});

//
// ... add listeners for all children
//

export const firebase = window.firebase;
https://gist.github.com/liesislukas/0d78b6ac9613b70bafbb8e1601cc740e
For the max buffer value I use 6, because on initial load the app requires ~3-4; 6 can only be reached with some anomaly in my app.
I've happily written a Node.js server which uses socket.io to communicate with the client.

This all works well.

The socket.on('connection', ...) handler got a bit big, which made me think of an alternative way to organize my code: adding the handlers in a generator function like this:
sessionSockets.on('connection', function (err, socket, session) {
  control.generator.apply(socket, [session]);
});
The generator takes an object that contains the socket events and their respective handler functions:
var config = {
  // handler for event 'a'
  a: function(data){
    console.log('a');
  },
  // handler for event 'b'
  b: function(data){
    console.log('b');
  }
};

function generator(session){
  // set up socket.io handlers as per config
  for(var method in config){
    console.log('CONTROL: adding handler for ' + method);
    // 'this' is the socket; generator is called in this way
    this.on(method, function(data){
      console.log('CONTROL: received ' + method);
      config[method].apply(this, data);
    });
  }
}
I was hoping that this would add the socket event handlers to the socket, which it kind of does, but when any event comes in, it always calls the latest handler added; in this case, always the b function.

Does anyone have any clues about what I am doing wrong here?
The problem appears because by the time the this.on callback triggers (say, a few seconds after you bind it), the for loop has finished and the method variable holds its last value.
To fix that you may use some JavaScript magic:
// set up socket.io handlers as per config
var socket = this;
for(var method in config){
  console.log('CONTROL: adding handler for ' + method);
  (function(realMethod) {
    socket.on(realMethod, function(data){
      console.log('CONTROL: received ' + realMethod);
      config[realMethod].apply(this, data);
    });
  })(method); // declare a function and call it immediately, passing the current method
}
This "magic" is hard to understand when you first see it, but once you get it, things become clear :)