I'm new to Node.js and discord.js; the previous version of this bot was written in discord.py (now deprecated).
This function iterates through all the webhooks (id and token) in my SQL database and sends a message to each of them. There are about 1500 of them (one per server), and I have to send a message roughly every 5 seconds. This worked perfectly in the Python version, but that only had to run on about 300 guilds. I no longer have that code, but it worked the same way (all the requests were sent at once, roughly 300 requests in 500 ms, without problems), so I don't think this is a rate-limit issue.
client.on('messageCreate', (message) => {
    if (message.channelId === '906659272744665139') {
        console.log('found message');
        const url = message.content;
        var webhooklist = [];
        // get all the webhooks from the database
        db.prepare('SELECT * FROM webhooks').all().forEach(webhook => {
            const webhookclient = new WebhookClient({ id: webhook.webhookID, token: webhook.webhookToken });
            webhooklist.push(webhookclient);
        });
        console.time('sent');
        var failed = 0;
        webhooklist.forEach(webhook => {
            var row = new MessageActionRow()
                .addComponents(savebutton, reportbutton);
            webhook.send({ content: url, components: [row] })
                .catch(err => {
                    if (err instanceof DiscordAPIError) {
                        if (err.code === 10015) { // Unknown Webhook
                            // remove the webhook from the database
                            db.prepare('DELETE FROM webhooks WHERE webhookID = ?').run(webhook.id);
                            console.log(`Removed webhook ${webhook.id}`);
                        } else {
                            failed += 1;
                            console.log(err);
                        }
                    } else {
                        failed += 1;
                    }
                });
        });
        // note: these run before any of the (asynchronous) sends have settled
        console.timeEnd('sent');
        console.log(failed);
    }
});
The problem:
A request is sent to each webhook, but about half the messages never actually get sent. For example, of 3 different servers this bot is in, a message appeared in one and not in the other two (every other combination of these outcomes also occurs; it's not a problem with how the servers are set up). There are also no errors, as indicated by the failed variable. To clarify: about 50% of the messages get through, while the other 50% are supposedly sent by the bot but never appear in the channel.
Things that are (most likely) NOT the issue:
-Discord rate limits
Why: Sending messages through webhooks does not count against the bot's sent messages, so it will not trigger the global limit of 50 messages per second. The only way this could get me rate-limited is if I exceeded the 5-per-5-seconds limit on each individual webhook (the same thing that happens when you try to spam a single channel).
-API rate limits
Why: The API rate limit only trips if I send more than 10,000 invalid requests in 10 minutes. All of the requests go through, and there are no errors in the console. If I were being API rate-limited, I would be completely blocked from the Discord API for up to an hour.
-I'm setting off some sort of spam protection
Why: Since most of the messages get through, I don't think this is the problem. If I had tripped any filters based on request volume, the requests would probably either time out or I would be blocked outright.
Other notes:
-A delay between requests is not a viable solution: almost any amount of delay, multiplied by the 1500 times this has to run, would make this function take several minutes.
-This could be related to another issue I'm having, where buttons take so long to respond that interactions often time out before I can even run .deferReply(). However, none of the webhook requests time out, which suggests this may not be the cause.
-My internet has seemed slow recently even though I have gigabit, but again, if the connection were the issue there would be errors.
In conclusion, a large volume of webhook messages like this SHOULD work, so this issue is most likely client-side.
Found the issue: the large volume of webhook messages sent in such a short time slowed my internet to a crawl, so many of the requests never went through. Because of the way I wrote this function, those errors were never logged correctly. I fixed it by using await instead of .then(), so the failures actually surface.
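That fix can be sketched like this (an illustration rather than my exact bot code: `sendAll` stands in for the webhook loop, and `send` here is any promise-returning call, such as discord.js's `webhook.send`):

```javascript
// Await every send and count rejections, instead of attaching .catch()
// handlers and reading `failed` before any of them have had a chance to run.
async function sendAll(webhooks, payload) {
  // Promise.allSettled waits for all sends, fulfilled or rejected
  const results = await Promise.allSettled(webhooks.map(w => w.send(payload)));
  // counted only after every request has settled, so it is accurate
  return results.filter(r => r.status === 'rejected').length;
}
```

With the original forEach + .catch version, console.log(failed) runs while every request is still in flight, which is why the failures never showed up.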
Related
I'm making a multiplayer game, and the way I handle player movement is that whenever you press a key, that key is sent to all the other users in your lobby, who then update your position according to the key you pressed. For example, if I press the 'W' key, it is sent to everyone and they all move my character forward. I'm doing it this way to save bandwidth and eliminate as much lag as possible. However, this causes two problems.

The first is that the clients don't receive the keycode at the same time. When I call Date.now() in JavaScript on receiving a key, it is about 1 ms off from the other clients, which causes roughly a 3-pixel gap from where the character is supposed to be. I've already implemented delta time so movement looks the same at all framerates.

The second problem is that I'm trying hard to avoid storing player positions on the server unless necessary. That means that if the players need to agree on the same position, the server can't supply it. To work around this, when I need a position update (whether because I was off the tab and missed a key, or because my positioning has drifted), the client asks another client for its current player positions. This only works when at least one client other than the asker is on the tab.
I've tried using a setInterval to continuously snap the player positions into agreement, but that made the players clip all over the place. I've also tried hosting the player positions on the server, but it lags a lot and won't hold up with 1000 people on a server.
For the client side I use p5.js.
This is the code that sends the server the key I pressed, whenever I press a key:
function keyPressed() {
    if (gameStarted) {
        if (keyCode === 122) { // F11
            return false;
        }
        if (currentKey.key != null) {
            currentKey.key = keyCode;
            currentKey.on = true;
            socket.emit('newKeyCode', { key: currentKey.key, on: true });
        } else {
            currentKey.key = keyCode;
            currentKey.on = true;
            socket.emit('newKeyCode', { key: currentKey.key, on: currentKey.on });
        }
    }
}

function keyReleased() {
    if (gameStarted) {
        if (currentKey.key != null) {
            currentKey.key = keyCode;
            currentKey.on = false;
            socket.emit('newKeyCode', { key: currentKey.key, on: currentKey.on });
        }
    }
}
This is the server code that runs whenever a client reports a new key press:
socket.on('newKeyCode', function(data) {
    var lobby = LOBBY_DATA[PLAYER_LIST[socket.id].lobby];
    if (lobby != null && lobby != undefined) {
        // relay the key event to every player in the lobby
        for (var i in lobby.players) {
            SOCKET_LIST[lobby.players[i].id].emit('newKeyFromClient', {
                id: socket.id,
                name: PLAYER_LIST[socket.id].user,
                key: data.key,
                on: data.on
            });
        }
    }
});
And this is the client code that runs when I get a new key from the server:
socket.on('newKeyFromClient', function(data) {
    socket.emit('receivedKey');
    console.log(Date.now());
    if (gameStarted) {
        changePlayerDirections(data.key, data.on, data.id);
    }
});
My goal is for the clients to have exactly the same player positioning; you can see for yourself at My game. Once you register and log in, click the Play button in the top left, then duplicate the tab 3 more times so you have a total of 4, create a new lobby in one of them, and split your screen into 4 windows so you can see all of them at once; then join your lobby in all the windows. Once there are 4 people in your lobby, click Start Game in your host window. This puts you into the game, where you can move around with WASD and you'll see the movement on the other clients' screens. However, it's going to be a bit off, and that's the problem: I want it to be dead-on and exact on every single screen.
EDIT:
Sorry if there are rapid changes in the server, I'm working on it right now.
I guess you have already read articles on multiplayer networking: UDP vs. TCP, frame rates for sending and receiving data, simulating the game on the server and syncing, etc. I will just share my experience.

I made a multiplayer game in exactly this way (using node.js and socket.io, sending key events). Testing on LAN was all smooth. Then I tested it over wifi and it was a disaster. So I sent positions instead, and the players started jumping around; worse, updates became random and slower.

I read a few articles on multiplayer and decided to use UDP to update positions. It got fast, and over wifi it was only negligibly jumpy. But once I deployed to a server, the ports started changing. I learned about NAT traversal and related techniques, but it was too much to implement myself. One thing I found was that at least the port from the server was constant: '3000'.

Then I reverted to TCP and tried simulating the game on the server and syncing with the client: I sent key events and, every third frame, synced positions from server to client. It wasn't much better, and I still had to sync the game logic. That's where I stopped.

There is one approach I still want to try but have been putting off: send key events to the server over UDP, simulate on the server, and sync over TCP, this time using a timestamp to decide which value prevails (server or client), while reducing the data sent to an array of integers only.
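That timestamp-plus-integer-array idea can be sketched roughly like this (purely illustrative; the field layout `[playerId, x, y, t]` is my assumption, not a standard):

```javascript
// Pack a position update as a flat integer array to keep the payload small.
// Math.trunc keeps values integral without the 32-bit overflow that |0
// would cause on millisecond timestamps.
function packUpdate(playerId, x, y, t) {
  return [Math.trunc(playerId), Math.trunc(x), Math.trunc(y), Math.trunc(t)];
}

// When the server and client disagree, let the newer timestamp (index 3)
// prevail, with the server winning ties.
function reconcile(serverUpdate, clientUpdate) {
  return serverUpdate[3] >= clientUpdate[3] ? serverUpdate : clientUpdate;
}
```

The same idea scales to any state field: whoever produced the most recent snapshot wins, so stale client predictions get overwritten by fresher server data and vice versa.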
I'm making a 2D game where connected players walk around a map, so I need fast updates of where everyone is.
Every 17 milliseconds (17 because 1000/60 ≈ 60 fps) I send a multidimensional array containing all the characters on the map.
The problem is that a few seconds after one person connects, the server simply stops printing the console.log I programmed and does nothing else; no warning appears, and it does not respond if you try to access it directly over the internet.
Is it suffering from overload? I tried increasing the delay to 500 milliseconds, and even then it crashed as soon as 2 people joined.
If it really is overload, what would I have to do to make my game work?
Observation: for the character's movement system, the client asks the server to change the character's position in the array according to the arrow key direction.
code:
Server.js:
let jogadoresSala = [[]];
io.on('connection', socket => {
    // code that runs when a person connects
    socketIds.push(socket.id);
    pessoasConectadas++;
    console.log('new connection: ' + socket.id);
    socket.emit('voceEstaConectado', pessoasConectadas);
    socket.join('jogadores');
    if (pessoasConectadas == 1) {
        jogadoresSala[0] = [Math.floor(Math.random() * 300),
            Math.floor(Math.random() * 300), socket.id];
        setInterval(function () {
            console.log('sending..');
            io.in('jogadores').emit('sincronizacao', jogadoresSala);
        }, 17); // where is the loop
    } else {
        jogadoresSala.push([Math.floor(Math.random() * 300),
            Math.floor(Math.random() * 300), socket.id]);
    }
});
cliente.js
(it works for a few seconds; if the client does not move, it can last a few minutes):
socket.on('sincronizacao', posicoesPersonagens => {
    console.log(posicoesPersonagens);
    for (let i = 0; i < posicoesPersonagens.length; i++) {
        personagem.setarPosicao(posicoesPersonagens[i][0], posicoesPersonagens[i][1]);
    }
});
One problem is that you have this code:
setInterval(function () {
    console.log('sending..');
    io.in('jogadores').emit('sincronizacao', jogadoresSala);
}, 17); // where is the loop
inside of:
io.on('connection', ...)
That means you start a new interval that broadcasts to all jogadores every 17ms each time the user count goes from 0 back to 1. I know you're trying to run this only for the first user, but if the count goes 1 user, 0 users, 1 user, 0 users, you will start this interval multiple times.
So, for starters, you need to move the setInterval() outside the io.on('connection', ...) callback. Just put it at the top level once your server is started. If there are no clients connected, then it won't have anything to do because there will be no connections in the jogadores room, so no problem.
Then, 17ms (60fps) is probably too fast for a real world scenario over the internet with lots of clients. What speed will work will depend upon your server configuration, your ISP and how many clients you expect to support with one server and you will ultimately have to understand your limits with testing at scale on your actual hardware. Real-time client-server multi-user gaming is not a trivial endeavor. There are lots of tricks that real-time multi-user gaming systems use to try to prevent lag and have timely updates. Going into the details of all the things that can be done is beyond the scope of an answer here. But, suffice it to say that you don't just power your way through it by trying to send 60fps updates to the client. That doesn't really work. You will need to make your user experience work at a far slower update rate.
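A minimal sketch of that fix (here `io` is a stand-in with socket.io's `io.in(room).emit(...)` shape, and the tick rate is slower than 17ms):

```javascript
// Create ONE broadcast interval at server startup, outside io.on('connection').
// Broadcasting to an empty room is harmless, so there is nothing to guard.
function startBroadcastLoop(io, jogadoresSala, intervalMs) {
  return setInterval(() => {
    io.in('jogadores').emit('sincronizacao', jogadoresSala);
  }, intervalMs);
}
```

You would call `startBroadcastLoop(io, jogadoresSala, 100)` once after the server starts; the connection handler then only mutates `jogadoresSala` and never touches setInterval, so connect/disconnect churn can no longer stack up extra timers.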
So you understand what I'm trying to do, I've written a web page which shows events logged in MySQL as soon as they are inserted into the database (basically monitoring Windows & Mac logon/logoff on the network). Every time a new event is inserted the PHP script connects to a web socket and sends a message to all connected browsers to notify them of the new event. When the browsers receive the notification message they run jQuery.get("liveeventsearch.php", ...); to fetch the new event from the database (see javascript code below).
In a nutshell, when a web socket message is received fetch the new record from the database, append the result to a table and play a notification sound.
socket.onmessage = function(msg) {
    if (msg.data == "#all new event") {
        var newLastUpdate = new Date().toISOString().slice(0, 19).replace('T', ' ');
        jQuery.get("liveeventsearch.php", <?php echo '{ Key:"' . $Value . '", etc... , lastupdate:lastUpdate }'; ?>, function(result) {
            jQuery('#LiveResults tbody').append(result);
            if (jQuery('#chkNotification-Audio').is(':checked') && result > "") {
                jQuery("#Notification-Audio").trigger("play");
            }
            lastUpdate = newLastUpdate;
        });
    }
};
My concern is that there are currently approx 1200 devices on the network, and most, if not all, of them are expected to log on/off in large clumps within a 5 to 10 minute window every hour, with a few more scattered here and there. So the browser (depending on the supplied search criteria) will likely receive a large number of web socket messages in a short period of time, if not simultaneously (and will obviously fetch liveeventsearch.php that many times). Is this likely to cause a problem for the browser, fetching results so frequently?
I can provide the code in liveeventsearch.php if necessary.
Alternate Methods
I had thought about adding something like this in the socket.onmessage function to reduce the frequency.
//[PSEUDO CODE]
if (currentTime > lastUpdate + 3 seconds) {
    jQuery.get(...);
}
But then the last batch of events would not appear until another web socket message arrived, which could be much longer than 3 seconds. I could possibly use a timer instead, but that rather defeats the purpose of having a web socket provide 'live' updates.
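One way around that trade-off is a throttle with a trailing call: fetch at most once per window, but if messages arrive during the quiet window, schedule one final fetch so the last events still show up. A sketch in plain JavaScript (`fn` would be the function wrapping the jQuery.get call):

```javascript
// Throttle with a trailing edge: the first call in a window runs immediately,
// and any burst inside the window collapses into a single call at its end.
function throttleWithTrailing(fn, waitMs) {
  let last = 0;        // timestamp of the last actual invocation
  let pending = null;  // timer for the trailing call, if one is scheduled
  return function () {
    const now = Date.now();
    if (now - last >= waitMs) {
      last = now;
      fn(); // leading call
    } else if (!pending) {
      pending = setTimeout(() => {
        pending = null;
        last = Date.now();
        fn(); // trailing call: catches the last burst of events
      }, waitMs - (now - last));
    }
  };
}
```

socket.onmessage would then invoke the wrapped function, e.g. `var fetchThrottled = throttleWithTrailing(fetchEvents, 3000);`, so no burst is ever silently dropped.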
Another option I thought of is to create a new MySQL table (e.g. liveUpdates) containing only an ID field, then run a cron job every X seconds which inserts a new ID into that table (or run a script on the server doing the same thing in a continuous loop). My events table could then have an additional field tying each event to the latest liveUpdates.ID, and the cron job could send the web socket message each time a new update ID was created, instead of every time an event is logged. But this again would have the same effect as using a timer.
Here is the code:
var process = require('process');
var c = 0;
while (true) {
    var t = process.hrtime();
    console.log(++c);
}
Here is my environment:
nodejs v4.2.4, Ubuntu 14.04 LTS on Oracle VM virtualbox v5.0.4 r102546 running in Windows 7
This loop only runs about 60k to 80k iterations before it hangs; nothing happens after that.
On my colleague's computer it's maybe 40k to 60k iterations. But shouldn't this loop continue forever?
I was originally running a benchmark that measures the average execution time of setting up connections, so I can't simply record the start time at the beginning and the end time after everything has finished.
Is this related to the OS I'm using?
Thanks if anyone knows the problem.
==========================================================
update 2016.4.13:
One day after I raised this question, I realized what a poor question it was, and that it was not what I really wanted to ask. So let me explain further.
Here is the testing structure:
I have a node server which handles connections. The client sends a 'setup' event on the 'connect' event. On the server side, a Redis subscription channel is created and some DB queries are made, and then the client's 'setup' callback is invoked. The client disconnects the socket in the 'setup' callback, and reconnects on the 'disconnect' event.
The client code uses socket.io-client to run in the backend and cluster to simulate high concurrency.
Codes are like these:
(some of the functions are not listed here)
[server]
socket.on('setup', function(data, callback) {
    queryFromDB();
    subscribeToRedis();
    callback();
});
[client]
var requests = 1000;
if (cluster.isMaster) {
    for (var i = 0; i < 100; ++i) {
        cluster.fork();
    }
} else {
    var count = 0;
    var startTime = process.hrtime();
    socket = io.connect(...);
    socket.on('connect', function() {
        socket.emit('setup', { arg1: '...', arg2: '...' }, function() {
            var setupEndTime = process.hrtime();
            calculateSetupTime(startTime, setupEndTime);
            socket.disconnect();
        });
    });
    socket.on('disconnect', function() {
        if (count++ < requests) {
            var disconnectEndTime = process.hrtime();
            calculateSetupTime(startTime, disconnectEndTime);
            socket.connect();
        } else {
            process.exit();
        }
    });
}
At first the connections could only be made 500 or 600 times. After I removed all the hrtime() calls, it made it to 1000. But when I later raised the number of requests to around 2000 (still without the hrtime() calls), it could not finish again.
I was totally confused. Yesterday I thought it was related to hrtime, but of course it wasn't; any infinite loop would hang. I was misled by hrtime.
But what's the problem now?
===================================================================
update 2016.4.19
I solved this problem.
The cause was that my client code used socket.disconnect and socket.connect to simulate a new user. This is wrong: in this case the server may not recognize that the old socket disconnected. You have to delete the socket object and create a new one.
As a result, the connection count did not match the disconnection count, which prevented the code from unsubscribing from Redis, and the whole loop hung because Redis stopped responding.
Your code is an infinite loop; at some point this will always exhaust system resources and cause your application to hang.
Other than causing your application to hang, the code you have posted does very little. Essentially, it could be described like this:
For the rest of eternity, or until my app hangs (whichever happens first):
Get the current high-resolution real time, then ignore it without doing anything with it.
Increment a number and log it.
Repeat as quickly as possible.
If this is really what you wanted to do, you have achieved it, but it will always hang at some point. Otherwise, you may want to explain your desired result further.
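For what it's worth, the usual way to run an "endless" counter in Node without starving the event loop is to schedule each iteration rather than spinning in while (true). A sketch (the limit parameter exists only so the example terminates; drop the check to run forever):

```javascript
// Each iteration is queued with setImmediate, so pending I/O (including the
// buffered console output that a tight while(true) never lets flush) gets a
// chance to run between iterations.
function countWithYield(limit, done) {
  let c = 0;
  function tick() {
    c += 1;
    if (c >= limit) return done(c);
    setImmediate(tick); // yield to the event loop, then continue counting
  }
  tick();
}
```

This keeps the process responsive no matter how many iterations run, at the cost of some per-iteration scheduling overhead.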
I'm using PHP over IIS 7.5 on Windows Server 2008.
My web application is requesting repeatedly with Ajax in the background 3 different JSON pages:
page 1 Every 6 seconds
page 2 Every 30 seconds
page 3 Every 60 seconds
They retrieve data related to the current state of some tables; this is how I keep the view updated.
Usually I don't have much trouble with it, but lately I've seen my server saturated with hundreds of unanswered requests, and I believe the problem may be due to a delay in one of the requests.
If page 1, which is requested every 6 seconds, takes 45 seconds to respond (due to slow database queries or whatever), then it seems to me that the requests start piling up one after the other.
If I have multiple users connected to the web application at the same time (or with multiple tabs), things can turn bad.
Any suggestions on how to avoid this kind of problem?
I was thinking about using something such as ZMQ together with Socket.io on the client side, but as the data I'm requesting isn't triggered by any user action, I don't see how this could be pushed from the server side.
I was thinking about using something such as ZMQ together with Socket.io on the client side...
This is almost definitely the best option for long-running requests.
...but as the data I'm requesting isn't triggered by any user action, I don't see how this could be pushed from the server side.
In this case, the 'user action' in question is connecting to the socket.io server. This cut-down example is taken from one of the socket.io getting started docs:
var io = require('socket.io')(http);
io.on('connection', function(socket) {
console.log('a user connected');
});
When the 'connection' event is fired, you could start listening for messages on your ZMQ message queue. If necessary, you could also start the long-running queries.
I ended up solving the problem by following the recommendation of #epascarello, and improving it a bit so that I retry if I get no response within X time:
"If the request has not come back, do not send another. But fix the server-side code and speed it up."
Basically I did something like the following:
var ELAPSED_TIME_LIMIT = 5; // 5 minutes
var responseAnswered = true;
var prevTime = new Date().getTime();

setInterval(function() {
    // if the last call was answered, or more than X minutes have passed since it
    if (responseAnswered || elapsedTime() > ELAPSED_TIME_LIMIT) {
        getData();
        updateElapsedTime();
    }
}, 6000);

function getData() {
    responseAnswered = false;
    $.post("http://whatever.com/action.json", function(result) {
        responseAnswered = true;
    });
}

// Returns the elapsed time (in minutes) since prevTime was last updated.
function elapsedTime() {
    var curTime = new Date().getTime();
    // time difference between the last call and now
    var timeDiff = curTime - prevTime;
    // time in minutes
    return (timeDiff / 1000) / 60;
}

// updates prevTime with the current time
function updateElapsedTime() {
    prevTime = new Date().getTime();
}
This is a very bad setup. You should avoid polling whenever possible: instead of sending a request from the client to the server every 6 seconds, push data from the server to the clients. Check on the server side whether the data has changed, and only then transfer it to the clients over websockets. You can use node.js on the server side to monitor the data for changes.