Here are a few lines of code to give you an idea of what I'm trying to ask.
The code starts with:
var webSocketsServerPort = 8002;
var webSocketServer = require('websocket').server;
var conns = [];
I push each user into the conns array after every successful connection, along with some additional information (their ID) so I can identify the user later.
When I need to send specific information to a user, I call the following function:
function sendMessage(userID, message) {
    for (var i = 0, len = conns.length; i < len; ++i) {
        if (conns[i].customData.ID == userID) {
            conns[i].sendUTF(message);
        }
    }
}
My question is:
Is it a better idea to replace conns[i].sendUTF(message); with setTimeout(function(){ conns[i].sendUTF(message); }, 1), so that if there are 5000 connected users, sendUTF(msg) cannot block the loop and, in the best case, all the messages are sent at the same time?
If you change your design to key everything by an ID instead of keeping a flat array of objects, there is no reason to loop over every connection to find a user's connections. You would only need to loop through the connections that belong to that one user.
var connections = {};

function addConnection(userId, conn) {
    if (!connections[userId]) {
        connections[userId] = [];
    }
    connections[userId].push(conn);
}

function getUserConnections(userId) {
    return connections[userId];
}
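For illustration, sendMessage could then look roughly like this (a sketch; it assumes each stored connection object still exposes sendUTF as in the original code):
function sendMessage(userID, message) {
    var userConns = getUserConnections(userID) || [];
    // Only this user's connections are touched, not every connected client.
    for (var i = 0; i < userConns.length; i++) {
        userConns[i].sendUTF(message);
    }
}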
That wouldn't help in the way you are thinking. If it's not going to "block" at that time, it will "block" in 1 ms.
Doing setTimeout that way only delays the execution, not the queueing. JS will still run your for loop synchronously, pushing all 5000 callbacks into the timer queue, before the stack is clear for anything else.
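For illustration, the proposed change is roughly equivalent to this sketch of the original sendMessage loop: the loop itself still runs to completion synchronously, it merely schedules the sends.
for (var i = 0, len = conns.length; i < len; ++i) {
    if (conns[i].customData.ID == userID) {
        // bind captures the current connection now; the actual send still
        // happens only after the whole loop has finished.
        setTimeout(conns[i].sendUTF.bind(conns[i], message), 1);
    }
}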
What you need is to yield after each iteration. Since you're on Node.js, you can use process.nextTick() to schedule the next iteration. Here's a quick example.
var i = 0;
var length = conns.length;

function foo() {
    // Run the usual check for the current connection
    if (conns[i].customData.ID == userID) conns[i].sendUTF(message);

    // If we haven't reached the end yet, schedule the next iteration
    if (++i < length) process.nextTick(foo);
}

foo();
I am trying to implement a timer in JavaScript using web workers. I wrote the following code, but it's not working. Can someone point out why? I don't have much experience with JavaScript, so it would be great if someone could explain the reason in detail.
Here's what I'm trying to do:
First I create a SharedArrayBuffer and a worker thread, plus another array on which I will do some work that I want to time. Then I send the SharedArrayBuffer to the worker thread, which increments the first value of the shared array in a for loop. Finally, I read that value back in main.js, and I get 0 every time.
main.js
var buffer = new SharedArrayBuffer(1024);
var i;
var uint32 = new Uint32Array(buffer);
var myWorker = new Worker('worker.js');
var array = new Uint32Array(8);
array[0] = 0;
console.log(Atomics.load(uint32,0),array[0]);
myWorker.postMessage(buffer);
for (i = 0; i < 300000000; i++) {
    array[0] += i;
}
console.log(i,Atomics.load(uint32,0),array[0]);
worker.js
onmessage = function(buffer) {
    console.log('from worker');
    var uint32 = new Uint32Array(buffer.data);
    for (i = 0; ; i++) {
        uint32[0] += 1;
    }
}
You should not be using code like this to try to determine how long code takes to run. It doesn't make sense, because incrementing a counter in an array is not tied to time or any unit of measurement. Instead, there are APIs designed for measuring performance, such as console.time():
onmessage = function(buffer) {
    console.time('TimeSpentInWorker');
    // Your code...
    console.timeEnd('TimeSpentInWorker');
};
You could also compare the difference between calling Date.now() twice or look into the Performance API.
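For example, a minimal sketch of the Date.now() approach (performance.now() can be swapped in where sub-millisecond resolution is needed):
var start = Date.now();
// ... the work you want to measure ...
for (var i = 0; i < 300000000; i++) {}
var elapsed = Date.now() - start;
console.log('Took ' + elapsed + ' ms');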
I'd like to use Node.JS to design some bots. Here are the requirements of these bots:
There are up to 10 'bots'. I'm not sure how to do this in Node.js, considering it's single threaded. I'm assuming that if there are 10 asynchronous worker items, that will be representative of 10 'bots'?
Bots perform a basic REST task, like a POST to a remote server. Assume every POST is a success (or assume we don't care if there is a failure). While the remote server and the operation (POST) are the same, the arguments may vary, and each bot supplies its own arguments, such as the payload to POST.
Bots should somewhat model human behavior by randomly sleeping for some k seconds before firing off a task, then queueing themselves for another random k seconds before performing the next task. This adds a level of complexity that I'm unable to wrap my head around: if I use a sleep/timeout function like setTimeout or setInterval, will 10 such workers sleep in parallel or serially? If they sleep serially then I don't have 10 bots; instead I have 10 serial workers queued in order of sleep!
What I have tried so far:
Since I'm new to Node.js, I haven't been able to find the right way to deal with this.
I looked into beanstalkd, which is a work queue, but it appears that the consumer service processes items serially.
I'm currently evaluating async.parallel, but it appears that its parallelism acts as a 'barrier': all parallel jobs proceed to the next iteration only after every job has finished the function being executed in parallel. In my case I'd like, for example, bot 3 to be re-queued and sleeping in its 2nd iteration even though bot 7 is still on a long sleep in its first iteration.
The asynchronous nature of JavaScript means that while one 'bot' is sleeping it doesn't block the other bots and force them to sleep too. For example, in this code:
'use strict';

var start = Date.now();

var printTime = function() {
    console.log(Date.now() - start + 'ms');
};

setTimeout(function() {
    printTime();
}, 500);

setTimeout(function() {
    printTime();
}, 1000);
Should print (approximately):
500ms
1000ms
Rather than:
500ms
1500ms
Something like this should work just fine:
'use strict';

var fetch = require('node-fetch');

// Each bot waits between 5 and 30 seconds
var minDelay = 5000;
var maxDelay = 30000;
var numBots = 10;

var botTask = function() {
    fetch('http://somewhere.com/foo', { method: 'POST', body: 'a=1' });
};

var getDelay = function() {
    return minDelay + Math.random() * (maxDelay - minDelay);
};

var runBot = function() {
    setTimeout(function() {
        botTask();
        runBot();
    }, getDelay());
};

for (var i = 0; i !== numBots; i++) {
    runBot();
}
Here is a very simple framework:
var bot = {
    act: function() {
        // make the POST request here
        var delay = Math.random() * 500; // set a random delay to mimic a human
        setTimeout(this.act.bind(this), delay);
    }
};

var bots = [];
for (var i = 0; i < 10; i++) {
    bots.push(Object.create(bot));
}

bots.forEach(function(bot) {
    bot.act();
});
We have a master bot template, the bot variable. bot.act is a function that will send the POST request, then set a timeout on itself after a delay. The rest is just boilerplate, adding 10 bots to a list, and starting each bot. It's really that simple... no work queues, no async parallel...
I am having trouble saving a new record to MongoDB. I am pretty sure there is something in my code that I don't fully understand, and I was hoping someone might be able to help.
I am trying to save a new record to MongoDB for each of the cats. This code is for Node.js.
for (var x = 0; x < (cats.length - 1); x++) {
    if (!blocked) {
        console.log("x = " + x);
        var memberMessage = new Message();
        memberMessage.message = message.message;
        memberMessage.recipient = room[x].userId;
        memberMessage.save(function(err) {
            if (err) console.log(err);
            console.log(memberMessage + " saved for " + cats[x].name);
        });
    }
}
I log the value of cats before the loop and I do get all the names I expect, so I would think that looping through the array would store a new record on each pass.
What seems to happen is that when I look at the database, it seems to have only saved the last record for every loop cycle. I don't know how or why it would be doing that.
Any help on this is appreciated, because I'm new to Node.js and MongoDB.
Thanks.
That's because save is actually an I/O operation, which is async, while the for loop itself is synchronous.
Think of it this way: your JS engine executes each line it sees, one after another. When it reaches save, it hands the write off (it's an I/O operation and will take time), queues the callback for later, and carries on with the rest of the loop. The engine only runs those queued callbacks after it has finished everything in the current synchronous run, i.e. after the whole loop has completed. By that time the loop variable x has its final value, so cats[x] (and the memberMessage the callbacks close over) refer to the last iteration. That's why it looks like only the last value was saved.
To fight this tragedy, you can use multiple methods:
Closures - Read More
You can create a per-item closure, for example by using cats.forEach() (see the sketch below).
Promises - Read More. There is a sweet library which promisifies the mongo driver to make it easier to work with.
Generators, etc. - Read More. Not ready for primetime yet.
Note about #2 - I'm not a contributor to the project, but I do work with the author. I've been using the library for well over a year now, and it's fast and awesome!
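A minimal sketch of the closure approach, reusing the names from the question (cats, room, blocked, message, and Message are assumed to exist as in the original code):
if (!blocked) {
    cats.forEach(function(cat, x) {
        // Each iteration gets its own cat and x, so the callback below
        // still refers to the right record when save finishes.
        var memberMessage = new Message();
        memberMessage.message = message.message;
        memberMessage.recipient = room[x].userId;
        memberMessage.save(function(err) {
            if (err) return console.log(err);
            console.log(memberMessage + " saved for " + cat.name);
        });
    });
}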
You can use the batch create feature from Mongoose:
var messages = [];

for (var x = 0; x < (cats.length - 1); x++) {
    if (!blocked) {
        var memberMessage = new Message();
        memberMessage.message = message.message;
        memberMessage.recipient = room[x].userId;
        messages.push(memberMessage);
    }
}

Message.create(messages, function (err) {
    if (err) // ...
});
So I've been stuck on this for quite a while. I asked a similar question here: How exactly does done() work and how can I loop executions inside done()?
but I guess my problem has changed a bit.
So the thing is, I'm loading a lot of streams and it's taking a while to process them all. To make up for that, I want to at least load the tweets that have already been processed onto my webpage, and continue processing the rest of the stream at the same time.
loadTweets: function(username) {
    $.ajax({
        url: '/api/1.0/tweetsForUsername.php?username=' + username
    }).done(function (data) {
        var json = jQuery.parseJSON(data);
        var jsonTweets = json['tweets'];
        $.Mustache.load('/mustaches.php', function() {
            for (var i = 0; i < jsonTweets.length; i++) {
                var tweet = jsonTweets[i];
                var optional_id = '_user_tweets';
                $('#all-user-tweets').mustache('tweets_tweet', { tweet: tweet, optional_id: optional_id });
                configureTweetSentiment(tweet);
                configureTweetView(tweet);
            }
        });
    });
}};
}
This is pretty much the structure of my code right now. I guess the problem is the for loop, because nothing is displayed until the for loop is done. So I have two questions:
How can I get the stream of tweets to display on my website as they're processed?
How can I make sure the Mustache.load() is only executed once while doing this?
The problem is that UI manipulation and JS operations all run in the same thread. To solve this, you can use a setTimeout/setInterval so that the JS operations are queued behind the pending UI updates. You can also pass a parameter for the time interval (around 4 ms) so that browsers with a slower JS engine can still perform smoothly.
...
var i = 0;
var timer = setInterval(function() {
    var tweet = jsonTweets[i++];
    var optional_id = '_user_tweets';
    $('#all-user-tweets').mustache('tweets_tweet', {
        tweet: tweet,
        optional_id: optional_id
    });
    configureTweetSentiment(tweet);
    configureTweetView(tweet);
    if (i === jsonTweets.length) {
        clearInterval(timer);
    }
}, 4); // Interval between loading tweets
...
NOTE
The solution is based on the following assumptions:
You are manipulating the DOM in the configureTweetSentiment and configureTweetView methods.
Ideally, the solution provided above would not be the best one. Instead, you should build all of the HTML in JavaScript first and append the final HTML string to a div once at the end (see the sketch after this list). You would see a drastic change in performance (seriously!).
You don't want to use web workers because they are not supported in older browsers. If that's not the case, and you are not manipulating the DOM in the configure methods, then web workers are the way to go for data-intensive operations.
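A minimal sketch of the "build the string first, append once" idea; renderTweet is a hypothetical helper that returns the markup for one tweet as a string:
var html = '';
for (var i = 0; i < jsonTweets.length; i++) {
    // Build the markup in memory; no DOM access inside the loop.
    html += renderTweet(jsonTweets[i]);
}
// Touch the DOM exactly once.
$('#all-user-tweets').append(html);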
I've literally been trying all day to make Firefox obey my will...
I want:
int c = SELECT COUNT(*) FROM ...
I've tried executeAsync({...});, but I believe it's the wrong paradigm, as I want the result immediately. (And mozIStoragePendingStatement results in errors)
var count = 0;
var conn = Services.storage.openDatabase(dbfile); // Will also create the file if it does not exist
let statement = conn.createStatement("SELECT COUNT(*) FROM edges LIMIT 42;");
console.log("columns: " + statement.columnCount);       // prints "1"
console.log("col name:" + statement.getColumnName(0));  // is "COUNT(*)"
while (statement.executeStep()) {
    count = statement.row.getResultByIndex(0); // "illegal value"
    count = statement.row.getString(0);        // "illegal value", too
    count = statement.row.COUNT(*);            // hahaha. still not working
    count = statement.row[0];                  // hahaha. "undefined"
    count = statement.row[1];                  // hahaha. "undefined"
}
statement.reset();
It basically works, but I don't get the value. What's wrong with all the statements inside the loop?
Thanks for any hints...
I've tried executeAsync({...});, but I believe it's the wrong paradigm, as I want the result immediately.
You shouldn't want that, the Storage API is asynchronous for a reason. Synchronous access to databases can cause a random delay (e.g. if the hard drive is busy). And since your code executes on the main thread (the same thread that services the user interface) the entire user interface would hang while your code is waiting for the database to respond. The Mozilla devs tried synchronous database access in Firefox 3 and quickly noticed that it degrades user experience - hence the asynchronous API, the database processing happens on a background thread without blocking anything.
You should change your code to work asynchronously. Something like this should do for example:
Components.utils.import("resource://gre/modules/Services.jsm");

var conn = Services.storage.openDatabase(dbfile);
if (conn.schemaVersion < 1)
{
    conn.createTable("edges", "s INTEGER, t INTEGER");
    conn.schemaVersion = 1;
}

var statement = conn.createStatement("SELECT COUNT(*) FROM edges");
statement.executeAsync({
    handleResult: function(resultSet)
    {
        var row = resultSet.getNextRow();
        var count = row.getResultByIndex(0);
        processResult(count);
    },
    handleError: function(error) {},
    handleCompletion: function(reason) {}
});

// Close the connection once the pending operations are completed
conn.asyncClose();
See also: mozIStorageResultSet, mozIStorageRow.
Try aliasing count(*) as total, then fetch that
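For example, a minimal sketch against the synchronous statement from the question (the aliased column becomes a named property on statement.row):
let statement = conn.createStatement("SELECT COUNT(*) AS total FROM edges;");
while (statement.executeStep()) {
    count = statement.row.total; // the alias is now a plain property name
}
statement.finalize();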