Variable not updating properly - javascript

This is the relevant code:
This sends a request to the server; it takes a userName as an argument.
function send_CLIENT_LOOKUP(userName)
{
    send(
        auth.AOCP.CLIENT_LOOKUP,
        [ [ 'S', userName.toString() ] ]
    )
}
This part handles the response from the server.
handle[auth.AOCP.CLIENT_LOOKUP] = function (data, u)
{
    console.log('CLIENT_LOOKUP')
    var userId = u.I()
    var userName = u.S()
    u.done()
    console.log({ userId : userId, userName : userName }) // this works properly
    var idResult = userId; // I can actually use the userID but I just assigned it to idResult to make the code easier to understand.
}
The above 2 functions work how they are supposed to, nothing to change/fix there.
Now I have a function that receives a request from one user and sends a request to another user. It takes 2 arguments: the first is the userId of the user that sends the request, and the second is the userName of the user the request will be sent to. However, the server/game only works with userIds, so the userName has to be converted:
var commands = {
    invite: function(userId, userName) {
        send_CLIENT_LOOKUP(userName); // send a request to the server to find out the userId of the userName
        send_PRIVGRP_INVITE(idResult);
    }
}
The problem is that idResult == undefined, unless I call cmd.invite() again, in which case idResult holds the previous response. Like this:
cmd.invite(user1);
cmd.invite(user2);
cmd.invite(user3);
And the output of idResult is:
idResult == undefined
idResult == 'info for user1'
idResult == 'info for user2'
I tried defining idResult outside the response handler and updating it inside. To make sure it's not some sort of delay from the server, I did a massive invite spam, and the result was the same: one step behind no matter how fast I sent the invites. Any suggestions are welcome :)

The problem is rather hard to solve, as sending and collecting replies are isolated. It seems that send() sends a packet and handle() receives a reply, and there can be many possible unrelated packets between sending and receiving.
If that's the case, then a global map should be created to link requests and responses. To handle cases where there are two concurrent requests for the same username, EventEmitter from the events module is a good fit, as it's essentially a multimap of strings to callbacks:
var events = require('events')
var outstandingLookups = new events.EventEmitter()
When you send a lookup you register a one-time listener, so there's no need for manual cleanup:
var commands = {
    invite: function(userName) {
        outstandingLookups.once(userName, function (idResult)
        {
            send_PRIVGRP_INVITE(idResult);
        })
        // send a request to the server to find out the userId of the userName
        send_CLIENT_LOOKUP(userName);
    }
}
In the response handler, emit conveniently calls all the callbacks that are waiting for the userId of the received userName:
handle[auth.AOCP.CLIENT_LOOKUP] = function (data, u)
{
    console.log('CLIENT_LOOKUP')
    var userId = u.I()
    var userName = u.S()
    u.done()
    console.log({ userId : userId, userName : userName }) // this works properly
    // I can actually use the userID but I just assigned it to idResult
    // to make the code easier to understand.
    var idResult = userId;
    // this is the only addition to the original handler code
    outstandingLookups.emit(userName, idResult)
}
The need to use outstandingLookups.once is really an implementation detail, and it should be properly encapsulated/hidden if you want to look up userIds in more than one place in the code.
I will use promises from the q npm library because I remember its interface, but native ES6 promises should be used in production code, as they are the modern standard.
var Q = require('q')
var commands = {
    lookupUserName: function (userName)
    {
        var result = Q.defer()
        outstandingLookups.once(userName, function (idResult)
        {
            result.resolve(idResult);
        })
        // send a request to the server to find out the userId of the userName
        send_CLIENT_LOOKUP(userName);
        return result.promise
    },
    invite: function(userName) {
        commands.lookupUserName(userName).then(function (idResult)
        {
            send_PRIVGRP_INVITE(idResult);
        })
    }
}
The code for commands.invite is much cleaner now.
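For completeness, the same lookupUserName can be written with native ES6 promises instead of q. This is only a sketch built on the outstandingLookups emitter and send_CLIENT_LOOKUP defined above:
var commands = {
    lookupUserName: function (userName) {
        return new Promise(function (resolve) {
            // resolve once the response handler emits the userId for this userName
            outstandingLookups.once(userName, resolve)
            // send a request to the server to find out the userId of the userName
            send_CLIENT_LOOKUP(userName)
        })
    },
    invite: function (userName) {
        commands.lookupUserName(userName).then(function (idResult) {
            send_PRIVGRP_INVITE(idResult)
        })
    }
}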

It looks like you're doing an async request but using the var idResult synchronously.
So you should put the call to send_PRIVGRP_INVITE(idResult) into the callback or response handler of send_CLIENT_LOOKUP(userName).
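A minimal sketch of that idea, assuming a hypothetical module-level map (pendingLookups) that keeps the callbacks waiting for each userName and is drained from the CLIENT_LOOKUP response handler:
// hypothetical map of userName -> callbacks waiting for that lookup
var pendingLookups = {}

function lookupUserId(userName, callback) {
    (pendingLookups[userName] = pendingLookups[userName] || []).push(callback)
    send_CLIENT_LOOKUP(userName)
}

// inside the CLIENT_LOOKUP response handler, after userId and userName are parsed:
//     (pendingLookups[userName] || []).forEach(function (cb) { cb(userId) })
//     delete pendingLookups[userName]

var commands = {
    invite: function (userName) {
        lookupUserId(userName, function (idResult) {
            send_PRIVGRP_INVITE(idResult)
        })
    }
}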

Related

Handling database objects in node js

In general, how does javascript interpret a Database {} object? I am writing some back end scripts to handle registration form verification. In particular, I need to ensure that the username and email used to register are not currently in use. To do this I use the sqlite3 package and make a few db.get calls to determine whether there are existing entries in my database for the username and email used on the registration form. I want to use the return value of db.get to check whether it is empty or not and use that conditional to perform the necessary task. However, db.get returns a Database {} object which I don't know how to work with.
Hopefully the following pseudocode describes the issue better. Here uname is Database {} and so never fails the if check.
function existance(username, email) {
    let uname = db.get(sql, username, callback(err, throw));
    if (uname) {
        let errors = {username: 'Username already in use.'};
        return errors;
    }
};
EDIT
I have since used Gajiu's recommendation but am still having issues. So I have two files:
registrant_existence.js
// necessary requirements and initialisation
function registrant_existence(username) {
    let uname;
    let sql = 'SELECT username FROM table WHERE username=?';
    db.get(sql, username, function(err, row) {
        if (err) {
            throw err;
        } else {
            uname = row;
            console.log(uname);
        }
    });
    console.log(uname);
    if (uname !== undefined) {
        return {username: 'Username already in use.'};
    } else {
        return 'DNE';
    }
};
module.exports = registrant_existence;
register.js
let registrant_existence = require("path to registrant_existence.js");
// necessary requirements and initialisation
router.post('/', function(req, res) {
    let existence = registrant_existence(req.body.username, req.body.email);
    if (existence != 'DNE') {
        // render registration page notifying the user
        // that the username is already in use
    } else {
        // register the new user details
    }
});
The uname variable is always undefined. I placed console.log(uname) in two spots in registrant_existence.js, as seen above, to see what is happening. Two strange things occur.
The first is that the console.log(uname) outside the db.get() displays undefined in the console, while the console.log(uname) inside the db.get() displays the expected string (a username I know is in my database).
The second is that the console.log(uname) outside the db.get() is displayed before the console.log(uname) inside the db.get() in my console.
I have no idea why these things are happening. Does anyone have any suggestions?
You should try something like this:
db.get(sql, username, (err, data) => {
    // process the data here
    if (err) {
        return console.error(err.message);
    }
    return data
        ? console.log(data.id, data.userName)
        : console.log('No username found');
});
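Applied to the registrant_existence module above, the fix is to hand the result out through a callback instead of returning it, because db.get completes asynchronously. A rough sketch reusing the question's sql string and 'DNE' convention:
function registrant_existence(username, callback) {
    let sql = 'SELECT username FROM table WHERE username=?';
    db.get(sql, username, function(err, row) {
        if (err) {
            return callback(err);
        }
        // row is undefined when no matching username exists
        if (row !== undefined) {
            callback(null, {username: 'Username already in use.'});
        } else {
            callback(null, 'DNE');
        }
    });
}

// in register.js
registrant_existence(req.body.username, function(err, existence) {
    if (existence !== 'DNE') {
        // render the registration page notifying the user
    } else {
        // register the new user details
    }
});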
I guess you are looking for a wrapper around your Database object, something like an Object Relational Mapper (ORM); this one is used regularly: https://sequelize.org/master/manual/getting-started.html
On the other hand, for your specific use case you might want to take a look at https://www.sqlite.org/lang_createtable.html#unique_constraints: uniqueness is usually enforced via constraints in the data store.
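For example, with a UNIQUE constraint on the username column you can simply attempt the INSERT and treat a constraint violation as "already in use". A sketch using the sqlite3 package; the exact err.code string is an assumption and may differ between versions:
// assumes a schema like: CREATE TABLE users (username TEXT UNIQUE, email TEXT UNIQUE)
db.run('INSERT INTO users (username, email) VALUES (?, ?)', [username, email], function(err) {
    if (err) {
        // node-sqlite3 typically reports UNIQUE violations with err.code === 'SQLITE_CONSTRAINT'
        if (err.code === 'SQLITE_CONSTRAINT') {
            return callback(null, {username: 'Username or email already in use.'});
        }
        return callback(err);
    }
    callback(null, 'registered');
});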

Async functions in meteor server

I'm using vpulim:node-soap to have a soap server running.
My meteor server startup contains this amongst various other code:
authRequestOperation: function(args, cb, headers, req) {
    console.log(args);
    var authResponceObject = {};
    var futureAuthResponse = new Future();
    Fiber(function(){
        if(collectorUsers.findOne({username: args.username})){
            console.log("Found User");
            authResponceObject = {
                username: args.username,
                nonce: Random.id()
            };
            console.log("authResponceObject is: " + JSON.stringify(authResponceObject,null,4));
            console.log("futureAuthResponse returning...");
            futureAuthResponse.return(authResponceObject);
        }
        // console.log("futureAuthResponse waiting...");
        // return futureAuthResponse.wait();
    }).run();
    console.log("authResponceObject after fiber is: " + JSON.stringify(authResponceObject,null,4));
    return authResponceObject;
},
What I'm trying to do is:
1. I receive a user object from the client.
2. I check if the user is present in the mongodb.
3. If the user is present, prepare a response object.
4. Respond to the client with the response object.
I have 1 working. However, due to the call being async, the order of 2, 3 and 4 is messed up.
Right now what's happening is:
1. receive client object
2. return response object (which is empty)
3. check mongo
4. prepare response object
I'm not using Meteor.methods for the above.
How do I make this work in the right manner? I've tried juggling wrapAsync and fiber/future but keep hitting dead ends.
I believe Meteor.bindEnvironment can solve your problem, try this code:
{
    // ...
    authRequestOperation: Meteor.bindEnvironment(function(args, cb, headers, req) {
        console.log(args);
        var authResponceObject = {};
        if (collectorUsers.findOne({username: args.username})) {
            console.log("Found User");
            authResponceObject = {
                username: args.username,
                nonce: Random.id()
            };
            console.log("authResponceObject is: " + JSON.stringify(authResponceObject, null, 4));
        }
        return authResponceObject;
    }),
    // ...
}
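If the user check ever becomes a genuinely asynchronous call (for example a callback-based database driver), note that node-soap can also reply through the callback it passes as the second argument instead of the return value. A rough sketch of that shape, assuming cb accepts the response object:
authRequestOperation: Meteor.bindEnvironment(function(args, cb, headers, req) {
    var authResponceObject = {};
    if (collectorUsers.findOne({username: args.username})) {
        authResponceObject = {
            username: args.username,
            nonce: Random.id()
        };
    }
    // respond asynchronously via the node-soap callback instead of returning a value
    cb(authResponceObject);
}),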

Node.js & Node-Postgres: Putting Queries into Models

I would like to 'functionalize' my queries by putting them into functions which have apt names for the task.
I want to avoid putting everything in the req, res functions (my controllers), and instead put them in 'models' of sorts, that is, another JavaScript file that will be imported and used to run the functions that execute queries and return the results on behalf of the controller.
Assuming that I have the following setup for the queries:
UserController.js
exports.userAccount = function(req, res, next) {
    var queryText = "\
        SELECT *\
        FROM users\
        WHERE id = $1\
    ";
    var queryValues = [168];
    pg.connect(secrets.DATABASE_URL, function(err, client, done) {
        client.query(queryText, queryValues, function(err, result) {
            res.render('pathToSome/page', {
                queryResult: result.rows
            });
        });
    });
}
Here, while I'm in the query, I essentially redirect and render a page with the data. That works fine. But I want to take out all that pg.connect and client.query code and move it to a separate file to be imported as a model. I've come up with the following:
UserModel.js
exports.findUser = function(id) {
    // The user to be returned from the query
    // Local scope to 'findUser' function?
    var user = {};
    var queryText = "\
        SELECT *\
        FROM users\
        WHERE id = $1\
    ";
    var queryValues = [id];
    pg.connect(secrets.DATABASE_URL, function(err, client, done) {
        client.query(queryText, queryValues, function(err, result) {
            // There is only ever 1 row returned, so get the first one in the array
            // Apparently this is local scope to 'client.query'?
            // I want this to overwrite the user variable declared at the top of the function
            user = result.rows;
            // Console output correct; I have my one user
            console.log("User data: " + JSON.stringify(user));
        });
    });
    // I expect this to be correct. User is empty, because it was not really
    // assigned in the user = result.rows call above.
    console.log("User outside of 'pg.connect': " + JSON.stringify(user));
    // I would like to return the user here, but it's empty!
    return user;
};
and I'm calling my model function like so:
var user = UserModel.findUser(req.user.id);
The query executes perfectly fine in this fashion - except that the user object is not being assigned correctly (I'm assuming a scope issue), and I can't figure it out.
The goal is to be able to call a function (like the one above) from the controller, have the model execute the query and return the result to the controller.
Am I missing something blatantly obvious here?
pg.connect is an asynchronous call. Instead of waiting for data to return from the database before proceeding to the next line, it goes ahead with the rest of the program before Postgres answers. So in the code above, findUser returns a variable that has not yet been populated.
In order to make it work correctly, you have to add a callback to the findUser function. (I told you wrong in a previous edit: The done parameter in pg.connect is called in order to release the connection back to the connection pool.) The final result should look something like this:
exports.findUser = function(id, callback) {
    var queryText = "SELECT * FROM users WHERE id = $1";
    var queryValues = [id];
    pg.connect(secrets.DATABASE_URL, function(err, client, done) {
        client.query(queryText, queryValues, function(err, result) {
            var user = result.rows;
            done(); // Releases the connection back to the connection pool
            callback(err, user);
        });
    });
};
And you'd use it, not like this:
var user = myModule.findUser(id);
But like this:
myModule.findUser(id, function(err, user){
    // do something with the user.
});
If you have several steps to perform, each of them dependent on data from a previous asynchronous call, you'll wind up with confusing, Inception-style nested callbacks. Several asynchronous libraries exist to help make such code more readable; the most popular is the async module on npm.
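As a quick illustration, here is a sketch with async.waterfall, where each step receives the results of the previous one and a single final callback collects the result or the first error (findOrders is just a hypothetical second model function):
var async = require('async');

async.waterfall([
    function (next) {
        // step 1: look up the user
        myModule.findUser(id, next); // calls next(err, user)
    },
    function (user, next) {
        // step 2: use the user from step 1 to fetch something else
        myModule.findOrders(user, function (err, orders) {
            next(err, user, orders);
        });
    }
], function (err, user, orders) {
    if (err) { return console.error(err); }
    // all steps finished; render or respond here
});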

How can I use arr.forEach to call async JavaScript redis calls?

I'm working with node.js and redis. I've got a redis database with a bunch of keys. Something like this:
user/chris/potion
user/pete/potion
user/chris/race
user/pete/race
user/chris/weapon
user/pete/weapon
I want to do a redis call which retrieves all user stats, puts the stats into a JS object, then passes it to the client for displaying character stats in the browser. Using javascript I inject the username chris at u into the redis call like this:
KEYS user/u/*
which returns:
1) "user/chris/weapon"
2) "user/chris/race"
3) "user/chris/potion"
Now I can iterate through those results, get the value of each key with GET, and make a javascript object. Seems super simple so I write the code. I quickly run into problems using forEach:
var redis = require('redis');
var client = redis.createClient();

exports.getUserObject = function(requesteduser, callback) {
    var userstats = {}; // the object to hold the user stats once retrieved from the db
    client.KEYS('user/' + requesteduser + '/*', function(err, replies) {
        replies.forEach(function (reply, i) {
            client.GET(reply, function(err, value) {
                // get the key name so we can populate a js object
                var n = reply.lastIndexOf('/');
                var key = reply.substring(n + 1, reply.length);
                userstats[key] = value;
                console.dir(userstats);
                callback(null, userstats); // parent expects (err, userstats)
            });
        });
    });
}
When run, the output looks like this:
{ weapon: 'twosword' }
{ weapon: 'twosword', race: 'elf' }
{ weapon: 'twosword', race: 'elf', potion: 'mana' }
callback(null, userstats) is called three times. Calling callback more than once will be problematic, since eventually callback will trigger data being sent to the client.
I think I know what is happening. client.GET is run three times because I asked it to. The replies array has three elements, so each time the result comes in for client.GET, the callback is called.
What I need to happen is for the forEach loop to finish iterating through all the redis keys, and once that's done, call the callback once. I tried solving this problem first with promises, then with async, because async has async.each. I got stuck solving the problem with both. I'm back to square one now, I'm convinced I have to do something different with forEach to make progress.
How can I accomplish this?
Since you're iterating over replies, you can check when you've reached the last element and only call callback in that instance.
client.KEYS('user/' + requesteduser + '/*', function(err, replies) {
    replies.forEach(function (reply, i) {
        client.GET(reply, function(err, value) {
            // get the key name so we can populate a js object
            var n = reply.lastIndexOf('/');
            var key = reply.substring(n + 1, reply.length);
            userstats[key] = value;
            console.dir(userstats);
            if (i == replies.length - 1) callback(null, userstats); // parent expects (err, userstats)
        });
    });
});
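Note that checking the loop index only works if the last GET also happens to answer last; Redis replies usually come back in order on a single connection, but a slightly more robust variant counts completed replies instead. A sketch using the same client, userstats and callback:
client.KEYS('user/' + requesteduser + '/*', function (err, replies) {
    if (err) return callback(err);
    var completed = 0;
    replies.forEach(function (reply) {
        client.GET(reply, function (err, value) {
            var key = reply.substring(reply.lastIndexOf('/') + 1);
            userstats[key] = value;
            completed++;
            // fire the callback only once every reply has come back
            if (completed === replies.length) callback(null, userstats);
        });
    });
});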
I know it's a little late to help @Grimtech, but I'll leave my opinion for other people who arrive here eventually:
First of all, I would rather model @Grimtech's problem differently. Something like this:
client.hmset("user:chris", "weapon", "some weapon", "race", "some race", "potion", "some potion");
client.hmset("user:pete", "weapon", "same weapon", "race", "other race", "potion", "other potion");
Then I would have a method in Node returning JSON like the following, assuming that I can have more than one key starting with "user:chris":
router.get('/users/:user_id', function(req, res, next) {
    var response = [];
    var client = redis.createClient();
    client.keys("user:" + req.params.user_id + "*", function (err, users) {
        users.forEach(function (key, pos) {
            client.hgetall(key, function (err, user) {
                response.push(user);
                if (pos === users.length - 1) {
                    res.json(response);
                    client.quit();
                }
            });
        });
    });
});
The if (pos === users.length - 1) solves the async issue.
The JSON returned will have all the "user:chris" attributes, so you can do whatever you want with it in the client browser.

CoffeeScript function returning a function and not a value

I have a hash that calls a function to get a value. The problem is that the function is returning the result of the inner call rather than the value it should.
(user is defined above this hash)
My hash:
userInfo = {
  id: user.id,
  email: user.email,
  cars: getCars(user.id),
}
Which calls this function:
getCars = (userId) ->
  id = parseInt(userId)
  userRef = new Firebase("https://demo-firebase.firebaseIO.com/users/#{id}/")
  userRef.on('value', (snapshot) ->
    if snapshot.val() == null
      ["toyota"]
    else
      snapshot.val().cars # returns an array of cars
  )
When I'm in the debugger and stepping through the function, it returns on the userRef.on line rather than the correct place in the if/else statement.
Here's the compiled JS:
getCars = function(userId) {
  var id, userRef;
  id = parseInt(userId);
  userRef = new Firebase("https://demo-firebase.firebaseIO.com/users/" + id + "/");
  return userRef.on('value', function(snapshot) {
    if (snapshot.val() === null) {
      return ["toyota"];
    } else {
      return snapshot.val().cars;
    }
  });
};
Any ideas why this is happening? I'm sure it's something simple I'm overlooking.
The data you are getting from Firebase is event-driven and asynchronous, so you can't just return it as if this were synchronous code. You need to use either a callback, a promise, or an event handler.
getCars = (userId, callback) ->
  id = parseInt(userId)
  userRef = new Firebase("https://demo-firebase.firebaseIO.com/users/#{id}/")
  userRef.on 'value', (snapshot) ->
    if snapshot.val() == null
      callback ["toyota"]
    else
      callback snapshot.val().cars # returns an array of cars

userInfo =
  id: user.id
  email: user.email

getCars user.id, (cars) ->
  userInfo.cars = cars
  # Don't use userInfo until here as it's not ready/populated yet!
(Note the node convention is callback(errorOrNull, value), but I'm omitting error handling here for simplicity)
Also note that almost everyone new to async javascript makes this mistake, but it's not a simple syntax gotcha; it's a fundamental thing about which at some point (maybe today) you will have the aha/lightbulb moment. The thing to do is step through this in the Chrome debugger and note the order in which each line of code executes in relationship to time. The line with the if statement executes LATER IN TIME, after getCars has already returned. And note that if you step through it, it will skip right over the body of the 'value' event handler, because that line just DEFINES the event handler; it doesn't actually EXECUTE it until the data arrives. So if you want to debug inside it, you need to set a breakpoint on the first line of that function (where the if statement is).
There are 3 common paradigms available for this: event binding, promises, and callbacks. All will work. It would be a good exercise for you to code this same functionality with each paradigm and understand that they all basically give you a way to wait for some data to arrive and then run some code in response to the data arriving.
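For instance, a promise version in plain JavaScript (matching the compiled output above) might look roughly like this. It uses Firebase's once('value', ...) on the assumption that a single read is all that's needed:
var getCars = function(userId) {
  var id = parseInt(userId);
  var userRef = new Firebase("https://demo-firebase.firebaseIO.com/users/" + id + "/");
  return new Promise(function(resolve) {
    // read the value once and resolve the promise when the data arrives
    userRef.once('value', function(snapshot) {
      if (snapshot.val() === null) {
        resolve(["toyota"]);
      } else {
        resolve(snapshot.val().cars);
      }
    });
  });
};

getCars(user.id).then(function(cars) {
  userInfo.cars = cars;
  // userInfo is only fully populated from this point on
});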
