I would like to know the best way to handle an evented API. My first thought is to wrap it in a callback API, but I have a feeling that I am blocking right there and that this is not the best approach.
Given I have a lib that returns values from a db and it is evented:
var connection = new Connection(config);
var query = new Query();
query.on('data',function(row){});
query.on('done',function(){});
This is my take ...
function getAllCustomers(clb) {
  var connection = new Connection(config);
  var query = new Query();
  var results = [];
  query.on('data', function(row) {
    results.push(row);
  });
  query.on('done', function() {
    clb(null, results);
  });
  connection.execute(query);
  connection.close();
}
I would like to use that in a hapi.js or express.js app, like this:
handler: function(request, reply) {
  getAllCustomers(function(err, result) {
    reply(result);
  });
}
getAllCustomers() seems to be blocking until the query is done!
So what would be the recommended way to handle that?
Is the design decision to wrap the evented API in a callback API right?
Or should I pass the streaming rows along to express.js / hapi.js?
So I am looking for a best-practice way ... the Node way ...
Thanks for any help.
In the code example that you provided, getAllCustomers() will be blocking only if connection.execute(query) is blocking - and that seems to be the case, because otherwise, if it were asynchronous, the connection would have been closed by the subsequent call to connection.close() before the query even had a chance to run.
If it isn't sync/blocking, then you need to close the connection in the done callback.
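As a minimal sketch of that non-blocking version (assuming the library also emits an 'error' event, which is not shown in the question):
function getAllCustomers(clb) {
  var connection = new Connection(config);
  var query = new Query();
  var results = [];
  query.on('data', function(row) {
    results.push(row);
  });
  query.on('done', function() {
    // Close only after the query has finished, then hand the rows back
    // using the error-first callback convention.
    connection.close();
    clb(null, results);
  });
  query.on('error', function(err) {
    connection.close();
    clb(err);
  });
  connection.execute(query);
}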
Dear all,
I'm working with JS for some weeks now and need a bit of clarification. I have read a lot of sources and a lot of Q&A, also here, and this is what I have learned so far.
Everything below is in connection with Node.js and Socket.io.
Use of globals in Node.js "can" be done, but is not best practice - in short: DON'T DO IT!
With sockets, everything is treated per socket call, meaning there is hardly any memory of a previous call. A call comes in and gets served, so no "kept" variables.
OK, I built up a chat example: multiple users, all served with broadcast, but no private messages, for example.
Fairly simple and fairly OK. But now I am stuck and can't wrap my head around the next step.
Let's say:
I need to act on the request
Like a request: "To all users whose name is BRIAN"
In my head I imagined:
1.
Custom object USER - defined globally on Node.js
function User(socket) {
  this.Name = null;
  this.socket = socket;
}
2.
Then hold an ARRAY of these globally
users = [];
and on newConnection, create a new User, pass on its socket and store in the array for further action with
users.push(new User(socket));
3.
And on a Socket.io request that wants to contact all BRIANs do something like
for (var i = 0; i < users.length; i++) {
  if (users[i].Name == "BRIAN") {
    // Emit to users[i].socket
  }
}
But after trial and error, debugging, googling and reading, apparently this is NOT how something like this should be done, and somehow I can't find the right way to do it, or at least see / understand it. Can you please help me, point me in a good direction or propose a best practice here? That would be awesome :-)
Note:
I don't want to store the data in a DB (that is the next step); I want to work on the fly.
Thank you very much for your inputs
Oliver
First of all, please don't put users in a global variable; better to put them in a module and require it wherever needed. You can do it like this:
users.js
var users = {
  _list: {}
};

users.create = function(data) {
  this._list[data.id] = data;
};

users.get = function(user_id) {
  return this._list[user_id];
};

users.getAll = function() {
  return this._list;
};

module.exports = users;
and somewhere in your implementation:
var users = require('./users');
For your problem where you want to send to all users named "BRIAN", I can see two good ways to do it.
First.
When a user connects to the Socket.io server, let the user join a Socket.io room named after his/her name,
so it will look like this:
var custom_namespace = io.of('/custom_namespace');
custom_namespace.on('connection', function(client_socket) {
  // assuming here is where you send data from the frontend to the server
  client_socket.on('user_data', function(data) {
    // assuming you have sent a valid object with a "name" property, let the client socket join the room
    if (data != undefined) {
      client_socket.join(data.name); // here is the trick
    }
  });
});
Now, if you want to send to all people named "BRIAN", you can achieve it by doing this:
io.of('/custom_namespace').to('BRIAN').emit('some_event', some_data);
Second.
By saving the data in the users module and filtering it with the lodash library.
Sample code:
var _lodash = require('lodash');
var Users = require('./users');

var all_users = Users.getAll();
var socket_ids = [];
var users_with_name_brian = _lodash.filter(all_users, { name: "BRIAN" });

users_with_name_brian.forEach(function(user) {
  socket_ids.push(user.id); // assuming each stored user keeps its socket id in `id`
});
Now, instead of emitting one by one per iteration, you can do it like this in Socket.io:
io.of('/custom_namespace').to(socket_ids).emit('some_event', some_data);
Here is the link for lodash documentation
I hope this helps.
I am new to Meteor. I am using the following code to read a file stored on the server.
Client side
Meteor.call('parseFile', (err, res) => {
  if (err) {
    alert(err);
  } else {
    Session.set("result0", res[0]);
    Session.set("result1", res[1]);
    Session.set("result2", res[2]);
  }
});
let longitude = Session.get("result0");
let latitude = Session.get("result1");
var buildingData = Session.get("result2");
Server Side
Meteor.methods({
  'parseFile'() {
    var csv = Assets.getText('buildingData.csv');
    var rows = Papa.parse(csv).data;
    return rows;
  }
});
The problem is that the call takes time to send the result back, so wherever I am using latitude and longitude I get undefined and the page breaks. So, is there any solution to avoid this problem? One solution could be to make a synchronous call and wait for the result to be returned.
You can make the server method run synchronously using the futures package, which should force the client to wait for the method to complete.
It might look something like this:
var Future = Npm.require('fibers/future'); // load the futures package on the server

Meteor.methods({
  'parseFile'() {
    var future = new Future();
    var csv = Assets.getText('buildingData.csv');
    var rows = Papa.parse(csv).data;
    future.return(rows);
    return future.wait(); // the value returned here is what the client receives
  }
});
This would require installing the futures package linked above and setting up your includes properly in the file containing your Meteor.methods() definitions. You might also look into good error handling inside your method.
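As a rough sketch of that error handling (the error code and message below are made up for illustration):
Meteor.methods({
  'parseFile'() {
    try {
      var csv = Assets.getText('buildingData.csv');
      return Papa.parse(csv).data;
    } catch (e) {
      // Surface a clean, client-safe error instead of a raw server exception
      throw new Meteor.Error('parse-failed', 'Could not read or parse buildingData.csv');
    }
  }
});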
UPDATE:
The Future package is an NPM package, which you can read about here. The link above is to the Atmosphere package, which looks like an old wrapper package.
I'm new at JavaScript and I've hit a wall hard here. I don't even think this is a Sequelize question; it's probably more about JavaScript behavior.
I have this code:
sequelize.query(query).success(function(row) {
  console.log(row);
});
The var row returns the value(s) that I want, but I have no idea how to access them other than printing to the console. I've tried returning the value, but it isn't returned to where I expect it and I'm not sure where it goes. I want my row, but I don't know how to obtain it :(
Using JavaScript on the server side like that requires that you use callbacks. You cannot "return" the results like you want; you can, however, write a function to perform actions on them.
sequelize.query(query).success(function(row) {
  // Here is where you do your stuff on row
  // End the process
  process.exit();
});
A more practical example, in an express route handler:
// Create a session
app.post("/login", function(req, res) {
  var username = req.body.username,
      password = req.body.password;

  // Obviously, do not inject this directly into the query in the real
  // world ---- VERY BAD.
  return sequelize
    .query("SELECT * FROM users WHERE username = '" + username + "'")
    .success(function(row) {
      // Also - never store passwords in plain text
      if (row.password === password) {
        req.session.user = row;
        return res.json({success: true});
      }
      else {
        return res.json({success: false, incorrect: true});
      }
    });
});
Ignore the injection and plain-text password issues in the example - kept that way for brevity.
Functions act as "closures" by storing references to any variable in the scope the function is defined in. In my above example, the correct res value is stored for reference per request by the callback I've supplied to sequelize. The direct benefit of this is that more requests can be handled while the query is running, and once it's finished, more code will be executed. If this weren't the case, then your process (assuming Node.js) would wait for that one query to finish, blocking all other requests. This is not desired. The callback style is such that your code can do what it needs and move on, waiting for important or processor-heavy pieces to finish up and call a function once complete.
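A tiny illustration of that closure behavior, independent of Sequelize (the delay and names are just for demonstration):
function handleRequest(res) {
  var startedAt = Date.now();
  // This callback closes over res and startedAt; even though handleRequest
  // has already returned by the time it fires, both are still reachable.
  setTimeout(function() {
    res.end('query finished after ' + (Date.now() - startedAt) + 'ms');
  }, 100);
  // handleRequest returns immediately - nothing blocks while the timer waits.
}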
EDIT
The API for handling callbacks has changed since answering this question. Sequelize now returns a Promise from .query so changing .success to .then should be all you need to do.
According to the changelog
Backwards compatibility changes:
Events support have been removed so using .on('success') or .success()
is no longer supported. Try using .then() instead.
According to the Raw queries documentation, you would use something like this now:
sequelize.query("SELECT * FROM `users`", { type: sequelize.QueryTypes.SELECT})
.then(function(users) {
console.log(users);
});
I have been searching for this particular problem for the past week, and since I couldn't find any information on the subject (that wasn't outdated), I just decided to work on other things. But now I am at the point where I need to be able to send data (that I constructed) to specific clients, using their ID, who are connected to my server using Node.js and Socket.io. I already store the ID in an object for each new connection. What I need to know is a way to send it to just a connection ID I choose.
Something like: function send(data, client.id) {};
I am using an http server, not TCP.
Is this possible?
edit:
server = http_socket.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.end(respcont);
  client_ip_address = req.header('x-forwarded-for');
});

socket = io.listen(1337); // listen

//=============================================
// Socket event loop
//=============================================
socket.on('connection', function (client_connect) {
  var client_info = new client_connection_info(); // this just holds information
  client_info.addNode(client_connect.id, client_connect.remoteAddress, 1); // 1 = trying to connect
  var a = client_info.getNode(client_connect.id, null, null).socket_id; // an object holding the information; this function just gets the socket_id
  client_connect.socket(a).emit('message', 'hi');

  client_connect.on('message', function (data) {
  });

  client_connect.on('disconnect', function () {
  });
});
Solution: I figured it out by just experimenting... What you have to do is make sure you store the SOCKET, not the socket.id (like I was doing), and use that instead.
client_info.addNode(client_connect.id, client_connect.remoteAddress, client_connect, 1)
var a = client_info.getNode(client_connect.id,null,null,null).socket;
a.emit('message', 'hi');
If you need to do this, the easiest thing to do is to build and maintain a global associative array that maps ids to connections: you can then look up the appropriate connection whenever you need and just use the regular send functions. You'll need some logic to remove connections from the array, but that shouldn't be too painful.
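A rough sketch of that bookkeeping with Socket.io (the names here are illustrative, not from the question):
var sockets_by_id = {}; // socket id -> socket

io.sockets.on('connection', function (socket) {
  sockets_by_id[socket.id] = socket;

  socket.on('disconnect', function () {
    delete sockets_by_id[socket.id]; // drop dead connections
  });
});

// later, anywhere you know the id:
function send(data, id) {
  var target = sockets_by_id[id];
  if (target) {
    target.emit('message', data);
  }
}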
Yes, it is possible.
io.sockets.socket(id).emit('message', 'data');
Your solution has one major drawback: scaling. What will you do when your app needs more than one machine? Using internal IDs can also be difficult. I advise using external IDs (like usernames).
Similarly to this post, I advise using rooms (together with Redis to allow scaling). Add a private room for every new socket (based on the user's name, for example). The code may look like this:
socket.join('priv/' + name);
io.sockets.in('priv/' + name).emit('message', { msg: 'hello world!' });
This solution allows multiple machines to emit events to any user. Also it is quite simple and elegant in my opinion.
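For the multi-machine part, a sketch of wiring up the Redis adapter (the socket.io-redis package and the options shown are an assumption and depend on your Socket.io version):
var io = require('socket.io')(3000);
var redisAdapter = require('socket.io-redis');

// With this adapter in place, io.sockets.in('priv/' + name).emit(...) reaches
// sockets connected to any Node process that shares the same Redis instance.
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));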
The quick overview is this: for my web app I can write most of my functionality using CouchApp and CouchDB views, etc. I love the feature of CouchApp that pushes my code up to the server via replication- this makes the deployment cycle very easy.
However, to do some arbitrary work not supported in CouchDB and to work around a few limitations, I need to put a web platform in front of CouchDB. I'm considering building this in Node.js because it uses JavaScript and I want to continue the easy deployment method of pushing code into the database.
Here's how I imagine it working:
- I write a web server/service in Node.js using the normal method and the node command to start it.
- This service connects to CouchDB and gets a virtual list and a URL mapping list. This list is stored in Redis for quick lookup, and it tells the server, when it gets a request, which handler to run based on host, path, etc.
- The server fetches the handler - which is just a document; it could be a design document or an arbitrary JSON document in CouchDB - and then executes that handler to handle the request, as if I'd written the handler as part of the Node.js code.
So the question is: how do I get a JSON data structure that contains a JavaScript function in it, in text form, and execute that function?
This may be blindingly obvious, but I come from a compiled background, so normally there would be a compilation step here that makes this pretty much impossible.
So, what I'm thinking is in pseudo code:
Var string thecode = getValueForMapKey(handlerFunctionIWant);
somehowmagicallyexecute(thecode)
Is there an exec or run function that will do the magical execution step above in JavaScript?
It will run in the node.js context.
You can also use it in node, like this, as a dynamic function:
var cradle = require('cradle');
var db = new(cradle.Connection)().database('db_name');

db.get('_design/node%2Fyour_code', function (err, doc) {
  if (!err) {
    var your_code = new Function(doc['arguments'].join(','), doc.code);
    your_code("cool", "also cool");
  } else {
    console.error('error:', err);
  }
});
make your docs look like this:
{
  "_id": "_design/node/your_code",
  "arguments": [
    "nameOfArg1",
    "nameOfArg2"
  ],
  "code": "console.log('arg1', nameOfArg1); console.log('arg2', nameOfArg2);"
}
It's in the same scope as where the new Function is called, so you have access to cradle, or you can require other libs, which will be loaded as if it was an anon function in that scope.
Put it in a design doc, then only admin can make changes, out of the box.
Here is a nicer, but similar approach:
// Format, in db:
doc = {
  "_id": "_design/node",
  "your_function_name": {
    "arguments": [
      "nameOfArg1",
      "nameOfArg2"
    ],
    "code": "console.log('arg1', nameOfArg1); console.log('arg2', nameOfArg2);"
  },
  "your_other_function_name": {
    "arguments": [
      "name"
    ],
    "code": "console.log('hello', name, 'how\\'s it going, bro?');"
  }
};
var cradle = require('cradle');
var db = new(cradle.Connection)().database('db_name');

function run_from_db(name, args) {
  db.get('_design/node', function (err, doc) {
    if (!err) {
      if (doc[name] !== undefined) {
        var fn = new Function(doc[name]['arguments'].join(','), doc[name].code);
        fn.apply(fn, args);
      } else {
        console.error("could not find", name, "in _design/node");
      }
    } else {
      console.error(err);
    }
  });
}
run_from_db('your_other_function_name', ['konsumer']);
this will output:
hello konsumer how's it going, bro?
eval(handlerFunctionIwant) is the call to execute it. You need to make sure that there's no way for hackers to inject code into that string, of course.
It is not clear to me if this will evaluate it in a context that has access to other JavaScript resources, such as the rest of Node.js or access to your CouchDB library.
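For what it's worth, eval() runs in the calling scope, so the evaluated string sees whatever is visible there (require, your CouchDB client, and so on). A small sketch with made-up variable names:
var cradle = require('cradle'); // any lib already loaded in this scope

var thecode = "console.log(typeof cradle); console.log('handler ran');";
eval(thecode); // the evaluated code can see `cradle` and `require` from this scope

// If you want more control over what the string can touch, Node's built-in vm
// module lets you run it against an explicit sandbox instead:
var vm = require('vm');
vm.runInNewContext(thecode, { console: console, cradle: cradle });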