I'm wondering to what extent we should include in an API values that can be calculated from the raw data plus other available information (from the browser session, the user interface, or whatever).
For example, we could have an API returning this JSON:
{
owner_id: '123',
canAddComment: true,
etc...
}
That gives us the value "canAddComment" directly.
Or we could have just this:
{
owner_id: '123',
etc...
}
Here, canAddComment can be calculated from the owner_id. For example, a user can add comments if the session owner_id is different from the one received from the API.
We could do this in JavaScript on the client instead:
//supposing we have a session object with our browser session values
var params = {owner_id: session.owner_id};
$.post(apiURL, params, function(data){
var result = JSON.parse(data);
//processing the data
result["canAddComments"] = result.owner_id !== session.owner_id;
//do whatever with it
});
What would be the best approach? What's the recommendation in this kind of case?
I'm not exactly sure what you want to know, but my first reaction is to make a function out of it.
// constructor
function Comment() {
this.owner_id = null;
this.canAddComment = function() {
return session.user_id === this.owner_id;
};
}
var session = {
user_id: 22,
name: 'Kevin'
};
var my_comment = new Comment();
my_comment.owner_id = 15;
if(my_comment.canAddComment()) {
$.ajax({
// now read the form and submit the data ...
})
}
else {
alert('Only the author of this message can update ...');
}
EDIT:
My question was mainly whether this should be calculated on the backend or computed on the client after retrieving the API data.
Not sure if there is a generic answer. A few arguments: if you want the rule to be secret, you must compute it on the server side. In other cases, I think it makes perfect sense to let the client use its own processing power.
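For instance, if the canAddComment rule from the question is something you'd rather not expose, a minimal server-side sketch (assuming an Express handler with session middleware; loadPost is a hypothetical data-access helper) could attach the derived flag before responding:
// hypothetical Express handler; assumes req.session.owner_id is populated by session middleware
app.get('/api/post/:id', function (req, res) {
  var post = loadPost(req.params.id); // placeholder for your data access
  res.json({
    owner_id: post.owner_id,
    // derived on the server, so the rule itself never ships to the client
    canAddComment: post.owner_id !== req.session.owner_id
  });
});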
Okay, one example: sorting. Let's say there is a table full of data; clicking on the header sorts the table by that column. Does it make sense to send all the data to the server, let it sort the table and return an array of keys? No, I think it makes more sense to let the client process this.
http://tablesorter.com/docs/
Similarly with Google Maps: you drag a marker to some place, then the program must calculate the closest 5 bus stops. Obviously, you calculate all of this on the client side.
I have an app that uses Firebase, pretty much the whole stack: functions, database, storage, auth, messaging, the whole 9 yards. I want to keep the client end very lightweight. So if a user comments on a post and "tags" another user, say using the typical "#username" style tagging, I moved all of the heavy lifting into Firebase functions; that way the client doesn't have to figure out the user ID from the username and do everything else. It is set up using triggers, so when the above scenario happens I write to a "table" called "create_notifications" with some data like
{
type: "comment",
post_id: postID,
from: user.getUid(),
comment_id: newCommentKey,
to: taggedUser
}
Here taggedUser is the username, postID is the active post, newCommentKey is retrieved from .push() on the comments db reference, and user.getUid() comes from the Firebase auth class.
Now in my Firebase functions I have an "onWrite" trigger for that specific table that gets all of the relevant information and sends a notification to the author of the post with all the relevant details. All of that works. What I am trying to figure out is: how do I delete the incoming event, so that I don't need any sort of cron job to clear out this table? I want to grab the event, do my calculations and data gathering, send the message, then delete the incoming event so it never really exists in the database except for the small amount of time it took to gather the data.
A simplified sample of the firebase functions trigger is...
exports.createNotification = functions.database.ref("/create_notifications/{notification_id}").onWrite(event => {
const from = event.data.val().from;
const toName = event.data.val().to;
const notificationType = event.data.val().type;
const post_id = event.data.val().post_id;
var comment_id;
if(notificationType == "comment") {
comment_id = event.data.val().comment_id;
}
const toUser = admin.database().ref(`users`).orderByChild("username").equalTo(toName).once('value');
const fromUser = admin.database().ref(`/users/${from}`).once('value');
const referencePost = admin.database().ref(`posts/${post_id}`).once('value');
return Promise.all([toUser, fromUser, referencePost]).then(results => {
const toUserRef = results[0];
const fromUserRef = results[1];
const postRef = results[2];
var newNotification = {
type: notificationType,
post_id: post_id,
from: from,
sent: false,
create_on: Date.now()
}
if(notificationType == "comment") {
newNotification.comment_id = comment_id;
}
return admin.database().ref(`/user_notifications/${toUserRef.key}`).push().set(newNotification).then(() => {
//NEED TO DELETE THE INCOMING "event" HERE TO KEEP DB CLEAN
});
})
});
So, in the final "return" of that function, after it writes the finalized data to the "/user_notifications" table, I need to delete the event that started the whole thing. Does anyone know how to do that? Thank you.
First off, use .onCreate instead of .onWrite. You only need to handle each child when it is first written, so this will avoid undesirable side effects. See the documentation here for more information on the available triggers.
event.data.ref holds the reference where the event occurred. You can call remove() on that reference to delete it:
return event.data.ref.remove()
The simplest way to achieve this is by calling the remove() function offered by the Admin SDK.
You can get the notification_id through the event, i.e. event.params.notification_id, and then remove the node when needed with admin.database().ref('pass in the path').remove(); and you are good to go.
For newer versions of Firebase, use:
return change.after.ref.remove()
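Putting the two suggestions together for the newer API, a rough sketch (processNotification is a placeholder for the lookup-and-write logic already shown in the question) might look like:
// firebase-functions v1.0+ style: onCreate hands you (snapshot, context)
exports.createNotification = functions.database
  .ref('/create_notifications/{notification_id}')
  .onCreate((snapshot, context) => {
    const data = snapshot.val();
    // processNotification is a placeholder for resolving users and writing
    // to /user_notifications, as in the question's original function
    return processNotification(data)
      .then(() => snapshot.ref.remove()); // then delete the trigger record
  });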
Dear all,
I've been working with JS for some weeks now and I need a bit of clarification. I have read a lot of sources, and a lot of Q&A here as well, and this is what I have learned so far.
Everything below is in connection with Node.js and Socket.io
Use of globals in Node.js "can" be done, but it is not best practice; in short: DON'T DO IT!
With sockets, everything is handled per socket call, meaning there is hardly any memory of a previous call. A call comes in and gets served, so there are no "kept" variables.
OK, I built up a chat example: multiple users, all served with broadcast, but no private messages, for example.
Fairly simple and fairly OK. But now I am stuck in my mind and can't wrap my head around the next step.
Lets say:
I need to act on the request
Like a request: "To all users whose name is BRIAN"
In my head I imagined:
1.
A custom object USER, defined globally in Node.js:
function User(socket) {
this.Name = null;
this.socket = socket;
}
2.
Then hold an ARRAY of these globally:
users = [];
and on newConnection, create a new User, pass in its socket and store it in the array for further use with
users.push(new User(socket));
3.
And on a Socket.io request that wants to contact all BRIANs, do something like:
for (var i = 0; i < users.length; i++) {
if(user[i].Name == "BRIAN") {
// Emit to user[i].socket
}}
But after trial and error, debugging, googling and reading, apparently this is NOT how something like this should be done, and somehow I can't find the right way to do it, or at least see/understand it. Can you please help me, point me in a good direction, or propose a best practice here? That would be awesome :-)
Note:
I don't want to store the data in a DB (that is the next step); I want to work with it on the fly.
Thank you very much for your inputs
Oliver
First of all, please don't put users in a global variable; better to put them in a module and require it wherever needed. You can do it like this:
users.js
var users = {
_list : {}
};
users.create = function(data){
this._list[data.id] = data;
};
users.get = function(user_id){
return this._list[user_id];
};
users.getAll = function(){
return this._list;
};
module.exports = users;
and somewhere in your implementation:
var users = require('./users');
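A hypothetical usage from the connection handler, just to show the shape of the stored data (the socket_id field and the 'user_data' event are my assumptions, mirroring the snippet further below):
var users = require('./users');

io.on('connection', function (socket) {
  socket.on('user_data', function (data) {
    // keep whatever you need to find and contact this user later
    users.create({ id: socket.id, name: data.name, socket_id: socket.id });
  });
  // you would probably also want to clean the entry up on 'disconnect'
});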
For your problem, where you want to send to all users with the name "BRIAN", I can see two good ways to do this.
First.
When a user connects to the Socket.IO server, let the user join a Socket.IO room named after his/her name.
so it will look like this:
var custom_namespace = io.of('/custom_namespace');
custom_namespace.on('connection', function(client_socket){
//assuming here is where you send data from frontend to server
client_socket.on('user_data', function(data){
//assuming you have sent a valid object with a parameter "name", let the client socket join the room
if(data != undefined){
client_socket.join(data.name); //here is the trick
}
});
});
Now, if you want to send to all people with the name "BRIAN", you can achieve it by doing this:
io.of('/custom_namespace').to('BRIAN').emit('some_event', some_data);
Second.
By saving the data in the users module and filtering it with the lodash library.
sample code
var _lodash = require('lodash');
var Users = require('users');
var all_users = Users.getAll();
var socket_ids = [];
var users_with_name_brian = _lodash.filter(all_users, { name : "BRIAN" });
users_with_name_brian.forEach(function(user){
socket_ids.push(user.socket_id); // assuming each stored user record keeps its socket id
});
Now, instead of emitting one by one per iteration, you can do it like this in Socket.IO (socket ids can be used as room names):
io.of('/custom_namespace').to(socket_ids).emit('some_event', some_data);
Here is the link for lodash documentation
I hope this helps.
In the Parse JavaScript guide, on the subject of Relational Data it is stated that
By default, when fetching an object, related Parse.Objects are not
fetched. These objects' values cannot be retrieved until they have
been fetched.
They also go on to state that when a relation field exists on a Parse.Object, one must use the relation's query().find() method. The example provided in the docs:
var user = Parse.User.current();
var relation = user.relation("likes");
relation.query().find({
success: function(list) {
// list contains the posts that the current user likes.
}
});
I understand how this is a good thing, in terms of SDK design, because it prevents one from potentially grabbing hundreds of related records unnecessarily. Only get the data you need at the moment.
But, in my case, I know that there will never be a time when I'll have more than say ten related records that would be fetched. And I want those records to be fetched every time, because they will be rendered in a view.
Is there a cleaner way to encapsulate this functionality by extending Parse.Object?
Have you tried using include("likes")?
I'm not as familiar with the JavaScript API as the ObjC API, so in the example below I'm not sure if "objectId" is the actual key name you need to use...
var user = Parse.User.current();
var query = new Parse.Query(Parse.User);
query.equalTo("objectId", user.id);
query.include("likes");
query.find({
success: function(user) {
// Do stuff
}
});
In general, you want to think about reversing your relationship. I'm not sure it is a good idea to be adding custom values to the User object. Think about creating a Like type and have it point to the user instead.
Example from Parse docs:
https://parse.com/docs/js_guide#queries-relational
var query = new Parse.Query(Comment);
// Retrieve the most recent ones
query.descending("createdAt");
// Only retrieve the last ten
query.limit(10);
// Include the post data with each comment
query.include("post");
query.find({
success: function(comments) {
// Comments now contains the last ten comments, and the "post" field
// has been populated. For example:
for (var i = 0; i < comments.length; i++) {
// This does not require a network access.
var post = comments[i].get("post");
}
}
});
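Applying the same include() pattern to the reverse-relationship idea above, a rough sketch (assuming a hypothetical Like class with pointer fields user and post) could be:
var Like = Parse.Object.extend("Like");
var query = new Parse.Query(Like);
// likes that point at the current user, with the liked post pulled in as well
query.equalTo("user", Parse.User.current());
query.include("post");
query.find({
  success: function(likes) {
    // each like's "post" field is already populated; no extra fetch needed
  }
});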
Parse.Object's fetch(options) (which returns a Parse.Promise), combined with Parse.Promise's always(callback), is the key.
We can override the fetch method when extending Parse.Object so that it always retrieves the relation's objects.
For example, consider the following case, where we want to retrieve a post and its comments (let's assume this is happening inside a view that wants to render the post and its comments):
var Post = Parse.Object.extend("Post"),
postsQuery = new Parse.Query(Post),
myPost;
postsQuery.get("xWMyZ4YEGZ", {
success: function(post) {
myPost = post;
}
}).then(function(post) {
post.relation("comments").query().find({
success: function(comments) {
myPost.comments = comments;
}
});
});
If we had to do this every time we wanted to get a post and its comments, it would get very repetitive and very tiresome, and we wouldn't be DRY, copying and pasting some 15 lines of code every time.
So, instead, let's encapsulate that by extending Parse.Object and overriding its fetch function, like so:
/*
models/post.js
*/
window.myApp = window.myApp || {};
window.myApp.Post = Parse.Object.extend("Post", {
fetch: function(options) {
var _arguments = arguments;
this.commentsQuery = this.relation("comments").query();
return this.commentsQuery.find({
success: (function(_this) {
return function(comments) {
return _this.comments = comments;
};
})(this)
}).always((function(_this) {
return function() {
return _this.constructor.__super__.fetch.apply(_this, _arguments);
};
})(this));
}
});
Disclaimer: you have to really understand how closures and IIFEs work in order to fully grok how the above works, but here's what happens, at a descriptive level, when fetch is called on an existing Post:
Attempt to retrieve the post's comments and assign them to the post's comments attribute
Regardless of the outcome of the above operation (whether it fails or not), always perform the post's default fetch operation and invoke all of that operation's callbacks
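With that override in place, a call site stays short. A hypothetical usage (renderPost is just a placeholder) might be:
var post = new window.myApp.Post();
post.id = "xWMyZ4YEGZ"; // an existing object id
post.fetch({
  success: function(fetchedPost) {
    // the comments attribute was populated by the overridden fetch
    renderPost(fetchedPost, fetchedPost.comments);
  }
});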
I am very new to MongoDB and have a basic question that I am having trouble with. How do I get the ID field of a document that has already been created? I need the ID so I can update/add a new field to the document.
//newProfile is an object, one string it holds is called school
if(Schools.find({name: newProfile.school}).fetch().length != 1){
var school = {
name: newProfile.school
}
Meteor.call('newSchool', school);
//Method 1 (doesn't work)
var schoolDoc = Schools.findOne({name: newProfile.school});
Schools.update({_id: schoolDoc._id}, {$set: {enrolledStudents: Meteor.user()}});
//Method 2?
//Schools.update(_id: <what goes here?>, {$push: {enrolledStudents: Meteor.user()}});
}
else {
//Schools.update... <add users to an existing school>
}
I create a new school document if the listed school does not already exist. Schools need to hold an array/list of students (this is where i am having trouble). How do I add students to a NEW field (called enrolledStudents)?
Thanks!
I'm having some trouble understanding exactly what you're trying to do. Here's my analysis and understanding so far with a couple pointers thrown in:
if(Schools.find({name: newProfile.school}).fetch().length != 1){
this would be more efficient
if(Schools.find({name: newProfile.school}).count() != 1) {
Meteor.call('newSchool', school);
Not sure what you're doing here, but note that this will run asynchronously, meaning that by the time the rest of this block of code has executed, chances are this Meteor.call() has not yet completed on the server side.
//Method 1 (doesn't work)
var schoolDoc = Schools.findOne({name: newProfile.school});
Schools.update({_id: schoolDoc._id}, {$set: {enrolledStudents: Meteor.user()}});
Judging by the if statement at the top of your code, there may be zero or several schools with this name in the database, so I'm unsure whether the schoolDoc variable is the record you're after.
I believe you are having trouble because of the asynchronous nature of Meteor.call on the client.
Try doing something like this:
// include on both server and client
Meteor.methods({
newSchool: function (school) {
var newSchoolId,
currentUser = Meteor.user();
if (!currentUser) throw new Meteor.Error(403, 'Access denied');
// add some check here using the Meteor check/match function to ensure 'school'
// contains proper data
try {
school.enrolledStudents = [currentUser._id];
newSchoolId = Schools.insert(school);
return newSchoolId;
} catch (ex) {
// handle appropriately
}
}
});
// on client
var schoolExists = false;
if (Schools.findOne({name: newProfile.school})) {
schoolExists = true;
}
if (!schoolExists) {
var school = {
name: newProfile.school
};
Meteor.call('newSchool', school, function (err, result) {
if (err) {
alert('An error occurred...');
} else {
// result is now the _id of the newly inserted record
}
})
} else {
}
Including the method on both the client and the server allows Meteor to do latency compensation and 'simulate' the insert immediately on the client without waiting for the server round-trip. But you could also just keep the method on the server-side.
You should do the enrolledStudents part on the server to prevent malicious users from messing with your data. Also, you probably don't want to store the entire user object in the enrolledStudents array, just the user _id.
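As a sketch of that last point, a server-only method (enrollInSchool is a hypothetical name; Schools and the enrolledStudents field come from the question) could look like:
// server only
Meteor.methods({
  enrollInSchool: function (schoolName) {
    check(schoolName, String);
    if (!this.userId) throw new Meteor.Error(403, 'Access denied');
    // store just the user's _id; $addToSet avoids duplicate enrollments
    Schools.update({ name: schoolName }, { $addToSet: { enrolledStudents: this.userId } });
  }
});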
For what you're trying to do, there is no need to get the _id. When you use update, just switch out the {_id: schoolDoc._id} with your query. Looks like using {name: newProfile.school} will work, assuming that the rest of your code does what you want it to do.
While that would work with the normal Mongo driver, I see that Meteor does not allow a client-side update selector to be anything but an _id: Meteor throws a throwIfSelectorIsNotId exception.
First, make sure that you're pulling the right document, and you can try something like this:
var school_id = Schools.findOne({name: newProfile.school})._id;
Schools.update({_id: school_id}, { $push: { enrolledStudents: Meteor.user()}});
If that doesn't work, you'll have to do a little debugging to see what in particular about it isn't working.
I am working on a messaging system using node.js + cradle and couchdb.
When a user pulls a list of their messages, I need to pull the online status of the user that sent them the message. The online status is stored in the user document for each registered user, and the message info is stored in a separate document.
Here is the only way I can manage to do what I need, but it's hugely inefficient:
The privatemessages/all view is keyed by the username of the message recipient:
db.view('privatemessages/all', {"key":username}, function (err, res) {
res.forEach(function (rowA) {
db.view('users/all', {"key":rowA.username}, function (err, res) {
res.forEach(function (row) {
result.push({onlinestatus:row.onlinestatus, messagedata: rowA});
});
});
});
response.end(JSON.stringify(result));
});
Can someone tell me the correct way of doing this?
Thank you
Your code could return an empty result because you call response at a time when the user statuses may not have been fetched from the DB yet. Another problem is that if I received multiple messages from the same user, the call for his status may be duplicated. Below is a function which first fetches the messages from the DB, avoiding duplicate users, and then gets their statuses.
function getMessages(username, callback) {
// this would be "buffer" for senders of the messages
var users = {};
// variable for a number of total users I have - it would be used to determine
// the callback call because this function is doing async jobs
var usersCount = 0;
// helpers vars
var i = 0, user, item;
// get all the messages which recipient is "username"
db.view('privatemessages/all', {"key":username}, function (errA, resA) {
// for each of the message
resA.forEach(function (rowA) {
user = users[rowA.username];
// if user doesn't exists - add him to users list with current message
// else - add current message to existing user
if(!user) {
users[rowA.username] = {
// I guess this is the name of the sender
name: rowA.username,
// here will come his current status later
status: "",
// in this case I may only need content, so there is probably
// no need to insert whole message to array
messages: [rowA]
};
usersCount++;
} else {
user.messages.push(rowA);
}
});
// I should have all the senders with their messages
// and now I need to get their statuses
// iterate with forEach so each callback closes over its own "item";
// a plain for..in here would see only the last key by the time db.get returns
Object.keys(users).forEach(function(item) {
// assuming that user documents have keys based on their names
db.get(item, function(err, doc) {
i++;
// assign user status
users[item].status = doc.onlineStatus;
// when the status of the last user has been fetched, it's time to
// execute the callback and return my results
if(i === usersCount) {
callback(users);
}
});
});
});
}
...
getMessages(username, function(result) {
response.end(JSON.stringify(result));
});
Although CouchDB is a great document database, you should be careful with frequent updates of existing documents, because each update creates an entirely new document revision (this is a consequence of its MVCC model, which is used to achieve high availability and data durability). The consequence of this behavior is higher disk space consumption (more data/updates, more disk space needed - example), so you should keep an eye on it and run database compaction accordingly.
I think your system could use an in-memory hash map like memcached. Each user status entry would expire after a time limit.
Mapping would be
[user -> lasttimeseen]
If the hashmap contains the user, then the user is online.
On certain actions, refresh lasttimeseen.
Then, instead of pinging the whole world each time, just query the map itself and return the result.
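Even without memcached, a tiny in-memory map with a timestamp gives the same behaviour; a sketch (the 5-minute window is an arbitrary choice):
var lastSeen = {};                      // username -> timestamp in ms
var ONLINE_WINDOW = 5 * 60 * 1000;      // treat users seen within 5 minutes as online

// call this on login, on sending a message, etc.
function touch(username) {
  lastSeen[username] = Date.now();
}

function isOnline(username) {
  var t = lastSeen[username];
  return !!t && (Date.now() - t) < ONLINE_WINDOW;
}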
I'm reminded of this presentation:
Databases Suck for Messaging
And its quote from Tim O'Reilly:
"On monday friendfeed polled flickr nearly 3 million times for 45000 users, only 6K of whom were logged in. Architectural mismatch."
As pointed out in the other answers, updates in CouchDB are expensive and should be avoided if possible, and there's probably no need for this data to be persistent. A cache or messaging system may solve your problem more elegantly and more efficiently.