I have the following structure in my Firebase database:
{
  "gateways_pr": {
    "gateway_1": {
      "avisos": {
        "00": {
          "aviso_1": "0",
          "aviso_2": "0"
        },
        "01": {
          "aviso_1": "0",
          "aviso_2": "0"
        }
      }
    }
  }
}
I have a small JavaScript demo web page that listens for child_changed on gateways_pr/gateway_1/avisos:
var gateWayRef = firebase.database().ref("gateways_pr/gateway_1/avisos");
gateWayRef.on('child_changed', function(data) {
  console.log("CHILD_CHANGE");
  console.log(data.val());
  var datos = data.val();
  console.log(datos);
});
When I change, for example, gateway_1/avisos/00/aviso_1 and set it to 2, I can track the change with Chrome developer tools by looking at the WebSocket frames, and I receive:
{"t":"d","d":{"b":{"p":"gateways_pr/gateway_1/avisos/00/aviso_1","d":"2"},"a":"d"}}
So over the wire I'm only receiving the change that was made.
The problem is that, in my code, data.val() has the following value:
{aviso_1: "2", aviso_2: "0"}
Calling data.ref.path.toString() returns:
/gateways_pr/gateway_1/avisos/00
That means the Firebase API gives you everything below the child whose property changed ("00" in this case).
Is there any way of knowing what the change actually was (in this case it should return "aviso_1")?
The only workaround I've found so far is making my code listen on every child. In this case I would listen to gateways_pr/gateway_1/avisos/00 and gateways_pr/gateway_1/avisos/01, but if I add new entries to "avisos" I would have to start listening to them too, and in the end my program could end up listening to thousands of references.
When you attach a child_changed listener to gateways_pr/gateway_1/avisos, you're asking the Firebase client to inform you when something changes in a child under that level. If something changes on a lower level, the Firebase client will raise the child_changed event on the level that you registered for. There is no way to change this behavior.
If you need to know precisely what changed below the level of the listener, it typically means you've modeled the data incorrectly for your use case.
For example: if you want to listen for changes across the entire hierarchy, you should model a list of changes across the entire hierarchy and then attach a listener to that list. This is one of the many reasons that the Firebase documentation recommends keeping flat data structures.
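A minimal sketch of that idea, with a hypothetical flat "changes" list that is written alongside every update (the node name and the setAviso helper are made up for illustration, not part of the original question):

var changesRef = firebase.database().ref("gateways_pr/gateway_1/changes");

// Writer side: record exactly which field changed, together with the new value,
// in the same atomic multi-location update that changes the aviso itself
function setAviso(slot, field, value) {
  var updates = {};
  updates["gateways_pr/gateway_1/avisos/" + slot + "/" + field] = value;
  updates["gateways_pr/gateway_1/changes/" + changesRef.push().key] = {
    path: slot + "/" + field,
    value: value
  };
  return firebase.database().ref().update(updates);
}

// Listener side: each new entry tells you precisely what changed
changesRef.limitToLast(1).on("child_added", function (snapshot) {
  console.log("Changed:", snapshot.val().path, "->", snapshot.val().value);
});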
Related
I am using Firebase to make a checkout system where multiple users of the same store can overwrite/update a node at the same time.
Scenario/Steps:
User 1: Enters an item code into the system (and can enter multiple)
User 2: Enters an item code into the system (same as above)
User 3: Releases that item code from the system
User 4: ... and so on.
Code:
"-KhBgsi8HwT5BloV0Srt" : {
"lastUpdated" : 1504285854767,
"methodName" : "ITEM_ENTERED",
"payLoad": "{'Id':1, 'ItemName': 'Apples'}"
}
In the above code, whenever any user enters an item, I overwrite the above node with the methodName and payLoad of the item. And whenever a user releases an item from the system, I again update the same node by overwriting it like this:
"-KhBgsi8HwT5BloV0Srt" : {
"lastUpdated" : 1504285854767,
"methodName" : "ITEM_RELEASED",
"payLoad": "{'Id':1, 'ItemName': 'Apples'}"
}
All users are connected to the same Firebase node, which can be overwritten by the users above at the same time. So if several users perform an operation at the same time, the node only keeps the last write. How can I avoid that and make sure all users see the same data, not just the last write that happened on the node?
When users act quickly, the node gets mixed up between ITEM_ENTERED and ITEM_RELEASED methods and clients get out of sync.
I hope I made my point clear. I just need the right direction to fix these concurrent writes to the same node.
Any help is appreciated.
Thanks
The comments are getting long, but this seems to be a valid solution to your problem:
Just push a new node for every event, and make your listener return the last pushed node.
Rather than using child_changed, use child_added.
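A hedged sketch of that approach (the node name and the logEvent helper are assumptions, modeled on the structure in the question):

var eventsRef = firebase.database().ref("checkoutEvents");

// Writer: every scan or release becomes its own immutable event instead of
// overwriting a single shared node
function logEvent(methodName, payLoad) {
  return eventsRef.push({
    lastUpdated: firebase.database.ServerValue.TIMESTAMP,
    methodName: methodName, // "ITEM_ENTERED" or "ITEM_RELEASED"
    payLoad: payLoad
  });
}

// Reader: every client sees every event in order, so nothing is lost when
// several users act at the same time
eventsRef.limitToLast(1).on("child_added", function (snapshot) {
  console.log("Latest event:", snapshot.val());
});

Because push() generates chronologically ordered keys, concurrent writers never clobber each other; each action simply appends a new child.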
DATABASE:
SITUATION:
My website sells keys for a game.
A key is a randomly generated string of 20 characters whose uniqueness is guaranteed (not created by me).
When someone buys a key, NTWKeysLeft is read to find its first element. That element is then copied, deleted from NTWKeysLeft and pasted into NTWUsedKeys.
Said key is then displayed on the buyer's screen.
PROBLEM:
How can I prevent the following problem:
1) Two users buy the game at the exact same time.
2) They both read the same key from NTWKeysLeft (the first element in the list),
3) and thus both get the same key.
I know about Firebase Transactions already. I am looking for a pseudo-code/code answer that will point me in the right direction.
CURRENT CODE:
Would something like this work? Can I put a transaction inside another transaction?
var keyRef = admin.database().ref("NTWKeysLeft");
keyRef.limitToFirst(1).transaction(function (keySnapshot) {
  keySnapshot.forEach(function (childKeySnapshot) {
    // Key is read here:
    var key = childKeySnapshot.val();
    // How can I prevent two concurrent read requests from reading the same key?
    // Using a transaction to change a boolean could only happen after the read,
    // since I first need to read in order to know which key boolean to change.
    var selectedKeyRef = admin.database().ref("NTWKeysLeft/" + key);
    var usedKeyRef = admin.database().ref("NTWUsedKeys/" + key);
    var keysLeftRef = admin.database().ref("keysLeft");
    selectedKeyRef.remove();
    usedKeyRef.set(true);
    keysLeftRef.transaction(function (keysLeft) {
      if (!keysLeft) {
        keysLeft = 0;
      }
      keysLeft = keysLeft - 1;
      return keysLeft;
    });
    res.render("bought", {key: key});
  });
});
Just to be clear: keyRef.limitToFirst(1).transaction(function (keySnapshot) { does not work, but I would like to accomplish something to that effect.
Most depends on how you generate the keys, since that determines how likely collisions are. I recommend reading about Firebase's push IDs to get an idea how unique those are and compare that to your keys. If you can't statistically guarantee uniqueness of your keys or if statistical uniqueness isn't good enough, you'll have to use transactions to prevent conflicting updates.
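If you do go the transaction route, here is a minimal sketch (not the poster's code) that claims one key atomically by running the transaction over the whole NTWKeysLeft node; it assumes each key is stored as a child name under NTWKeysLeft, matching the question's layout, and that res is the Express response used by res.render in the question:

function claimKey(res) {
  var keysLeftRef = admin.database().ref("NTWKeysLeft");
  var claimedKey = null;

  keysLeftRef.transaction(function (keys) {
    claimedKey = null; // the update function may be retried, so reset each attempt
    if (keys) {
      var remaining = Object.keys(keys);
      if (remaining.length > 0) {
        claimedKey = remaining[0];
        delete keys[claimedKey]; // remove the claimed key inside the transaction
      }
    }
    return keys; // returning the (possibly modified) value commits it
  }, function (error, committed) {
    if (error || !committed || !claimedKey) {
      // handle "no keys left" or a failed transaction here
      return;
    }
    // Outside the transaction: record the sold key and show it to the buyer
    admin.database().ref("NTWUsedKeys/" + claimedKey).set(true);
    res.render("bought", { key: claimedKey });
  });
}

Because conflicting writes force the transaction to retry against the new value, two buyers can never both commit a claim on the same key.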
The OP has changed the question a bit, so I will update the answer as follows: I will leave the bottom part about transactions as it was and put the new update on top.
I can see two ways to proceed:
1) Handle the lock system on your own and use JavaScript callbacks or other mechanisms to prevent simultaneous access to a portion of the code.
or
2) Use Firebase transactions. In this case, I don't have the setup ready to share code other than the sample/pseudo code provided at the bottom of this answer.
With respect to option 1 above:
I have coded a use case and put it on Plunker. It uses JavaScript callbacks to queue users as they try to access the part of the code under lock.
I. A user comes in and is placed in a queue.
II. It then calls the callback function, which pops users on a first-come, first-out basis. I have the keys at the top of the page to be shared by the functions.
I have wired this to a button click event; when you click the button twice quickly, you will see that keys are assigned and that they are different keys.
To read this code, click on the script.js file on the left and read starting from the bottom of the page, where the functions are called.
Here is the sample code on Plunker. After opening it, click Run at the top of the page and then click the button on the right-hand side. An alert will pop up to show which key is given (note: there are two calls back to back to simulate two users coming in at the same time).
https://plnkr.co/edit/GVFfvqQrlLeMaKlo5FCj?p=info
The Firebase transactions:
Use Firebase transactions to prevent concurrent read/write issues. Below is the transaction() method signature:
transaction(transactionUpdate, onComplete, applyLocally) returns firebase.Promise containing { committed: boolean, snapshot: nullable firebase.database.DataSnapshot }
Note that transaction() needs the write/update function as its first parameter, and in your case it looks like you're removing a key upon success, hence the following function to be passed in place of a plain write.
Try this pseudo code:
// first, get a reference to your db
var selectedKeyRef = admin.database().ref("NTWKeysLeft/" + key);

// needed by transaction() as its first parameter; returning null from the
// update function removes the value at this location
function writeOperation(currentValue) {
  return null;
}

selectedKeyRef.transaction(writeOperation, function (error, committed, snapshot) {
  if (error) {
    console.log('Transaction failed abnormally!', error);
  } else if (!committed) {
    console.log('We aborted the transaction (because xyz).');
  } else {
    console.log('keyRemoved!');
  }
  console.log("showKey: ", snapshot.val()); // after a removal this logs null
}); // end of the transaction() method call
Docs: to see the parameters and return values of the transaction() method, see:
https://firebase.google.com/docs/reference/js/firebase.database.Reference#transaction
From the docs: if another client writes to the location before your new value is successfully written, your update function is called again with the new current value, and the write is retried.
https://firebase.google.com/docs/database/web/read-and-write#save_data_as_transactions
I don't think the problem you're worried about can happen. JavaScript, including Node, is single-threaded and can only do one thing at a time. If you had a big server infrastructure with more than one server running this code, then it would be possible, but for a single Node program, there's no problem.
Since none of the previous answers discussing the scope of Transactions worked out, I would suggest a different workaround.
Is it possible to trigger the unique code generation when someone buys a key? If so, you could generate the unique string when the "buy" button is clicked, display the ID and save the ID to your database.
Later the user enters the key in your game, which checks whether the ID is written in your database. This would probably also save a bit of data, since you do not need to keep track of the unique IDs before they are bought, and you will never run out of IDs, since they are only generated when necessary.
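A rough sketch of that idea, assuming the key can be generated server-side with Node's crypto module (the key format, the sellKey name and the node name are assumptions; the question says the keys currently come from elsewhere):

var crypto = require("crypto");

function sellKey(res) {
  // 20-character hexadecimal key, generated only when someone actually buys
  var key = crypto.randomBytes(15).toString("hex").slice(0, 20);
  return admin.database().ref("NTWUsedKeys/" + key).set(true).then(function () {
    res.render("bought", { key: key }); // the game later checks NTWUsedKeys/<key>
  });
}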
I want to make a homepage where several pieces of data are published, but only when the user first visits the page: they would get the latest 10 published articles, but that's it; the list won't keep changing.
Is there a way to make the inbuilt pub/sub mechanism turn itself off after a set amount of time or number of records, or another mechanism?
Right now I'm using a very simple setup that doesn't "turn off":
latestNews = new Mongo.Collection('latestNews');

if (Meteor.isClient) {
  Meteor.subscribe("latestNews");
}

if (Meteor.isServer) {
  Meteor.publish('latestNews', function() {
    return latestNews.find({}, {sort: { createdAt: -1 }, limit: 10});
  });
}
The pub/sub pattern as it is implemented in Meteor is all about reactive data updates. In your case that would mean if the author or last update date of an article changes then users would see this change immediately reflected on their home page.
However, you want to send the data once and never update it again.
Meteor has built-in functionality to handle this scenario: methods. A method is a way for the client to tell the server to execute computations and/or send plain, non-reactive data.
// Server code
var lastTenArticlesOptions = {
  sort: {
    createdAt: -1
  },
  limit: 10
}

Meteor.methods({
  'retrieve last ten articles': function() {
    return latestNews.find({}, lastTenArticlesOptions).fetch()
  }
})
Note that, contrary to publications, we do not send a Mongo.Cursor! Cursors are used in publications as a handy (aka magic) way to tell the server which data to send.
Here, we send the data directly by fetching the cursor to get an array of articles, which is then automatically EJSON-stringified and sent to the client.
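On the client, a minimal sketch of calling that method (storing the result in a Session variable is just one option, not part of the original answer):

// Client code: the result is a plain array of articles, not a reactive cursor
Meteor.call('retrieve last ten articles', function (error, articles) {
  if (error) {
    console.error(error);
    return;
  }
  Session.set('latestArticles', articles); // expose it to your templates however you like
});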
If you need to send reactive data to the client and at a later point in time to stop pushing updates, then your best bet is relying on a pub/sub temporarily, and then to manually stop the publication (server-side) or the subscription (client-side) :
Meteor.publish('last ten articles', function() {
  return latestNews.find({}, lastTenArticlesOptions)
})

var subscription = Meteor.subscribe('last ten articles')
//Later...
subscription.stop()
On the server-side you would store the publication handle (this) and then manipulate it.
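For example, a sketch of a publication that stops itself after a fixed delay (the timeout is arbitrary, purely to illustrate using the handle):

Meteor.publish('last ten articles', function () {
  var handle = this; // the publication handle
  Meteor.setTimeout(function () {
    handle.stop(); // stop pushing further updates to this subscriber
  }, 60 * 1000);
  return latestNews.find({}, lastTenArticlesOptions)
})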
Stopping a subscription or publication does not destroy the documents already sent (the user won't see the last ten articles suddenly disappear).
I have this AngularJS frontend, and I use Express, Node and Mongo on the backend.
My situation looks like this:
// my data to push to the server
$scope.things = [
  { title: "title" /* other properties */ },
  { title: "title" /* other properties */ },
  { title: "title" /* other properties */ }
];

$scope.update = function() {
  $scope.things.forEach(function(t) {
    Thing.create({
      title: t.title
      // other values here
    }, function() {
      console.log('Thing added');
    });
  });
};
// where Thing.create is just an $http.post factory
The HTML part looks like:
<!-- HTML part -->
<button ng-click="update()">Update Thing</button>
Then, on the same page, the user has the ability to change $scope.things, and my problem is that when I call update() again all the things are posted twice, because that's literally what I'm doing.
Can someone explain to me how to check whether a 'thing' is already posted to the server, so I only update its values ($http.put), and $http.post it if it's not yet on the server?
Or maybe there is another way to do this?
I see a few decisions to be made:
1) Should you send the request after the user clicks the "Update" button (like you're currently doing)? Or should you send the request when the user changes the Thing (using ngChange)?
2) If going with the button approach for (1), should you send a request for each Thing (like you're currently doing), or should you first check on the front end to see whether the Thing has been updated or newly created?
3) How can you deal with the fact that some Things are newly created and others are simply updated? Multiple routes? If so, then how do you know which route to send the request to? Same route? How?
1
To me, the upside of using the "Update" button seems to be that it's clear to the user how it works. By clicking "Update" (and maybe seeing a flash message afterwards), the user knows (and gets visual feedback) that the Things have been updated.
The cost of using the "Update" button is that there might be unnecessary requests being made. Network communication is slow, so if you have a lot of Things, making a request for each Thing could be notably slow.
Ultimately, this seems to be a UX vs. speed decision to me. It depends on the situation and goals, but personally I'd lean towards the "Update" button.
2
The trade-off here seems to be between code simplicity and performance. The simpler solution would just be to make a request for each Thing regardless of whether it has been updated or newly created (for the Things that previously existed and haven't changed, no harm will be done; they simply won't get changed).
The more complex but more performant approach would be to keep track of whether or not a Thing has been updated or newly created. You could add a flag called dirty to Things to keep track of this.
When a user clicks to create a new Thing, the new Thing would be given a flag of dirty: true.
When you query all Things from the database, they should all have dirty: false (whether you store the dirty property in the database or simply append it on the server/front end is up to you).
When a user changes an existing Thing, the dirty property would be set to true (a small sketch of this follows the snippet below).
Then, using the dirty property, you could make requests only for the Things that are dirty:
$scope.things.forEach(function(thing) {
  if (thing.dirty) {
    // make request
  }
});
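For the "user changes an existing Thing" case mentioned above, a small sketch of flipping the flag from the view, wired up with something like ng-change="markDirty(thing)" on the input (markDirty is a hypothetical helper, not from the original post):

$scope.markDirty = function(thing) {
  thing.dirty = true; // the next "Update" click will send a request for this Thing
};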
The right solution depends on the specifics of your situation, but I tend to err on the side of code simplicity over performance.
3
If you're using Mongoose, the default behavior is to add an _id field to created documents (it's also the default behavior of MongoDB itself). So if you haven't overridden this default behavior, and you aren't explicitly preventing the _id field from being sent back to the client, it should exist for Things that were previously created, thus allowing you to distinguish them from newly created Things (which won't have the _id field).
With this, you can conditionally call create or update like so:
$scope.things.forEach(function(thing) {
  if (thing._id) {
    Thing.update(thing._id, thing);
  } else {
    Thing.create(thing);
  }
});
Alternatively, you could use a single route that performs "create or update" for you. You can do this by setting { upsert: true } in your update call.
In general, upsert will check to see if a document matches the query criteria... if there's a match, it updates it, if not, it creates it. In your situation, you could probably use upsert in the context of Mongoose's findByIdAndUpdate like so:
Thing.findByIdAndUpdate(id, newThing, { upsert: true }, function(err, doc) {
...
});
See this SO post.
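To make the single-route idea concrete, here is a rough sketch of an Express endpoint built on findByIdAndUpdate with upsert (the route path, the server-side Mongoose model named Thing, the id generation and the response handling are assumptions, not from the original post):

var mongoose = require('mongoose');

app.put('/things/:id?', function (req, res) {
  // Use the provided id, or mint a new one so the upsert creates the document
  var id = req.params.id || new mongoose.Types.ObjectId();
  Thing.findByIdAndUpdate(id, req.body, { upsert: true, new: true }, function (err, doc) {
    if (err) return res.status(500).send(err);
    res.json(doc); // the client gets back the saved Thing, including its _id
  });
});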
@Adam Zemer neatly addressed the concerns I raised in a comment; however, I disagree on some points.
Firstly, to answer the question of having an update button or not, you have to ask yourself: is there any reason why the user would want to discard his changes and not save the work he did? If the answer is no, then it is clear to me that the update button should not be there, and here is why.
To prevent your user from losing his work, you would need to add confirmations if he attempts to change the page, close his browser, etc. On the other hand, if everything is continuously saved, he has the peace of mind that his work is always saved, and you don't have to implement anything to prevent him from losing his work.
You reduce his workload: one less click for a task may seem insignificant, but he might click it many times just to be sure his work is saved. Also, if it's a recurrent task, it will definitely improve his experience.
Performance-wise and code-readability-wise, you make small requests and do not have to implement any complicated logic to do so: just a simple ng-change on the inputs.
To make it clear that his work is continuously saved, you can simply display "all your changes are saved" somewhere and change it to "saving changes..." while a request is in flight. For example uses, look at Office Online or Google Docs.
Then all you have to do is use the upsert option in your MongoDB query to create and update your things with a single request. Here is how your controller would look:
$scope.update = function(changedThing) { // using ng-change, you pass the thing itself as a parameter
  $scope.saving = true; // to display the "saving changes..." message
  Thing.update({ // this service calls your method that updates with upsert
    title: changedThing.title
    // other values here
  }).then( // if you made an $http request, it returns a promise
    function success() {
      $scope.saving = false;
      console.log('Thing added');
    },
    function error() {
      // handle errors
    });
};
I'm having nightmares figuring out how to display in the UI what I have changed in my database. I have this scenario where I need to select certain titles and then click a button that changes their status.
The problem is that when I select titles and then click the change-status button, it doesn't automatically reflect in the UI. Here is my update function:
$scope.updateTitleStatus = function(statusId, cp) {
  ContentAssessmentService.updateSelectedTitles($scope.selectedTitles, statusId);
  $scope.selAll = !$scope.selAll;
  $scope.selectedTitles = [];
};
Here is my service.
this.updateSelectedTitles = function(selectedTitles, statusId) {
  var self = this;
  _.forEach(selectedTitles, function(selectedTitle) {
    ContentAssessmentFactory.updateSelectedTitleStatus(selectedTitle.id, statusId);
  });
};
Here is the array where the selected titles are stored:
$scope.selectedTitles = [];
Can you tell me how to use the $watch function? I don't know how to do it. I've tried this, but it doesn't work:
$scope.$watch(function($scope) {
  return $scope.selectedTitles;
}, function(newValue) {
  $scope.selectedTitles = newValue;
  console.log(newValue);
});
I just need to update the UI immediately, without refreshing the page (that's my last option, but I'm trying to avoid it), when I click the change-status button.
You are going to have to use polling or a WebSocket connection. $watch does not "watch" your database; it watches $scope variables that are usually bound to the view and reacts to changes there. It sounds like you are looking for something more like Meteor.js, which keeps an open WebSocket and dynamically updates the view when the database is changed by another client, a background process, etc. These are completely different things. To achieve this sort of behavior with Angular, the easiest approach would be to poll your API at an interval and update your Angular models when the API gives you modified data.
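A minimal polling sketch with Angular's $interval (the module name, endpoint URL and the 5-second interval are placeholders, not from the original post):

angular.module('app').controller('TitlesCtrl', function ($scope, $http, $interval) {
  function refreshTitles() {
    $http.get('/api/titles').then(function (response) {
      $scope.titles = response.data; // updating a $scope variable refreshes the view
    });
  }

  refreshTitles(); // initial load
  var poll = $interval(refreshTitles, 5000);

  $scope.$on('$destroy', function () {
    $interval.cancel(poll); // stop polling when the controller is torn down
  });
});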