Parse.com Add User Pointer to a class in before save - javascript

I have a table called Roster that stores a User pointer in one of its columns. I am trying to set that column in a beforeSave method when the object is new. So far I have this, but I am not sure how to get the user id:
Parse.Cloud.beforeSave("Roster", function(request, response) {
    // Update roster with sender Id
    if (request.object.isNew()) {
        var userPointer = /* NOT SURE HOW TO GET WHO SENT IT */
        request.object.set("User", userPointer);
    }
    response.success();
});
Any idea?

var userPointer = request.user;
is the proper method. Todd's answer works on the client side, but not in Cloud Code, where beforeSave triggers run.
If you need to access any of the user's information beyond their id, you'll have to fetch the user first, as the entire object is not sent in the request.
Edit - Just wanted to add that before/afterSave triggers have a 3 second timeout. That is enough time to perform a quick query or two, but if you have a lot of objects in your database, or perform many save/fetch/query calls, you may exceed the 3 second limit. If a lot of that logic needs to run, then rather than saving the object from the client, call a Cloud Code function that applies all of those changes, saves the object, and returns the newly saved object, so that the client can replace its local object with the returned, up-to-date one.
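For reference, a minimal sketch of the trigger using request.user (assuming the Roster column really is named User, as in the question, and that the save comes from a logged-in user):
Parse.Cloud.beforeSave("Roster", function(request, response) {
    if (request.object.isNew() && request.user) {
        // request.user is the user who issued the save request
        request.object.set("User", request.user);
    }
    response.success();
});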

As Jake T pointed out, you will need to use
var user = request.user
Parse.User.current() is not supported in a cloud code environment.

Related

Firebase Realtime Database difference between data snapshots

If I have a database structure like here and I make a query as shown below, is there a difference in the traffic used to download the snapshot from the database if I access each node with snapshot.forEach(function(childSnapshot)) versus if I don't access the nodes?
If there is no difference, is there a way to access only the keys in Chats without getting the snapshot data for what each key contains? I'm assuming that this way it would generate less downloaded data.
var requests = db.ref("Chats");
requests.on('child_added', function(snapshot) {
    var communicationId = snapshot.key;
    console.log("Chat id = " + communicationId);
    getMessageInfo(
        communicationId,
        function() {
            snapshot.ref.remove();
        }
    );
});
When you call requests.on('child_added', ...), you are always going to access all of the data at the requests node. It doesn't matter what you do in the callback function: the entire node is loaded into memory, and the cost of the query is paid. What you do with the snapshot in memory doesn't cost anything extra.
If you don't want all of the child nodes under requests, you should find some way to filter the query for only the children you need.
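For example, a hedged sketch of a filtered child_added query; the status property and "pending" value are illustrative, not part of the original schema:
// Only children of Chats whose "status" child equals "pending" are downloaded
db.ref("Chats")
    .orderByChild("status")
    .equalTo("pending")
    .on('child_added', function(snapshot) {
        console.log("Pending chat id = " + snapshot.key);
    });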
As mentioned in the documentation, either of these methods can be used:
(1) Call a method to get the data.
(2) Set a listener to receive data-change events.
Traffic depends on your usage. When your data does not need to be updated in realtime, you can just call a method to get the data (1). But if you want your data to be updated in realtime, then you should go for (2). When you set a listener, Firebase sends your listener an initial snapshot of the data, and then another snapshot each time the data changes.
(1) - Example
firebase.database().ref('/users/').once('value') // Single Call
(2) - Example
firebase.database().ref('/users/').on('child_added') // Called on every update
Also, I don't think you can get only the keys, because when you reference a node and retrieve its data, Firebase sends it as key-value pairs (a DataSnapshot).
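If you do load a snapshot, you can at least enumerate just the keys on the client, although the child data has already been downloaded by that point; a small sketch:
firebase.database().ref('Chats').once('value').then(function(snapshot) {
    snapshot.forEach(function(childSnapshot) {
        console.log("Chat key: " + childSnapshot.key); // key only, value is ignored
    });
});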
Further Reference: https://firebase.google.com/docs/reference/js/firebase.database.DataSnapshot
https://firebase.google.com/docs/database/web/read-and-write

Firebase concurrency issue: how to prevent 2 users from getting the same Game Key?

DATABASE:
SITUATION:
My website sells keys for a game.
A key is a randomly generated string of 20 characters whose uniqueness is guaranteed (not created by me).
When someone buys a key, NTWKeysLeft is read to find its first element. That element is then copied, deleted from NTWKeysLeft, and pasted to NTWUsedKeys.
Said key is then displayed on the buyer's screen.
PROBLEM:
How can I prevent the following problem :
1) 2 users buy the game at the exact same time.
2) They both get the same key read from NTWKeysLeft (first element in list)
3) And thus both get the same key
I know about Firebase Transactions already. I am looking for a pseudo-code/code answer that will point me in the right direction.
CURRENT CODE:
Would something like this work ? Can I put a transaction inside another transaction ?
var keyRef = admin.database().ref("NTWKeysLeft");
keyRef.limitToFirst(1).transaction(function (keySnapshot) {
    keySnapshot.forEach(function(childKeySnapshot) {
        // Key is read here:
        var key = childKeySnapshot.val();
        // How can I prevent two concurrent read requests from reading the same key? Using a transaction to change a boolean could only happen after the read happens, since I first need to read in order to know which key boolean to change.
        var selectedKeyRef = admin.database().ref("NTWKeysLeft/" + key);
        var usedKeyRef = admin.database().ref("NTWUsedKeys/" + key);
        var keysLeftRef = admin.database().ref("keysLeft");
        selectedKeyRef.remove();
        usedKeyRef.set(true);
        keysLeftRef.transaction(function (keysLeft) {
            if (!keysLeft) {
                keysLeft = 0;
            }
            keysLeft = keysLeft - 1;
            return keysLeft;
        });
        res.render("bought", {key: key});
    });
});
Just to be clear: keyRef.limitToFirst(1).transaction(function (keySnapshot) { does not work, but I would like to accomplish something to that effect.
Most depends on how you generate the keys, since that determines how likely collisions are. I recommend reading about Firebase's push IDs to get an idea how unique those are and compare that to your keys. If you can't statistically guarantee uniqueness of your keys or if statistical uniqueness isn't good enough, you'll have to use transactions to prevent conflicting updates.
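To illustrate the transaction route, here is a hedged sketch (not the OP's code; it assumes keys are stored as NTWKeysLeft/<key>: true and that res comes from the surrounding request handler). Running the transaction on the whole NTWKeysLeft node means that if two buyers race, the second transaction is retried against the updated data and claims a different key:
var keysLeftRef = admin.database().ref("NTWKeysLeft");
var claimedKey = null;

keysLeftRef.transaction(function (keys) {
    if (keys === null) {
        // The local value can be null on the first run; committing null is a no-op
        // and the transaction retries with the server value if one exists.
        return keys;
    }
    claimedKey = Object.keys(keys)[0];
    delete keys[claimedKey]; // removing it inside the transaction is what makes the claim atomic
    return keys;
}, function (error, committed) {
    if (error || !committed || claimedKey === null) {
        // transaction failed, or there were no keys left
        return;
    }
    admin.database().ref("NTWUsedKeys/" + claimedKey).set(true);
    res.render("bought", { key: claimedKey });
});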
The OP has changed the question a bit, so I will update the answer as follows: I will leave the bottom part about transactions as it was and put the new update on top.
I can see two ways to proceed:
1) Handle the lock system on your own and use JavaScript callbacks or other mechanisms to prevent simultaneous access to a portion of the code.
or
2) Use Firebase transactions. In this case, I don't have the setup ready to share code beyond the sample/pseudo code provided at the bottom of this page.
With respect to option 1 above:
I have coded a use case and put it on Plunker. It uses JavaScript callbacks to queue users as they try to access the part of the code under lock.
I. A user comes in and is placed in the queue.
II. The callback function is then called, which pops users on a first-come, first-out basis. I keep the keys at the top of the page, shared by the functions.
I have wired a button click event to this, and when you click the button twice quickly, you will see that keys are assigned and that they are different keys.
To read this code, click on the script.js file on the left and read starting from the bottom of the page, where the functions are called.
Here is the sample code on Plunker. After opening it, click Run at the top of the page and then click the button on the right-hand side. An alert will pop up showing which key is given (note that there are two back-to-back calls to simulate two users coming in at the same time).
https://plnkr.co/edit/GVFfvqQrlLeMaKlo5FCj?p=info
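The Plunker itself is not reproduced here, but a stripped-down sketch of the idea it demonstrates (an in-memory queue that hands out keys one at a time; all names are illustrative) looks roughly like this:
var keysLeft = ["AAAAA-AAAAA-AAAAA-AAAAA", "BBBBB-BBBBB-BBBBB-BBBBB"]; // shared key pool
var waiting = [];   // callbacks of buyers waiting for a key
var locked = false; // simple lock flag

function requestKey(onKey) {
    waiting.push(onKey);
    processQueue();
}

function processQueue() {
    if (locked || waiting.length === 0) {
        return;
    }
    locked = true;
    var callback = waiting.shift();
    var key = keysLeft.shift(); // each caller receives a different key
    callback(key);
    locked = false;
    processQueue(); // serve the next buyer, if any
}

// Two "simultaneous" buyers still receive different keys:
requestKey(function (key) { console.log("Buyer 1 got " + key); });
requestKey(function (key) { console.log("Buyer 2 got " + key); });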
The Firebase transactions:
Use Firebase transactions to prevent concurrent read/write issues. Below is the transaction() method signature:
transaction(transactionUpdate, onComplete, applyLocally) returns a firebase.Promise containing { committed: boolean, snapshot: nullable firebase.database.DataSnapshot }
Note that transaction() takes the write operation as its first parameter, and in your case it looks like you're removing a key upon success, hence the following function to be called in place of the write.
Try this pseudo code :
// first, get a reference to your db
var selectedKeyRef = admin.database().ref("NTWKeysLeft/" + key);

// needed by transaction() as its first parameter
function writeOperation() {
    selectedKeyRef.remove();
}

selectedKeyRef.transaction(writeOperation, function(error, committed, snapshot) {
    if (error) {
        console.log('Transaction failed abnormally!', error);
    } else if (!committed) {
        console.log('We aborted the transaction (because xyz).');
    } else {
        console.log('keyRemoved!');
    }
    console.log("showKey: ", snapshot.val());
}); // end of the transaction() method call
Docs + to see parameters/return objects of the transaction() method see:
https://firebase.google.com/docs/reference/js/firebase.database.Reference#transaction
In the Docs.... If another client writes to the location before your new value is successfully written, your update function is called again with the new current value, and the write is retried.
https://firebase.google.com/docs/database/web/read-and-write#save_data_as_transactions
I don't think the problem you're worried about can happen. JavaScript, including Node, is single-threaded and can only do one thing at a time. If you had a big server infrastructure with more than one server running this code, then it would be possible, but for a single Node program, there's no problem.
Since none of the previous answers discussing the scope of Transactions worked out, I would suggest a different workaround.
Is it possible to trigger the unique code generation when someone buys a code? If so, you could generate the unique string when the "buy" button is clicked, display the ID, and save the ID to your database.
Later the user enters the key in your game, which checks whether the ID is written in your database. This would probably also save a bit of data, since you do not need to keep track of the unique IDs before they get bought, and you will also not run out of IDs, since they are generated only when necessary.

Mirth channelMap in source JavaScript

In my source connector, I'm using javascript for my database work due to my requirements and parameters.
The end result is storing the data.
ifxResults = ifxConn.executeCachedQuery(ifxQuery); //var is declared
I need to use these results in the destination transformer.
I have tried channelMap.put("results", ifxResults);.
I get the following error ReferenceError: "channelMap" is not defined.
I have also tried to use return ifxResults but I'm not sure how to access this in the destination transformer.
Do you want to send each row as a separate message through your channel? If so, it sounds like you want to use the Database Reader in JavaScript mode. Just return that ResultSet (it's really a CachedRowSet if you use executeCachedQuery like that) and the channel will handle the rest, dispatching an XML representation of each row as a discrete message.
If you want to send all rows in the result set aggregated into a single message, that will be possible with the Database Reader very soon: MIRTH-2337
Mirth Connect 3.5 will be released next week so you can take advantage of it then. But if you can't wait or don't want to upgrade then you can still do this with a JavaScript Reader:
var processor = new org.apache.commons.dbutils.BasicRowProcessor();
var results = new com.mirth.connect.donkey.util.DonkeyElement('<results/>');
while (ifxResults.next()) {
    var result = results.addChildElement('result');
    for (var entries = processor.toMap(ifxResults).entrySet().iterator(); entries.hasNext();) {
        var entry = entries.next();
        result.addChildElement(entry.getKey(), java.lang.String.valueOf(entry.getValue()));
    }
}
return results.toXml();
I know this question is kind of old, but here's an answer just for the record.
For this answer, I'm assuming that you are using a Source connector type of JavaScript Reader, and that you're trying to use channelMap in the JavaScript Reader Settings editing pane.
The problem is that the channelMap variable isn't available in this part of the channel. It's only available in filters and transformers.
It's possible that what you want can be accomplished by using the globalChannelMap variable, e.g.
globalChannelMap.put("results", ifxResults);
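For example, a hedged sketch of how that might look (reusing ifxConn and ifxQuery from the question; the returned placeholder message is illustrative):
// In the JavaScript Reader:
var ifxResults = ifxConn.executeCachedQuery(ifxQuery);
globalChannelMap.put("results", ifxResults);
return "<msg/>"; // the reader still needs to dispatch a message for the channel to process

// Later, in a filter or transformer:
var storedResults = globalChannelMap.get("results");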
I usually need to do this when I'm processing one record at a time and need to pass some setting to the destination channel. If you do it like I've done in the past, then you would first create a globalChannelMap key/value in the source channel's transformer:
globalChannelMap.put("ProcID", "TestValue");
Then go to the Destinations tab and select your destination channel to make sure you're sending it to the destination (I've never tried this for channels with multiple destinations, so I'm not sure if anything different needs to be done).
[Screenshot: Destination tab of the source channel]
Notice that ProcID is now listed in the Destination Mappings box. Click the New button next to the Map Variable box and you'll see Variable 1 appear. Double click on that and put in your mapping key, which in this case is ProcID.
Now go to your destination channel's source transformer. There you would enter the following code:
var SentValue = sourceMap.get("ProcID");
Now SentValue in your destination transformer has whatever was in ProcID when your source channel relinquished control.

jQuery Deferred returns only last value in loop

So I'm trying to go through one Firebase database to find entries matching a criterion. Therefore I'm using jQuery's deferred object to handle the database calls.
Once I get a return value from this first database, I want to get the user info from a second database for each of those values in the first db. Then the results are added to a JSON array.
so its:
<search for value, find one>
<<<search other db for oher info>>>
<continue search for outer value>
But this only returns one value, although everything else runs fine (and the console logs all the info correctly).
Here's the code:
function find(searchLocation, profileID) {
    var requestUserData = {
        data: []
    };
    var def = $.Deferred();
    // This will be executed as long as there are elements in the database that match the criteria and that haven't been loaded yet (so it's a simple loop)
    Ref.orderByChild("location").equalTo(searchLocation).on("child_added", function(snapshot) {
        def.resolve(snapshot.val().ID);
    });
    return def.promise();
};
I hope you guys have any ideas on what to do or how I could solve this. Thanks in advance!
Edit: upon further testing I discovered that this problem already exists in the outer loop, so only the first value is being returned. I think this is related to the position of the resolve() method, but I didn't find a possibility to change this behaviour.
Firebase is a real-time database. The events stream as changes occur at the server. You're attempting to take this real-time model, force it into a CRUD strategy, and do a GET operation on the data. A better solution would be to simply update the values in real time as they are modified.
See AngularFire, ReactFire, or BackboneFire for an example of how you can do this with your favorite bindings framework.
To directly answer the question: if you want to retrieve a static snapshot of the data, you want to use a once() callback with a value event, not a real-time stream from child_added:
Ref.orderByChild("location").equalTo(searchLocation).once("value", function(snapshot) {
    def.resolve(snapshot.val());
});
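Since a deferred can only be resolved once, if several children may match it is usually better to collect them from the single value snapshot and resolve with the whole set; a hedged sketch along the lines of the original find():
function find(searchLocation) {
    var def = $.Deferred();
    Ref.orderByChild("location").equalTo(searchLocation).once("value", function(snapshot) {
        var ids = [];
        snapshot.forEach(function(childSnapshot) {
            ids.push(childSnapshot.val().ID);
        });
        def.resolve(ids); // resolve once, with every matching ID
    });
    return def.promise();
}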

Sending extra, non-model data in a save request with backbone.js?

I'm looking for a solution for dealing with an issue of state between models using backbone.js.
I have a time tracking app where a user can start/stops jobs and it will record the time the job was worked on. I have a job model which holds the job's data and whether it is currently 'on'.
Only one job can be worked on at a time, so if a user starts a job, the currently running job must be stopped. I'm wondering what the best solution is. I could simply toggle each job's 'on' parameter accordingly and then call save on each, but that results in two requests to the server, each with a complete representation of its job.
Ideally it would be great if I could piggyback additional data in the save request similarly to how it's possible to send extra data in a fetch request. I only need to send the id of the currently running job and since this really is unrelated to the model it needs to be sent alongside the model, not part of it.
Is there a good way to do this? I guess I could find a way to maintain a reference to the current job server side if need be :\
When you call the save function, the first parameter is an object containing the data that will be saved. Instead of just calling model.save(), create an object that has the model data plus your extra stuff.
Inside the method that fires off the save:
...
var data = this.model.toJSON();
data.extras = { myParam: someData };
this.model.save(data, {
    success: function(model, response) {
        console.log('hooray it saved: ', model, response);
    }
});
...
