I have probably misunderstood the documentation somehow, but I cannot figure this out.
What I want to do is create a new ClientContact and save it to an array of pointers called contacts in the Client table.
This is the relevant code:
var Client = Parse.Object.extend("Client");
var selectedClient = new Client();
// sets the objectId based on URL params
selectedClient.id = $routeSegment.$routeParams.id;
var ClientContact = Parse.Object.extend("ClientContact");
var contact = new ClientContact();
contact.set('name', 'test');
contact.set('desc', 'some description');
contact.set('phoneNumber', '123');
selectedClient.add('contacts', contact);
selectedClient.save().then(function() {
  console.log('saved');
}, function(error) {
  console.error(error);
});
As expected, the contact is automatically saved and added to contacts when the selectedClient is saved.
But if I run the same code again (in testing this means refreshing the page), a new ClientContact is saved, but it replaces the contacts array entirely.
That is, only the most recent ClientContact is associated with the Client; each new addition replaces the array, leaving only one pointer.
I hope there is an obvious and easy fix that I have simply failed to spot.
OK, it seems I have located the problem.
I was assuming that array operations on an object were independent of the local representation.
What I did above was to construct an object based on a known objectId:
var Client = Parse.Object.extend("Client");
var selectedClient = new Client();
// sets the objectId based on URL params
selectedClient.id = $routeSegment.$routeParams.id;
It turns out that array operations like add and remove are performed based on the local information. So the construction above leaves my array in the selectedClient object empty at each page refresh, which is why my array was being replaced with a new single-valued array.
In short, it seems one should fetch an object before performing array operations on it.
The reason for the construction above was to avoid having to re-query the Client object when navigating around an AngularJS-based web page. The solution was fortunately simple: loading the Client object once in a parent scope.
That moves outside the scope of the question; I just wanted to mention it as well.
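For completeness, here is a minimal sketch of the fetch-before-add approach, reusing the class names and route parameter from the question (the error handling is only illustrative):
var Client = Parse.Object.extend("Client");
var ClientContact = Parse.Object.extend("ClientContact");
// Build a stub from the known objectId, then fetch it so the local
// 'contacts' array reflects what is already stored on the server.
var selectedClient = new Client();
selectedClient.id = $routeSegment.$routeParams.id;
selectedClient.fetch().then(function(client) {
  var contact = new ClientContact();
  contact.set('name', 'test');
  contact.set('desc', 'some description');
  contact.set('phoneNumber', '123');
  client.add('contacts', contact); // now appends to the fetched array
  return client.save();
}).then(function() {
  console.log('saved');
}, function(error) {
  console.error(error);
});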
I have taken some key points from this site already when it comes to transferring data to and from local storage. However, it does not seem to work properly. In short, I am trying to load an array of objects to and from local storage. I am aware that it is saved as a string within local storage and have used JSON formatting to solve that problem; this is also where I think the problem might be when there are objects within objects. However, in its current state, it does not seem to work. Any ideas?
var students = [{name: "Petrina", age: "20"}];
function saveList(){
  localStorage.setItem('somekey', JSON.stringify(students));
  console.log("Saved to Local");
}
function loadList(){
  students = JSON.parse(localStorage.getItem('somekey'));
}
The code gives no errors. I am calling the functions when the window loads:
window.onload = () => { loadList() }
You've added that you are calling from onload, and in your code you are loading into students.
Add to the beginning of your code:
var students;
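For example, a sketch that keeps the declaration up front and adds a guard (my own addition) so that a missing 'somekey' does not wipe the array on first load:
var students = [{name: "Petrina", age: "20"}];
function loadList(){
  var stored = JSON.parse(localStorage.getItem('somekey'));
  if (stored) { // only overwrite when something was actually saved
    students = stored;
  }
}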
Note: some objects need special handling, for example Date:
var date = new Date();
localStorage.date = JSON.stringify(date);
date = new Date(JSON.parse(localStorage.date));
Answer before the additional information:
key is a method of localStorage (see Storage.key()), so don't use it; use students instead, for example.
(This is assuming that you call the functions.)
var students = [{name: "Petrina", age: "20"}];
function saveToLocalStorage(key, value){
  localStorage.setItem(key, JSON.stringify(value));
  console.log("Saved to LocalStorage");
}
function loadFromLocalStorage(key){
  console.log("Loaded from LocalStorage");
  return JSON.parse(localStorage.getItem(key));
}
console.log("students: " + students);
saveToLocalStorage("stu", students);
var st2 = loadFromLocalStorage("stu");
console.log("st2: " + st2);
This cannot be run in a snippet here: snippets are sandboxed and have no access to localStorage (cross-origin).
It got solved! I am unsure what the problem was. I cleared all the calls for all functions and debugged the save and load functions one at a time while watching the local storage data. The problem to begin with was that it simply did not save the data that got updated during runtime, so it just kept loading values that were always the same. @iAmOren's idea of creating a function that returns a value might have done it, although I am unsure why. Thanks for the responses and support!
Are you missing a call to your saveList function?
Calling saveList before calling loadList works as you expect.
Please refer to: https://codesandbox.io/s/strange-brown-3mmeq?file=/index.html
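A possible sketch of that ordering on window load; the first-visit check is my own addition:
window.onload = () => {
  if (localStorage.getItem('somekey') === null) {
    saveList(); // first visit: nothing stored yet, so seed localStorage
  }
  loadList();   // read the stored array back into students
  console.log(students);
};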
In my source connector, I'm using JavaScript for my database work due to my requirements and parameters.
The end result is storing the data:
ifxResults = ifxConn.executeCachedQuery(ifxQuery); //var is declared
I need to use these results in the destination transformer.
I have tried channelMap.put("results", ifxResults);
I get the following error: ReferenceError: "channelMap" is not defined.
I have also tried to use return ifxResults, but I'm not sure how to access this in the destination transformer.
Do you want to send each row as a separate message through your channel? If so, sounds like you want to use the Database Reader in JavaScript mode. Just return that ResultSet (it's really a CachedRowSet if you use executeCachedQuery like that) and the channel will handle the rest, dispatching an XML representation of each row as discrete messages.
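A minimal sketch of a Database Reader in JavaScript mode along those lines (the driver, URL, credentials and query are placeholders, not values from the question):
var dbConn = DatabaseConnectionFactory.createDatabaseConnection(
    'com.informix.jdbc.IfxDriver',
    'jdbc:informix-sqli://host:1526/mydb:INFORMIXSERVER=myserver',
    'user', 'password');
try {
    // executeCachedQuery returns a CachedRowSet; returning it lets the
    // channel dispatch one XML message per row.
    return dbConn.executeCachedQuery('SELECT * FROM some_table');
} finally {
    dbConn.close();
}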
If you want to send all rows in the result set aggregated into a single message, that will be possible with the Database Reader very soon: MIRTH-2337
Mirth Connect 3.5 will be released next week so you can take advantage of it then. But if you can't wait or don't want to upgrade then you can still do this with a JavaScript Reader:
// Convert each row of the CachedRowSet into a <result> element
var processor = new org.apache.commons.dbutils.BasicRowProcessor();
var results = new com.mirth.connect.donkey.util.DonkeyElement('<results/>');
while (ifxResults.next()) {
    var result = results.addChildElement('result');
    // Map column names to values for the current row
    for (var entries = processor.toMap(ifxResults).entrySet().iterator(); entries.hasNext();) {
        var entry = entries.next();
        result.addChildElement(entry.getKey(), java.lang.String.valueOf(entry.getValue()));
    }
}
// Return the aggregated XML as the single message for this poll
return results.toXml();
I know this question is kind of old, but here's an answer just for the record.
For this answer, I'm assuming that you are using a Source connector type of JavaScript Reader, and that you're trying to use channelMap in the JavaScript Reader Settings editing pane.
The problem is that the channelMap variable isn't available in this part of the channel. It's only available in filters and transformers.
It's possible that what you want can be accomplished by using the globalChannelMap variable, e.g.
globalChannelMap.put("results", ifxResults);
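and then read it back in a destination filter or transformer, along these lines:
var ifxResults = globalChannelMap.get("results");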
I usually need to do this when I'm processing one record at a time and need to pass some setting to the destination channel. If you do it like I've done in the past, then you would first create a globalChannelMap key/value in the source channel's transformer:
globalChannelMap.put("ProcID", "TestValue");
Then go to the Destinations tab and select your destination channel to make sure you're sending it to the destination (I've never tried this for channels with multiple destinations, so I'm not sure if anything different needs to be done).
[Screenshot: Destinations tab of the source channel]
Notice that ProcID is now listed in the Destination Mappings box. Click the New button next to the Map Variable box and you'll see Variable 1 appear. Double click on that and put in your mapping key, which in this case is ProcID.
Now go to your destination channel's source transformer. There you would enter the following code:
var SentValue = sourceMap.get("ProcID");
Now SentValue in your destination transformer has whatever was in ProcID when your source channel relinquished control.
I have a Like button on a profile page. On click of the Like button, I want to maintain an array of likes and store it in my db.
In my profile controller I have:
$scope.likeProfile = UserService.likeProfile(loggedInUser,$state.params.userId);
In my UserService I have:
function likeProfile(likedBy, id){
  var likeArray = [];
  likeArray.push(likedBy);
  ref.child("users").child(id).update({ likedBy: likeArray });
}
I just want to understand how I could avoid initializing likeArray every time the likeProfile method is called, so that all likes are pushed into the array.
I would do it like this, though I'm not sure if this is what you want to achieve. Initialization should not be done within the function if you want to keep the result. This keeps the result within the scope of one user.
$scope.likeArray = [];
function likeProfile (likedBy, id) {
  $scope.likeArray.push(likedBy);
  ref.child("users").child(id).update({ likedBy: $scope.likeArray });
}
Otherwise, if you need overall likes, you have to initialize the array with the previous values within the function, something like:
likeArray = getLikesFromDB(); // however you access your db
likeArray.push(likedBy);
ref.child("users").child(id).update({ likedBy: likeArray });
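For example, with the legacy Firebase JS SDK used in the question, that could look like this sketch (the duplicate check is my own assumption):
function likeProfile(likedBy, id) {
  var userRef = ref.child("users").child(id);
  userRef.child("likedBy").once("value", function(snapshot) {
    var likeArray = snapshot.val() || [];    // start from what is already stored
    if (likeArray.indexOf(likedBy) === -1) { // avoid counting the same user twice
      likeArray.push(likedBy);
      userRef.update({ likedBy: likeArray });
    }
  });
}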
Try to debug with the browser dev tools (e.g. in Chrome: More Tools > Developer Tools) and see what you send in the Network tab. If your array only contains one value, the whole array will be overwritten, unless you write a custom update function that adds a value rather than replacing it.
I have a node in Firebase getting continually updated with information from a logfile. The node is lines/, and each child of lines/ comes from a post(), so it has a unique ID.
When a client first loads, I want to be able to grab the last X number of entries. I expect I'll do this with once(). From then on, however, I want to use on() with child_added so that I get all new data. However, child_added gets all the data stored in Firebase, and after the initial setup I only want the new stuff.
I see that I can add a limitToLast() on the on(), but, if I say limitToLast(1) and a flood of entries come in, will my app still get all the new entries? Is there some other way to do this?
You need to include a timestamp property and run a query.
// Get the current timestamp
var now = new Date().getTime();
// Create a query that orders by the timestamp
var query = ref.orderByChild('timestamp').startAt(now);
// Listen for the new children added from that point in time
query.on('child_added', function (snap) {
  console.log(snap.val());
});
// When you add this new item it will fire off the query above
ref.push({
  title: "hello",
  timestamp: Firebase.ServerValue.TIMESTAMP
});
The Firebase SDK has methods for ordering, orderByChild() and methods for creating a range startAt(). When you combine the two you can limit what comes back from Firebase.
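A sketch of how that could cover both halves of the question, i.e. loading the last 10 entries once and then listening only for newer ones (the lines node and timestamp field follow the question and the code above; note it relies on the client clock, the caveat raised in the next answer):
var linesRef = ref.child('lines');
// Initial load: the most recent 10 entries, once
linesRef.orderByChild('timestamp').limitToLast(10)
  .once('value', function (snap) {
    snap.forEach(function (child) {
      console.log('history:', child.val());
    });
  });
// From now on: only entries added after this point in time
linesRef.orderByChild('timestamp').startAt(new Date().getTime())
  .on('child_added', function (snap) {
    console.log('new:', snap.val());
  });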
I think there is a problem with @David East's solution. He is using the local timestamp, which may cause problems if the time is not accurate on the client device. Here is my suggested solution (iOS Swift):
1. Use observeSingleEvent to get the complete data set.
2. Return it in reversed order with reversed().
3. Get the last timestamp, for example data[0].timestamp.
4. Use queryStarting(atValue:) with that timestamp:
self._dbref.queryOrdered(byChild: "timestamp").queryStarting(atValue: timestamp + 1)
    .observe(.childAdded, with: { snapshot in
        print(snapshot.value)
    })
You have the right idea. child_added should be called only for the new nodes. Without source code it's hard to tell why you get all the data in your child_added event.
You can check the chat demo app to see how they load new chat messages. The use case sounds similar.
https://github.com/firebase/firechat/blob/master/src/js/firechat.js#L347
Here's a temporary but quick solution:
// define a boolean
var bool = false;
// fetch the last child node from the firebase database
ref.limitToLast(1).on("child_added", function(snap) {
  if (bool) {
    // only newly added child nodes reach this branch
    doSomething(snap.val());
  } else {
    // skip the existing child node and set the flag so that
    // subsequent (newly added) child nodes are processed
    bool = true;
  }
});
Disadvantage: it will still load the existing child nodes that match the query.
Advantage: it will not process existing child nodes, only the newly added ones.
limitToLast(1) will do the work.
I have a Player class. Players can have x number of Trophies. I have the Player objectId and need to get a list of all of their Trophies.
In the Parse.com data browser, the Player object has a column labeled:
trophies Relation<Trophy>
(view Relations)
This seems like it should be so simple but I'm having issues with it.
I have the ParseObject 'player' in memory:
var query = new Parse.Query("Trophy");
query.equalTo("trophies", player);
query.find({
// throws an error: find field has invalid type array.
I've also tried relational Queries:
var relation = new Parse.Relation(player, "trophies");
relation.query().find({
// also throws an error: something about a Substring being required.
This has to be a completely common task, but I can't figure out the proper way to do this.
Anyone know how to do this in JavaScript Cloud Code?
Many thanks!
EDIT--
I can do relational queries on the user fine:
var user = Parse.User.current();
var relation = user.relation("trophies");
relation.query().find({
I don't understand why this very same bit of code breaks if I'm using a non-user object.
I finally sorted this out, though there is a caveat that makes this work differently than the documentation would indicate.
// Assuming we have 'player', an object of class 'Player'.
var r = player.relation("trophies");
r.query().find({
  success: function(trophies){
    response.success(trophies); // list of trophies pointed to by that player's "trophies" column
  },
  error: function(error){
    response.error(error);
  }
});
The caveat: you must have a 'full' player object in memory for this to work. You can't save a player object, grab the object from the save's success callback, and have this work. For some reason, the object returned in the success handler appears to be an incomplete Parse.Object, and is missing some of the methods required to do this.
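A minimal sketch of working around that caveat by fetching the full Player by objectId first (Cloud Code style; playerId is a placeholder):
var Player = Parse.Object.extend("Player");
var playerQuery = new Parse.Query(Player);
playerQuery.get(playerId, {
  success: function(player) {
    // 'player' is a full object here, so relation() works as shown above
    player.relation("trophies").query().find({
      success: function(trophies) { response.success(trophies); },
      error: function(error) { response.error(error); }
    });
  },
  error: function(error) {
    response.error(error);
  }
});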
Another stumbling block with the Parse.com JavaScript SDK: a query that finds nothing is still considered successful. So every time you query for something, you must check that the length of the response is greater than 0, because a successful query could have returned nothing.
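For example (a generic sketch, not tied to the Trophy query above):
query.find().then(function(results) {
  if (results.length > 0) {
    // at least one match: safe to work with results[0], etc.
  } else {
    // no matches, but the promise still resolved successfully
  }
});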
This is what worked for me:
var Comment = Parse.Object.extend("Comment");
var commentsQuery = new Parse.Query(Comment);
commentsQuery.equalTo("parent", video);//for the given 'video' instance, find its children (comments)
commentsQuery.descending("createdAt");
return commentsQuery.find();
Note: of course, video is an instance of the Video class. And returning find() means I'll have to handle the 'promise' in whatever function calls this one.
And here is another function coming from the other angle:
getRecentCommentsOfAllVideos: function(limit) {
  var Comment = Parse.Object.extend("Comment");
  var commentsQuery = new Parse.Query(Comment);
  commentsQuery.descending("createdAt");
  commentsQuery.limit(limit);
  commentsQuery.include("parent"); // this enables the details of the comment's associated video to be available in the result
  return commentsQuery.find();
}
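A hypothetical usage of that function, assuming it lives on some service object; the text field name is only illustrative:
videoService.getRecentCommentsOfAllVideos(20).then(function(comments) {
  comments.forEach(function(comment) {
    var video = comment.get("parent"); // populated thanks to include("parent")
    console.log(comment.get("text"), video ? video.id : "(no parent)");
  });
}, function(error) {
  console.error(error);
});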
(Be sure to read https://parse.com/docs/js_guide and search for the line that says "Include the post data with each comment".)
I also looked at these materials; maybe they'll help you (though they weren't much help for me):
https://parse.com/questions/how-to-retrieve-parent-objects-with-their-children-in-one-call-javascript-api
http://blog.parse.com/2011/12/06/queries-for-relational-data/