I've imported a table containing a self-referencing parent/child relationship into a class on Parse.com. Parse assigns each row a new ID, which I would now like to use in place of the original UID, but I cannot seem to update all of the corresponding parent ID (PID) fields.
I've almost managed to get it going with some cloud code:
Parse.Cloud.define("updatePID", function(request, response) {
var _results={};
var query = new Parse.Query("ww_notes");
query
.limit(1000)
.find({
success: function(results){
for (var i=0; i<results.length; i++){
_results[results[i].get("uid")] = results[i].id;
};
for (var i=0; i<results.length; i++){
if (_results[results[i].get("pid")]){
results[i].set("test", _results[results[i].get("pid")]);
results[i].save();
};
};
response.success(results);
},
error: function(){
response.error("failed PID update");
}
});
})
This returns the corrected recordset but doesn't save it in the database. Note: test is just a test field to get this function working before switching to updating the PID. I've tried to simplify the problem by moving a static write into the first loop:
for (var i = 0; i < results.length; i++) {
    _results[results[i].get("uid")] = results[i].id;
    results[i].set("test", "things");
    results[i].save();
}
This only updated 4 of 245 records: the record set sent back was correctly updated, but the Parse data store was not. I also created an equivalent background job, which updated 27 of the 245 records and reported success.
Stephen
The problem you're having is almost certainly related to how you handle the async saves.
The reason only a few saves get done is that the code fires them off asynchronously in a tight loop and then returns. A few saves complete while the loop is still finishing, but no more once you call response.success (or response.error).
The fix is to batch the saves and use promises, like this...
// ...
query.limit(1000);
query.find().then(function(results) {
    for (var i = 0; i < results.length; i++) {
        _results[results[i].get("uid")] = results[i].id;
    }
    // we'll save everything in this array
    var toSave = [];
    for (var i = 0; i < results.length; i++) {
        if (_results[results[i].get("pid")]) {
            results[i].set("test", _results[results[i].get("pid")]);
            toSave.push(results[i]);
        }
    }
    // return a promise that's complete when everything is saved
    return Parse.Object.saveAll(toSave);
}).then(function(results) {
    response.success(results);
}, function(error) {
    response.error(error);
});
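For reference, here is a minimal sketch of how the finished function might be called from a client with the Parse JavaScript SDK, assuming it keeps the updatePID name from the question (Parse.Cloud.run returns a promise when no callbacks are passed):

// hypothetical caller, sketch only
Parse.Cloud.run("updatePID", {}).then(function(results) {
    console.log("Updated " + results.length + " rows");
}, function(error) {
    console.error("failed PID update: " + error.message);
});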
I'm trying to get tweets from Twitter, and then get the hashtags from those tweets and get images from Flickr.
I want each tweet and its images to be paired together. However, see the console logs at the end: I expect the first one to output the current tweet and the second to output the images retrieved for that tweet.
What happens instead is that console.log(tweets[i]) always prints out the last tweet in the list, while console.log(results) prints the current results (i.e. every Flickr result is printed).
By the way, the tweets and flicks are being retrieved from a JSON file for now.
tweets$.subscribe((tweets) => {
    for (var i in tweets) {
        var hashtags = tweets[i].entities.hashtags;
        for (var j in hashtags) {
            var flicks$ = this.flickrService.getImagesMock(hashtags[j]);
            flicks$.subscribe((results) => {
                console.log(tweets[i]);
                console.log(results);
            });
        }
    }
});
So my question is, how do I get the tweets[i] in the flicks$.subscribe callback to refer to the i that was in use when the subscription was created?
I guess it's a classical problem with scope in async JS.
for (var i in tweets) {
    // an arrow IIFE keeps the surrounding `this`, so this.flickrService still works
    ((index) => {
        var hashtags = tweets[index].entities.hashtags;
        for (var j in hashtags) {
            var flicks$ = this.flickrService.getImagesMock(hashtags[j]);
            flicks$.subscribe((results) => {
                console.log(tweets[index]);
                console.log(results);
            });
        }
    })(i);
}
Basically, in your example the nested subscribe callback runs after the outer loop has already finished, so by that time i holds its final value.
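Since the snippet already uses arrow functions, a simpler alternative is to declare the loop variables with let, which gives every iteration its own binding and makes the wrapper function unnecessary. This is only a sketch, assuming your environment supports ES6 block scoping (it appears to, given the arrow functions):

tweets$.subscribe((tweets) => {
    for (let i in tweets) {
        const hashtags = tweets[i].entities.hashtags;
        for (let j in hashtags) {
            const flicks$ = this.flickrService.getImagesMock(hashtags[j]);
            flicks$.subscribe((results) => {
                // i and j are block-scoped, so each subscription sees the values
                // from the iteration that created it
                console.log(tweets[i]);
                console.log(results);
            });
        }
    }
});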
I need to scan the trip array, calculate the travel time between the current trip and each other trip in the array, and select the shortest one. For the calculation I need to send a Google Maps API call.
I am very confused about the asynchronous callback function.
Can anyone help me with how to send an API call within a for loop, check the results, and continue?
Thank you.
The trips are in my array list:
Array:
array = [trip1, trip2, trip3, ...];
JS:
function assigntrips(array) {
    var triplist = [];
    for (var i = 0; i < array.length; i++) {
        var fstnode = array[i];
        for (var j = i + 1; j < array.length; j++) {
            // here I want to get the response from the Google API and decide if I want to choose the trip;
            // if not, the for loop continues and sends another API call.
        }
    }
}
var http = require('http');

function apicall(inputi, cb) {
    var destination_lat = 40.689648;
    var destination_long = -73.981440;
    var origin_lat = array[inputi].des_lat;
    var origin_long = array[inputi].des_long;
    var departure_time = 'now';
    var options = {
        host: 'maps.googleapis.com',
        path: '/maps/api/distancematrix/json?origins=' + origin_lat + ',' + origin_long +
              '&destinations=' + destination_lat + ',' + destination_long +
              '&mode=TRANSIT&departure_time=1399399424&language=en-US&sensor=false'
    };
    http.get(options).on('response', function(response) {
        var data = '';
        response.on('data', function(chunk) {
            data += chunk;
        });
        response.on('end', function() {
            var json = JSON.parse(data);
            console.log(json);
            var ttltimereturnoffice = json.rows[0].elements[0].duration.text;
            //var node = new Node(array[i], null, triptime, 0, ttltimereturnoffice, false);
            //tripbylvtime.push(node);
            cb(ttltimereturnoffice + '\t' + inputi);
        });
    });
}
You cannot check the results in the loop. The loop is in the past, the callbacks happen in the future - you can't change that. There are only two things you can do, and one is an abstraction of the other:
1) You can create your callback in such a manner that it will collect the results and compare them when all are present.
2) You can use promises to do the same thing.
The #1 approach would look something like this (while modifying the cb call in your code appropriately):
var results = [];

function cb(index, ttltimereturnoffice) {
    results.push([index, ttltimereturnoffice]);
    if (results.length == array.length) {
        // we have all the results; find the best one, display, do whatever
    }
}
I don't quite know what library you are using, or whether it supports promises, but if http.get returned a promise you could do #2 by collecting the promises into an array and then using the promise library's all or when (or similar) to attach a callback that runs once all of the gets are done. A rough sketch follows.
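Here is one way #2 could look using plain promises rather than any particular library, assuming a global Promise implementation is available and reusing the apicall function from your question (one call per trip to keep it short; the same pattern extends to the pairwise calls in your nested loop):

// wrap the callback-style apicall in a promise
function apicallAsync(inputi) {
    return new Promise(function(resolve) {
        apicall(inputi, function(result) {
            resolve(result); // result is the "duration \t index" string from apicall
        });
    });
}

function assigntrips(array) {
    var requests = [];
    for (var i = 0; i < array.length; i++) {
        requests.push(apicallAsync(i));
    }
    // runs once every request has come back
    Promise.all(requests).then(function(results) {
        // results holds one travel time per trip; compare them here and pick the shortest
        console.log(results);
    });
}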
So I have a messaging app using Parse.com as my backend. When I send a message from the app, it is saved on Parse.com to a class called "NewMessages". Then in my cloud code I have an afterSave function dedicated to this class, so that when a new object gets saved to "NewMessages" it picks a random user, attaches it to the message, and saves it in a new class called "Inbox". Then it deletes the original message from "NewMessages".
So the "NewMessages" class should always be empty, right? But when I send a bunch of messages very quickly, some get skipped over. How do I fix this?
Is there a better way to structure this than using afterSave?
function varReset(leanBody, leanSenderName, leanSenderId, randUsers) {
    leanBody = "";
    leanSenderName = "";
    leanSenderId = "";
    randUsers = [];
    console.log("The variables were set");
}
Parse.Cloud.afterSave("Lean", function(leanBody, leanSenderName, leanSenderId, randUsers, request) {
varReset(leanBody, leanSenderName, leanSenderId, randUsers);
var query = new Parse.Query("NewMessages");
query.first({
success: function(results){
leanBody = (results.get("MessageBody"));
leanSenderName = (results.get("senderName"));
leanSenderId = (results.get("senderId"));
getUsers(leanBody, leanSenderName, leanSenderId);
results.destroy({
success: function(results){
console.log("deleted");
}, error: function(results, error){
}
});
}, error: function(error){
}
});
});
function getUsers(leanBody, leanSenderName, leanSenderId, response) {
    var query = new Parse.Query(Parse.User);
    query.find({
        success: function(results) {
            var users = [];
            console.log(leanBody);
            console.log(leanSenderName);
            // extract out user names from results
            for (var i = 0; i < results.length; ++i) {
                users.push(results[i].id);
            }
            for (var i = 0; i < 3; ++i) {
                var rand = users[Math.floor(Math.random() * users.length)];
                var index = users.indexOf(rand);
                users.splice(index, 1);
                randUsers.push(rand);
            }
            console.log("The random users are " + randUsers);
            sendMessage(leanBody, leanSenderName, leanSenderId, randUsers);
        }, error: function(error) {
            response.error("Error");
        }
    });
}
function sendMessage(leanBody, leanSenderName, leanSenderId, randUsers) {
    var Inbox = Parse.Object.extend("Inbox");
    for (var i = 0; i < 3; ++i) {
        var inbox = new Inbox();
        inbox.set("messageBody", leanBody);
        inbox.set("senderName", leanSenderName);
        inbox.set("senderId", leanSenderId);
        inbox.set("recipientId", randUsers[i]);
        console.log("leanBody = " + leanBody);
        console.log("leanSenderName = " + leanSenderName);
        console.log("leanSenderId = " + leanSenderId);
        console.log("recipient = " + randUsers[i]);
        inbox.save(null, {
            success: function(inbox) {
                // Execute any logic that should take place after the object is saved.
                alert('New object created with objectId: ' + inbox.id);
            },
            error: function(inbox, error) {
                // Execute any logic that should take place if the save fails.
                // error is a Parse.Error with an error code and message.
                alert('Failed to create new object, with error code: ' + error.message);
            }
        });
    }
}
Have you checked your logs? You may be falling afoul of resource limits (https://parse.com/docs/cloud_code_guide#functions-resource). If immediacy is not important, it may be worth setting up a background job that runs every few minutes and tackles undelivered messages. It may also be possible to combine the two approaches: have the afterSave function attempt an immediate delivery to Inboxes, while the background job picks up any leftover NewMessages on a regular basis. Not the prettiest solution, but at least you get a bit more reliability. (You'll have to think about race conditions, though, where the two may attempt deliveries on the same NewMessage.)
Regarding your question about a better structure: if the two classes are identical (or close enough), is it possible to just have a single Messages class? Initially the "to" field would be null, and a random recipient would be assigned in a beforeSave function. This may be faster and neater; a rough sketch is below.
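To illustrate that last idea, here is a rough sketch only, assuming a hypothetical single Messages class and the same callback-style cloud code API your question uses:

Parse.Cloud.beforeSave("Messages", function(request, response) {
    var message = request.object;
    // only pick a recipient the first time the message is saved
    if (!message.get("recipientId")) {
        var query = new Parse.Query(Parse.User);
        query.find({
            success: function(users) {
                var rand = users[Math.floor(Math.random() * users.length)];
                message.set("recipientId", rand.id);
                response.success();
            },
            error: function(error) {
                response.error("could not pick a recipient: " + error.message);
            }
        });
    } else {
        response.success();
    }
});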
EDIT: Adding a 3rd observation which was originally a comment:
I saw that you are using Query.first() in afterSave to find the NewMessage to take care of. Potentially, a new NewMessage could have snuck in between the time afterSave was called and the time the query was run. Why not take the ID of the just-saved NewMessage and use that in the query, instead of first()?
query.get(request.object.id,...);
This ensures that the code in afterSave handles the NewMessage that it was invoked for, not the one that was most recently saved.
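In code, the relevant part of the afterSave would look roughly like this (a sketch only, assuming the hook is registered on the NewMessages class as your question describes, and reusing your getUsers helper):

Parse.Cloud.afterSave("NewMessages", function(request) {
    var query = new Parse.Query("NewMessages");
    // fetch exactly the object that triggered this hook, not just the first one found
    query.get(request.object.id, {
        success: function(message) {
            getUsers(message.get("MessageBody"), message.get("senderName"), message.get("senderId"));
            message.destroy({
                success: function() { console.log("deleted"); },
                error: function(obj, error) { console.error(error); }
            });
        },
        error: function(obj, error) {
            console.error(error);
        }
    });
});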
I'm building a JavaScript application using object-oriented techniques and I'm running into a problem that I hope someone here can help me resolve.
The following method is designed to return an array populated with rows of data from a web SQL database:
retrieveAllStoreSearches : function() {
    this.db.transaction(
        function(transaction) {
            transaction.executeSql(
                "SELECT name,store,address FROM searchResults ORDER BY name ASC",
                [],
                function(transaction, results) {
                    var returnArr = [];
                    for (var i = 0; i < results.rows.length; i++) {
                        var row = results.rows.item(i);
                        returnArr.push(row.name + ' | ' + row.address);
                    }
                    console.log('Length of returnArr: ' + returnArr.length);
                    console.log(returnArr);
                    return returnArr;
                },
                this.errorHandler
            );
        }
    );
}
This works exactly as expected when logging the results to the console. But when I try to call the method in the following snippet (located in a different script, which initialises all objects and is responsible for building the application's DOM structure and functionality),
console.log(db.retrieveAllStoreSearches());
undefined is returned.
I can't figure out what I am doing wrong; whenever I have previously used return in a method to expose an object from one class to another script, I have never encountered any problems.
Could anyone provide any pointers on what I might be doing wrong?
This cannot be done. If your function calls an asynchronous function, the only way to return results is through a callback. That's the whole point of asynchronous functions: the rest of the code can keep going before the call has finished. It's a different way of thinking about returning values (one that doesn't block the rest of your code).
So you'd have to change your code to the following (plus proper error handling):
retrieveAllStoreSearches : function(callback) {
    this.db.transaction(
        function(transaction) {
            transaction.executeSql(
                "SELECT name,store,address FROM searchResults ORDER BY name ASC",
                [],
                function(transaction, results) {
                    var returnArr = [];
                    for (var i = 0; i < results.rows.length; i++) {
                        var row = results.rows.item(i);
                        returnArr.push(row.name + ' | ' + row.address);
                    }
                    callback(returnArr);
                },
                this.errorHandler
            );
        }
    );
}
Then you can use console.log like this:
db.retrieveAllStoreSearches(function(records) {
    console.log(records);
});
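If you would rather work with promises than callbacks, the same method can be wrapped in a Promise. This is a sketch only, assuming a browser with native Promise support and the same db member as your original object:

retrieveAllStoreSearches : function() {
    var self = this;
    return new Promise(function(resolve, reject) {
        self.db.transaction(function(transaction) {
            transaction.executeSql(
                "SELECT name,store,address FROM searchResults ORDER BY name ASC",
                [],
                function(transaction, results) {
                    var returnArr = [];
                    for (var i = 0; i < results.rows.length; i++) {
                        var row = results.rows.item(i);
                        returnArr.push(row.name + ' | ' + row.address);
                    }
                    resolve(returnArr);
                },
                function(transaction, error) {
                    reject(error); // surfaces SQL errors to the caller
                }
            );
        });
    });
}

// usage
db.retrieveAllStoreSearches().then(function(records) {
    console.log(records);
});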
Part of a website I am working on is a video page. I am pulling the videos from a YouTube account by accessing the YouTube Data API. Grabbing the videos in no particular order and not sorted works fine, but when I try to sort them into categories, I start running into trouble. Let's say there are three categories, Fruit, Vegetable, Pets. Instead of grabbing all the videos at once, I want to grab all the videos tagged with Fruit, append them to a <ul id="Fruit">. Then request all videos tagged with Vegetable, etc.
When starting out, I had the browser show an alert when it had finished the request and then append to the appropriate list. After I took out the alert it still worked, but not the way I expected. Either the loop is advancing too quickly or not advancing at all, but I can't seem to spot the mistake. What ends up happening is that all the videos get put into one list, <ul id="Vegetable">.
Please note: I am using a plugin called jGFeed which wraps the jQuery getJSON function, so I believe you can treat it as such.
var videoCategories = ['Fruit', 'Vegetable', 'Pets'];
for (var i = 0; i < videoCategories.length; i++) {
    var thisCategory = videoCategories[i];
    $.jGFeed('http://gdata.youtube.com/feeds/api/users/username/uploads?category=' + thisCategory,
        // Do something with the returned data
        function(feeds) {
            // Check for errors
            if (!feeds) {
                return false;
            } else {
                for (var j = 0; j < feeds.entries.length; j++) {
                    var entry = feeds.entries[j];
                    var videoUrl = entry.link;
                    $('ul#' + thisCategory).append('<li>' + entry.title + '</li>');
                }
            }
        });
}
The problem is that you're using the thisCategory variable to pick the list to append to, and the value of this variable changes while you're waiting for a response from the server.
You could try to put the whole script inside a function:
var videoCategories = ['Fruit', 'Vegetable', 'Pets'];
for (var i = 0; i < videoCategories.length; i++) {
    getCategory(videoCategories[i]);
}

function getCategory(thisCategory)
{
    $.jGFeed('http://gdata.youtube.com/feeds/api/users/username/uploads?category=' + thisCategory,
        // Do something with the returned data
        function(feeds) {
            // Check for errors
            if (!feeds) {
                return false;
            } else {
                for (var j = 0; j < feeds.entries.length; j++) {
                    var entry = feeds.entries[j];
                    var videoUrl = entry.link;
                    $('ul#' + thisCategory).append('<li>' + entry.title + '</li>');
                }
            }
        });
}
I haven't tested this, so I'm not sure if it works.
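Another option, if you prefer not to define a separate function, is jQuery's $.each, which passes the index and value into a callback and therefore gives each iteration its own scope. Again an untested sketch, with the same assumptions as the code above:

var videoCategories = ['Fruit', 'Vegetable', 'Pets'];
$.each(videoCategories, function(index, thisCategory) {
    $.jGFeed('http://gdata.youtube.com/feeds/api/users/username/uploads?category=' + thisCategory,
        function(feeds) {
            // Check for errors
            if (!feeds) {
                return false;
            }
            for (var j = 0; j < feeds.entries.length; j++) {
                var entry = feeds.entries[j];
                // thisCategory is a parameter of this callback's enclosing function,
                // so it still holds the right category when the feed arrives
                $('ul#' + thisCategory).append('<li>' + entry.title + '</li>');
            }
        });
});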