Here is some code that partially works and partially doesn't.
I have tried to keep (as much as possible) only the parts relevant to my question.
See my concerns after the code.
Parse.Cloud.define("myCloudFunction", function(request, response) {
    var recordTypeArray, className, recordListQuery, resultDictionary;
    recordTypeArray = ["LT1","LT2","LT3","RT1","RT2","RT3"];
    resultDictionary = [];
    console.log("Trace-One");
    watchFunction(recordTypeArray, 0, resultDictionary).then(function(resRcd) {
        console.log("Trace-Two");
        response.success(resultDictionary);
    });
    console.log("Trace-Three");
});
function watchFunction(typeArray, typeNumber, resDico) {
    var className, recordListQuery;
    className = "AA_".concat(typeArray[typeNumber]).concat("_ZZ");
    recordListQuery = new Parse.Query(className);
    return recordListQuery.find().then(function(resRcd) {
        // Here some problemless code.
        if (typeNumber++ == typeArray.length) return promise(function(){});
        return watchFunction(typeArray, typeNumber, resDico);
    });
}
I am doing something wrong concerning the way I handle promises, but I don't know what.
I want to see Trace-One, then watchFunction do its job (this part actually works fine), and finally see Trace-Two, before performing response.success.
But what happens is I see Trace-One, then I see Trace-Three, then I can see in the log that watchFunction has done its job as it should. And I never see Trace-Two.
And, as one could expect, I get a message complaining that success/error was not called.
So why am I not seeing Trace-Two, and why does execution jump straight to Trace-Three?
I presume I am not returning a promise correctly from somewhere.
I hope someone can point out where my mistake is.
It looks like you want watchFunction to perform several queries, one for each class name that can be derived from typeArray. But if those queries succeed, watchFunction is guaranteed to crash by indexing typeArray out of bounds. The function also drops the results, assigning them to a never-referenced dummy argument.
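To make the off-by-one concrete, here is a minimal sketch (with a shortened array, purely for illustration) of what the post-incremented comparison in the original code does:

```javascript
// The post-increment compares the *old* value, so the equality never
// fires at the last valid index, and the next recursive call indexes
// past the end of the array.
const typeArray = ["LT1", "LT2"];
let typeNumber = 1;                                // last valid index
const stop = (typeNumber++ === typeArray.length);  // compares 1 === 2: false
// typeNumber is now 2, and typeArray[2] is undefined on the next call
```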
A simpler way to produce those several queries is to map each class name to a promise to query that class. Parse.Promise.when() will run all of the queries and be fulfilled with a variadic list of arrays containing the results. (Parse.Query.or() ought to do the same, combining the results in a single array.)
So, fix watchFunction:
// create a collection of promises to query each class indicated
// by type array, return a promise to run all of the promises
function watchFunction(typeArray) {
var promises = [];
for (var i = 0; i < typeArray.length; ++i) {
var className = "AA_".concat(typeArray[i]).concat("_ZZ");
var recordListQuery = new Parse.Query(className);
promises.push(recordListQuery.find());
}
return Parse.Promise.when(promises);
}
The cloud function should also call response.error, and can be cleaned up a little bit...
Parse.Cloud.define("myCloudFunction", function(request, response) {
var recordTypeArray = ["LT1","LT2","LT3","RT1","RT2","RT3"];
console.log("Trace-One");
watchFunction(recordTypeArray).then(function() {
console.log("Trace-Two");
response.success(arguments);
}, function(error) {
console.log("Trace-Two (error)");
response.error(error);
});
console.log("Trace-Three");
});
You should expect to see Trace-One, Trace-Three, Trace-Two in the logs, since the "Trace-Two" logs happen after the queries finish.
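The ordering itself isn't Parse-specific; plain promises reproduce it, since a then() handler never runs before the current synchronous code has finished:

```javascript
// A then() handler always runs after the surrounding synchronous code,
// which is exactly the Trace-One / Trace-Three / Trace-Two order.
const trace = [];
trace.push("Trace-One");
Promise.resolve().then(() => trace.push("Trace-Two"));
trace.push("Trace-Three");
// synchronously, trace is ["Trace-One", "Trace-Three"];
// "Trace-Two" is pushed later, from the microtask queue
```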
I'm not super versed in JS promises, though I generally know enough to be dangerous. I'm working on a Vue method that handles searching a large data object present in this.data(). Normally, when I make asynchronous requests via axios, this same formatting works fine, but in this case I have to manually create a promise to get the desired behavior. Here is a sample of the code:
async searchPresets() {
if (this.presetSearchLoading) {
return
}
this.presetSearchLoading = true; // shows spinner
this.presetSearchResults = []; // removes old results
this.selectedPresetImports = []; // removes old user sections from results
// Need the DOM to update here while waiting for promise to complete
// otherwise there is no "loading spinner" feedback to the user.
const results = await new Promise(resolve => {
let resultSet = [];
for (var x = 0; x < 10000; x++) {
console.log(x);
}
let searchResults = [];
// do search stuff
resolve(searchResults);
});
// stuff after promise
}
The thing is, the stuff after the promise works correctly. It awaits the resolution before executing and receives the proper search result data, as it should.
The problem is that the DOM does not update upon dispatching the promise so the UI just sits there.
Does anyone know what I'm doing wrong?
Try $nextTick():
Vue 2.1.0+:
const results = await this.$nextTick().then(() => {
let resultSet = []
for (var x = 0; x < 10000; x++) {
console.log(x)
}
let searchResults = []
// do search stuff
return searchResults
});
Any Vue:
const results = await new Promise(resolve => {
this.$nextTick(() => {
let resultSet = []
for (var x = 0; x < 10000; x++) {
console.log(x)
}
let searchResults = []
// do search stuff
resolve(searchResults)
})
})
So it turns out I kind of said it all when I said, "I'm not super versed in JS promises though I generally know enough to be dangerous.".
I appreciate the attempts to help me through this, but it turns out that making something a promise does not inherently make it asynchronous. This was my mistake. The problem wasn't that Vue was failing to update the DOM; the problem was that the promise code was executing synchronously and blocking. Because execution never actually stopped to await anything, Vue had no opportunity to update the DOM.
Once I wrapped my promise code in setTimeout(() => { /* all code here, then: */ resolve(searchResults); }, 200);, everything started working. I guess the timeout allows execution to pause long enough for Vue to update the DOM based on my previous data changes. The script still technically blocks the UI while it runs, but at least my loader is spinning during this process, which is good enough for what I'm doing here.
See: Are JavaScript Promise asynchronous?
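A minimal sketch of the setTimeout wrapping described above (the names and pushed strings are illustrative): the executor only *schedules* the heavy loop, so the synchronous code, and a browser's render step, runs before the blocking work starts.

```javascript
// The promise executor runs synchronously, but setTimeout defers the
// blocking work to a later tick of the event loop.
const events = [];
const searchDone = new Promise((resolve) => {
  setTimeout(() => {
    events.push("blocking search");  // the heavy loop would go here
    resolve(["searchResults"]);
  }, 0);
});
events.push("DOM update");
// synchronously, events is ["DOM update"]; the search runs on a later tick
```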
Vue watches for data changes and collects them to decide whether the DOM needs to be re-rendered afterward. This means that everything in Vue is event (data) driven. Your function only defines behavior that has no data binding to the virtual DOM, so the Vue engine will do nothing, since nothing in its dependent data set has changed.
I see your Promise resolves the response into the variable "searchResults". If your DOM uses that variable, the Vue engine will pick up the change after the Promise is done. You may put a property in "data()" and bind it to the DOM.
For example:
<span v-for="(res, key) in searchResults" :key="key">
{{ res.name }}
</span>
...
<script>
export default {
...
data () {
return { searchResults: [] }
},
...
}
</script>
I wanted to use rxjs for the first time but am a bit stuck because it doesn't behave exactly like I want it to: in my scenario I want to create an observable from a promise. But I want the promise to be called only once (not on every subscription), and I want it not to be called at creation time (defer the call until the first subscription).
First I tried this:
var source = Rx.Observable.fromPromise(_this.getMyPromise())
which causes a call to the getMyPromise function right at creation time. This is not satisfying because at that point I don't know whether the source will really be used.
Then I tried:
var source = Rx.Observable.defer(function() { return _this.getMyPromise() })
which causes a call to the getMyPromise function each time a new subscription is made to source. This makes far too many unnecessary calls to the web server. The Rx.Observable.create function seems to have the same issue.
So what is left or what am I missing?
.shareReplay() does this, e.g.:
var source = Rx.Observable.defer(function() { return _this.getMyPromise() }).shareReplay();
If you're using rxjs5, you'll want to read: Pattern for shareReplay(1) in RxJS5
In answer to your comment below, I can think of a fairly straightforward extension to the above logic that will do what you want, but it has a caveat. Let's say the events you want to use to trigger a "refresh" are represented in a stream, s$, then you could do something like:
var source = Rx.Observable.of({}).concat(s$)
.flatMapLatest(function() {
return Rx.Observable.defer(function() {
return _this.getMyPromise()
})
})
.shareReplay(1)
What we have here is a stream starting with a dummy object to get things rolling, followed by a stream consisting of your refresh events. Each of these is projected into a new observable created from a fresh invocation of your getMyPromise method, and the whole thing is flattened into a single stream. Finally, we keep the shareReplay logic so we only actually make calls when we should.
The caveat is that this will only work properly if there's always at least one subscriber to the source (the first subscription after all others are disposed will run the promise again, and will receive both the previously-cached value and the result of the promise it caused to run).
Here is an answer that does not require at least one subscriber at the source at all times, using a simple helper:
var _p = null;
var once = function() { return _p || (_p = _this.getMyPromise()); };
var source = Rx.Observable.defer(once);
Or if you're using lodash, you can _.memoize your getMyPromise and get this automatically.
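In plain JS (no lodash), the memoized version amounts to the following sketch; memoizePromise is an illustrative name, and Promise.resolve(42) stands in for the real getMyPromise:

```javascript
// Cache the promise on the first call; every later call returns the same one.
function memoizePromise(fn) {
  let cached = null;
  return function () {
    return cached || (cached = fn());
  };
}

let calls = 0;
const getOnce = memoizePromise(function () {
  calls++;
  return Promise.resolve(42);  // stands in for _this.getMyPromise()
});
getOnce();
getOnce();
// the underlying function ran once; both calls share the identical promise
```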
I'm working on a geoprocessing web application. My application will provide users with a specific set of options, the user will provide some data, and then I will process the data on the server and finally return the results. If it matters, I am using the CMV (http://docs.cmv.io/en/1.3.3/) as a framework and trying to build my own plugin, but I suspect my problems are more general JS problems. Here is a pseudocode sample (note that this is pseudocode, not my actual code, which is a mess at the moment):
initializeTool: function() {
    //here I am able to access my map object through this.map
    //and I need it for my output
    on(dom.byId("mybutton"), "click", processInput);
}

processInput: function() {
    //pull user data from webpage
    var userData, queries = [];
    //launch query for all data
    for(var i in userData){
        queries[i] = query(userData[i]);
    }
    //DeferredList is from Dojo, doc here: http://dojotoolkit.org/api/?qs=1.10/dojo/DeferredList
    new DeferredList(queries).then(function (results) {
        //iterate over query responses and perform work
        for(var i in queries){
            //perform some synchronous operations
        }
        //and now we're done! but how do I get to my output?
    });
}
The desired output in this case is a group of objects that have had various operations done on them, but are only accessible in the scope of the then() block and the inline function. My problem is that the output I am trying to use is only in the scope of the initialize function. I'm not sure what the best way to get my processed data to where I want it to be. This is a problem because the processed data is geometry information - it isn't very human readable as text, so it needs to be displayed on a map.
I've been poring over JS scoping and looking at references to try and figure out what my issue is, but I seriously cannot figure it out.
One of the main points of promises is that then returns a promise for whatever is eventually returned inside its onFulfill handler. This is what enables you to get the outcome out of your processInput() function and into the world outside it.
So you can (and should) do this:
function processInput() {
//pull user data from webpage
var userData;
//launch query for all data
return Promise.all(userData.map(query))
.then(function (results) {
var theResult;
//iterate over query responses and perform work
results.forEach(function (result) {
//perform some synchronous operations and determine theResult
});
return theResult;
});
}
processInput().then(function (theResult) {
// do something with theResult
});
I'm using the JavaScript SDK for Facebook to create a feed on my webpage.
The problem is that sometimes, during load, the feed ends up unordered, even though I have set up a callback chain.
I think it gets unordered because sometimes the "second" async call gets processed faster than the "first" async call.
This is the first time I've been using callbacks; am I doing it right?
How can I keep the feed from getting unordered when some calls finish faster than others?
The code below is only the relevant code and is in working order.
function initFeed(){
FB.api('/{id}/feed', function(response){
var feedArray = response.data;
$.each(feedArray, function(){
var $this = $(this)[0]; //Status Object for single Status in Feed
setStatus($this, processStatus); //processStatus is function defined below
});
});
}
function setStatus(statusObject, callbackProcessStatus){
FB.api("/{personId}?fields=id,link,name,picture",
function (response) {
var html = /* Generates html based from statusObject and response */
callbackProcessStatus(html);
});
}
function processStatus(html){
$('#fb-status-wrapper').append(html);
}
(I was uncertain about the title of this post; please edit if you think it is not descriptive enough.)
Best regards
This is a somewhat common problem with parallel async calls. The simplest solution requires promises. I recommend the Bluebird promise library, but most will do fine.
var fbApi = function(url){
return new Promise(function(resolve, reject){
FB.api(url, function(resp){ resolve(resp); });
});
}
function setStatus(statusObject){
return fbApi("/{personId}?fields=id,link,name,picture")
.then(function(response){
var html = ...;
return html;
});
}
function getFeedItemPromises(){
  return fbApi("/{id}/feed").then(function(response){
    return response.data.map(function(item){
      return setStatus(item); // one promise per feed item's html
    });
  });
}
Depending on your needs, initFeed could be one of the following. The first renders the feed once all items are available; the second renders each item as it becomes available, but enforces the order.
function initFeed(){
return Promise.all(getFeedItemPromises())
.then(function(itemsHtml){
// append all of the items at once
$('#fb-status-wrapper').append(itemsHtml.join("\n"));
});
}
Or this, which ensures the order but eagerly appends each item to the feed as soon as all previous items have been added.
function initFeed(){
function renderItem(html){
$('#fb-status-wrapper').append(html);
}
// reduce can be used to chain promises in sequence
return getFeedItemPromises().reduce(function(p, nextPromise){
return p.then(function(){ return nextPromise })
.then(renderItem);
}, Promise.resolve())
}
An alternative would be to create a div for each item which acts as a placeholder, keep those in an array, and fill them in when each resolves. This works especially well if you know the height of the items beforehand, and fade them in when they load. From a UX perspective, this is the best in my opinion.
I would not recommend the above if you don't know the heights of items, as it'll cause headache inducing shifting of items as new ones are inserted.
Indeed, you cannot rely on the order in which the requests will finish. The only way to be sure is to start the second one only after the first one is done, but that will slow down the loading quite a lot.
Another possibility is to remember, for each request, which one it is, and insert the items in the right order (insert before a 'later' one, even if that one was received earlier).
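A minimal sketch of that bookkeeping, independent of the FB API (the slot count and html strings are made up): each response carries its original index, completed items are held in fixed slots, and items are flushed to the page only once the prefix is contiguous.

```javascript
// Results may arrive in any order; each is slotted by its original index
// and appended only when everything before it has also arrived.
const slots = new Array(3).fill(null);
const rendered = [];                 // stands in for the DOM append order
function insertInOrder(index, html) {
  slots[index] = html;
  while (rendered.length < slots.length && slots[rendered.length] !== null) {
    rendered.push(slots[rendered.length]);
  }
}
insertInOrder(1, "b");  // arrives first, but waits for index 0
insertInOrder(0, "a");  // flushes "a" and then the waiting "b"
insertInOrder(2, "c");
// rendered ends up ["a", "b", "c"] regardless of arrival order
```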
I think the easiest way to do that is to make placeholders for the items inside the each loop, so the placeholders are inserted in the right order. When the requests return, you just place the responses in the right placeholder.
It could look somewhat like this: two extra lines and a couple of tiny changes. I couldn't test this without the API, but I hope you get the idea.
function initFeed(){
FB.api('/{id}/feed', function(response){
var feedArray = response.data;
$.each(feedArray, function(index){
var $this = $(this)[0]; //Status Object for single Status in Feed
// Make a container per item inside the wrapper.
var $itemContainer = $('<div></div>');
$('#fb-status-wrapper').append($itemContainer);
// Pass the container to the api function.
setStatus($this, processStatus, $itemContainer); //processStatus is function defined below
});
});
}
function setStatus(statusObject, callbackProcessStatus, $container){
FB.api("/{personId}?fields=id,link,name,picture",
function (response) {
var html = /* Generates html based from statusObject and response */
// Pass the item place holder/container to the processing procedure.
callbackProcessStatus(html, $container);
});
}
function processStatus(html, $container){
$container.append(html);
}
My problem is as follows:
I have many MySQL requests to do in Node, and they run asynchronously.
In the following example, I would like to wait for the checkExists function to finish one way or another (and populate my input variable) before the function doStuffWithInput starts. I don't see any other way than pasting doStuffWithInput multiple times into the various possible callbacks (after each 'input = keys;'). I'm sure there is a better way, though. Any ideas?
var input;
db.checkExists_weekParents(id,function(count){ //check table existence/number of rows
if(count!=='err'){ //if doesn't exist, create table
db.create_weekParents(id,function(info){
if(info!=='err'){ //as table is empty, create input from a full dataset
db.makeFull_weekParents(id,function(keys){
input = keys;
});
}
});
}else{ //if exists, check number of entries and create input keys as a subset of the full dataset
db.makeDiff_weekParents(id,function(keys){
if(keys.length!==0){
input = keys;
}else{ //if the table already has full dataset, we need to export and start again.
db.export_weekParents(id,function(info){
db.create_weekParents(id,function(info){
if(info!=='err'){
db.makeFull_weekParents(id,function(keys){
input = keys;
});
}
});
});
}
});
}
});
Once all this is done, we have lots of stuff to do (spawn child processes, more db operations, etc...)
doStuffWithInput(input,function(output){
//Tons of stuff here
console.log(output);
})
I really hope this is clear enough, I'll clarify if needed.
EDIT
Trying to rewrite this using promises seems the best way to go, and I imagine it can be a great example for others like me struggling with the pyramid of doom.
So far I have :
var Q = require('q');
function getInput(){
var dfd = Q.defer();
db.check_weekParents(id,function(count){
console.log('count '+count);
if(count==='err'){
db.create_weekParents(id,function(info){
if(info!=='err'){
console.log('created table');
db.makeDiff_weekParents(id,function(keys){
input = keys;
dfd.resolve(input);
});
}
});
}else{
db.makeDiff_weekParents(id,function(keys){
input=keys;
dfd.resolve(input);
});
}
});
return dfd.promise;
}
getInput().then(function (input) {
console.log(input);
});
It is magic!!
You can use promises rather than callbacks. There are many possibilities in Node, and the mysql library you are using may even support them. For example, with Q:
function getInput(){
var dfd = Q.defer();
if(count!=='err'){
db.create_weekParents(id,function(info){
/* after everything completes */
dfd.resolve(input);
/* snip */
return dfd.promise;
}
Then you can do
getInput().then(function (input) {
doStuffWithInput(input ...
});
You should look into using the async library.
For your case, you may want to look at the waterfall pattern. The functions will be executed in series, with the result of each being passed as input to the next. From there, you can check the results of previous functions, etc.
You are also able to combine the different control flow structures in any way you want. (ie, parallel operations at one stage of a waterfall flow)
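The waterfall idea can be sketched in plain callbacks; this is an illustration of the pattern, not the async library's implementation (unlike async.waterfall, this toy passes an initial undefined to the first task):

```javascript
// Minimal waterfall: each task receives the previous task's result plus
// an error-first callback, and the chain stops early on any error.
function waterfall(tasks, done) {
  var i = 0;
  function next(err, result) {
    if (err || i === tasks.length) return done(err, result);
    tasks[i++](result, next);
  }
  next(null, undefined);
}

var finalResult;
waterfall([
  function (_, cb) { cb(null, 2); },      // produce an initial value
  function (n, cb) { cb(null, n * 3); },  // use the previous result
  function (n, cb) { cb(null, n + 1); }
], function (err, result) { finalResult = result; });
// the callbacks above are synchronous, so finalResult is 7 immediately
```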