JavaScript scoping, inline functions, and asynchronous operations

I'm working on a geoprocessing web application. My application will provide users with a specific set of options, the user will provide some data, and then I will process the data on the server and finally return the results. If it matters, I am using the CMV http://docs.cmv.io/en/1.3.3/ as a framework and trying to build my own plugin, but I suspect my problems are more general JS problems. Here is a pseudocode sample (note that this is pseudocode and not my actual code, which is a mess at the moment):
initializeTool: function() {
    // here I am able to access my map object through this.map
    // and I need it for my output
    on(dom.byId("mybutton"), "click", processInput);
},
processInput: function() {
    // pull user data from webpage
    var userData, queries = [];
    // launch query for all data
    for (var i in userData) {
        queries[i] = query(userData[i]);
    }
    // DeferredList is from Dojo, doc here: http://dojotoolkit.org/api/?qs=1.10/dojo/DeferredList
    new DeferredList(queries).then(function (results) {
        // iterate over query responses and perform work
        for (var i in queries) {
            // perform some synchronous operations
        }
        // and now we're done! but how do I get to my output?
    });
}
The desired output in this case is a group of objects that have had various operations performed on them, but they are only accessible in the scope of the then() block and its inline function. My problem is that the output destination I am trying to use is only in the scope of the initialize function. I'm not sure of the best way to get my processed data to where I want it to be. This is a problem because the processed data is geometry information: it isn't very human-readable as text, so it needs to be displayed on a map.
I've been poring over JS scoping material and looking at references to try to figure out what my issue is, but I seriously cannot figure it out.

One of the main points of promises is that then returns a promise for whatever is eventually returned inside its onFulfill handler. This is what enables you to get the outcome out of your processInput() function and into the world outside it.
So you can (and should) do this:
function processInput() {
    // pull user data from webpage
    var userData;
    // launch query for all data
    return Promise.all(userData.map(query))
        .then(function (results) {
            var theResult;
            // iterate over query responses and perform work
            results.forEach(function (result) {
                // perform some synchronous operations and determine theResult
            });
            return theResult;
        });
}
processInput().then(function (theResult) {
    // do something with theResult
});
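A runnable sketch of the same shape makes the flow concrete. Here `query` and `userData` are illustrative stand-ins for the asker's real query function and page data; the point is that the value returned inside then() becomes the fulfillment value seen by the caller:

```javascript
// Stand-in for the asker's query(): each query resolves with its input doubled.
function query(item) {
  return Promise.resolve(item * 2);
}

function processInput() {
  var userData = [1, 2, 3]; // stand-in for data pulled from the page
  return Promise.all(userData.map(query)).then(function (results) {
    // synchronous work on the query responses
    return results.reduce(function (sum, r) { return sum + r; }, 0);
  });
}

// the processed result escapes the inline function via the returned promise
processInput().then(function (theResult) {
  console.log(theResult); // 12
});
```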


Using promises in cloud code

Here is some code that partially works and partially doesn't.
I tried to keep (as much as possible) only the parts relevant to my question.
See my concerns after the code.
Parse.Cloud.define("myCloudFunction", function(request, response) {
    var recordTypeArray, className, recordListQuery, resultDictionary;
    recordTypeArray = ["LT1","LT2","LT3","RT1","RT2","RT3"];
    resultDictionary = [];
    console.log("Trace-One");
    watchFunction(recordTypeArray, 0, resultDictionary).then(function(resRcd) {
        console.log("Trace-Two");
        response.success(resultDictionary);
    });
    console.log("Trace-Three");
});
function watchFunction(typeArray, typeNumber, resDico) {
    var className, recordListQuery;
    className = "AA_".concat(typeArray[typeNumber]).concat("_ZZ");
    recordListQuery = new Parse.Query(className);
    return recordListQuery.find().then(function(resRcd) {
        // Here some problemless code.
        if (typeNumber++ == typeArray.length) return promise(function(){});
        return watchFunction(typeArray, typeNumber, resDico);
    });
}
I am doing something wrong concerning the way I handle promises, but I don't know what.
I want to see Trace-One, then watchFunction doing its job (this part actually works fine), and finally see Trace-Two, before performing response.success.
But what happens is: I see Trace-One, then I see Trace-Three, and then I can see in the log that watchFunction has done its job as it should. I never see Trace-Two.
And, as one could expect, I get a message complaining that success/error was not called.
So why am I not seeing Trace-Two, and why does it jump to Trace-Three?
I presume I am not returning a promise from somewhere correctly.
I hope someone can point out where my mistake is.
It looks like you want watchFunction to perform several queries, one for each class name that can be derived from typeArray. But if those queries succeed, watchFunction is guaranteed to crash by indexing typeArray out of bounds: the post-increment comparison lets the recursion reach typeArray.length. The function also drops the results, assigning them to a never-referenced dummy argument.
A simpler way to produce those several queries is to map each class name to a promise to query that class. Parse.Promise.when() will run all of the queries and be fulfilled with a variadic list of arrays containing the results. (Parse.Query.or() ought to do much the same, combining the results in a single array.)
So, fix watchFunction:
// create a collection of promises to query each class indicated
// by typeArray, return a promise to run all of the promises
function watchFunction(typeArray) {
    var promises = [];
    for (var i = 0; i < typeArray.length; ++i) {
        var className = "AA_".concat(typeArray[i]).concat("_ZZ");
        var recordListQuery = new Parse.Query(className);
        promises.push(recordListQuery.find());
    }
    return Parse.Promise.when(promises);
}
The cloud function should also call response.error(), and can be cleaned up a little bit...
Parse.Cloud.define("myCloudFunction", function(request, response) {
    var recordTypeArray = ["LT1","LT2","LT3","RT1","RT2","RT3"];
    console.log("Trace-One");
    watchFunction(recordTypeArray).then(function() {
        console.log("Trace-Two");
        response.success(arguments);
    }, function(error) {
        console.log("Trace-Two (error)");
        response.error(error);
    });
    console.log("Trace-Three");
});
You should expect to see Trace-One, Trace-Three, Trace-Two in the logs, since the "Trace-Two" logs happen after the queries finish.
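This ordering is easy to reproduce with nothing but a resolved promise: a then handler never runs synchronously, even when the promise is already settled, so the code after the .then() call always runs first.

```javascript
// Minimal reproduction of the trace ordering: the .then handler is
// queued and runs only after the current call stack has unwound.
var order = [];
order.push("Trace-One");
Promise.resolve().then(function () {
  order.push("Trace-Two");
  console.log(order.join(", ")); // Trace-One, Trace-Three, Trace-Two
});
order.push("Trace-Three");
```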

JavaScript callbacks get processed faster than others

I'm using the JavaScript SDK plugin for Facebook to create a feed on my webpage.
The problem is that sometimes during load the feed gets unordered, even though I have set up a callback chain.
I think it gets unordered because sometimes the "second" async call gets processed faster than the "first" async call.
This is the first time I've been using callbacks; am I doing it right?
How can I keep the feed from getting unordered if some calls finish faster than others?
The code below is only the relevant code and is in working condition.
function initFeed() {
    FB.api('/{id}/feed', function(response) {
        var feedArray = response.data;
        $.each(feedArray, function() {
            var $this = $(this)[0]; // status object for a single status in the feed
            setStatus($this, processStatus); // processStatus is the function defined below
        });
    });
}
function setStatus(statusObject, callbackProcessStatus) {
    FB.api("/{personId}?fields=id,link,name,picture",
        function (response) {
            var html = /* generates html from statusObject and response */
            callbackProcessStatus(html);
        });
}
function processStatus(html) {
    $('#fb-status-wrapper').append(html);
}
(I was uncertain about the title of this post; please edit it if you think it is not descriptive enough.)
Best regards
This is a somewhat common problem with parallel async calls. The simplest solution requires promises. I recommend the Bluebird promise library, but most will do fine.
var fbApi = function(url) {
    return new Promise(function(resolve, reject) {
        FB.api(url, function(resp) { resolve(resp); });
    });
};

function setStatus(statusObject) {
    return fbApi("/{personId}?fields=id,link,name,picture")
        .then(function(response) {
            var html = ...; // generate html from statusObject and response
            return html;
        });
}

function getFeedItemPromises() {
    return fbApi("/{id}/feed").then(function(response) {
        return response.data.map(function(item) {
            return setStatus(item);
        });
    });
}
Depending on your needs, initFeed could be one of these. The first renders the feed only once all items are available; the second renders each item as soon as it is available, but still enforces the order.
function initFeed() {
    return getFeedItemPromises()
        .then(function(itemPromises) { return Promise.all(itemPromises); })
        .then(function(itemsHtml) {
            // append all of the items at once
            $('#fb-status-wrapper').append(itemsHtml.join("\n"));
        });
}
Or this, which ensures the order but eagerly appends each item to the feed as soon as all previous items have been added.
function initFeed() {
    function renderItem(html) {
        $('#fb-status-wrapper').append(html);
    }
    // Bluebird's promise.reduce can be used to chain promises in sequence
    return getFeedItemPromises().reduce(function(p, nextPromise) {
        return p.then(function() { return nextPromise; })
            .then(renderItem);
    }, Promise.resolve());
}
An alternative would be to create a div for each item which acts as a placeholder, keep those in an array, and fill them in when each resolves. This works especially well if you know the height of the items beforehand, and fade them in when they load. From a UX perspective, this is the best in my opinion.
I would not recommend the above if you don't know the heights of the items, as it will cause headache-inducing shifting of items as new ones are inserted.
Indeed you cannot rely on the order in which the requests will finish. The only way to be sure, is to only call the second one if the first one is done. But that will slow down the loading quite a lot.
Another possibility is to remember for each request which one it is, and insert the items in the right order (insert before a 'later' one, even if that one was received earlier).
I think the easiest way to do that, is to make placeholders for the items inside the each loop, so the placeholders are inserted in the right order. When the requests return, you just place the responses in the right placeholder.
It could look somewhat like this: two extra lines and a couple of tiny changes. I couldn't test this without the API, but I hope you get the idea.
function initFeed() {
    FB.api('/{id}/feed', function(response) {
        var feedArray = response.data;
        $.each(feedArray, function(index) {
            var $this = $(this)[0]; // status object for a single status in the feed
            // Make a container per item inside the wrapper.
            var $itemContainer = $('<div></div>');
            $('#fb-status-wrapper').append($itemContainer);
            // Pass the container to the api function.
            setStatus($this, processStatus, $itemContainer); // processStatus is the function defined below
        });
    });
}
function setStatus(statusObject, callbackProcessStatus, $container) {
    FB.api("/{personId}?fields=id,link,name,picture",
        function (response) {
            var html = /* generates html from statusObject and response */
            // Pass the item placeholder/container to the processing procedure.
            callbackProcessStatus(html, $container);
        });
}
function processStatus(html, $container) {
    $container.append(html);
}

Extracting values from USGS real time water service

There must be something simple I am missing, but alas, I do not know what I do not know. Below is the code I have thus far for trying to get current streamflow conditions from the USGS.
// create site object
function Site(siteCode) {
    this.timeSeriesList = [];
    this.siteCode = siteCode;
    this.downloadData = downloadData;
    this.getCfs = getCfs;
    // create reference to the local object for use inside the jquery ajax function below
    var self = this;
    // create timeSeries object
    function TimeSeries(siteCode, variableCode) {
        this.variableCode = variableCode;
        this.observations = [];
    }
    // create observation object
    function TimeSeriesObservation(stage, timeDate) {
        this.stage = stage;
        this.timeDate = timeDate;
    }
    // include the capability to download data automatically
    function downloadData() {
        // construct the url to get data
        // TODO: include the capability to change the date range, currently one week (P1W)
        var url = "http://waterservices.usgs.gov/nwis/iv/?format=json&sites=" + this.siteCode + "&period=P1W&parameterCd=00060,00065";
        // use jquery getJSON to download the data
        $.getJSON(url, function (data) {
            // timeSeries is a two item list, one for cfs and the other for feet
            // iterate these and create an object for each
            $(data.value.timeSeries).each(function () {
                // create a timeSeries object
                var thisTimeSeries = new TimeSeries(
                    self.siteCode,
                    // get the variable code, 65 for ft and 60 for cfs
                    this.variable.variableCode[0].value
                );
                // for every observation of the type at this site
                $(this.values[0].value).each(function () {
                    // add the observation to the list
                    thisTimeSeries.observations.push(new TimeSeriesObservation(
                        // observation stage or level
                        this.value,
                        // observation time
                        this.dateTime
                    ));
                });
                // add the timeSeries instance to the object list
                self.timeSeriesList.push(thisTimeSeries);
            });
        });
    }
    // return serialized array of cfs stage values
    function getCfs() {
        // iterate timeseries objects
        $(self.timeSeriesList).each(function () {
            // if the variable code is 00060 - cfs
            if (this.variableCode === '00060') {
                // return serialized array of stages
                return JSON.stringify(this.observations);
            }
        });
    }
}
When I simply access the object directly using the command line, I can access individual observations using:
> var watauga = new Site('03479000')
> watauga.downloadData()
> watauga.timeSeriesList[0].observations[0]
I can even access all the reported values with the timestamps using:
> JSON.stringify(watauga.timeSeriesList[0].observations)
Now I am trying to wrap this logic into the getCfs function, with little success. What am I missing?
I don't see anything in the code above that enforces the data being downloaded. Maybe in whatever execution path you're using to call getCfs() you have a wait or a loop that checks for the download to complete prior to calling getCfs(), but if you're simply calling
site.downloadData();
site.getCfs();
you're almost certainly not finished loading when you call site.getCfs().
You'd need to invoke a callback from within your success handler to notify the caller that the data is downloaded. For example, change the signature of Site.downloadData to
function downloadData(downloadCallback) {
// ...
Add a call to the downloadCallback after you're finished processing the data:
// After the `each` that populates 'thisTimeSeries', but before you exit
// the 'success' handler
if (typeof downloadCallback === 'function') {
    downloadCallback();
}
And then your invocation would be something like:
var watauga = new Site('03479000');
var downloadCallback = function() {
    watauga.timeSeriesList[0].observations[0];
};
watauga.downloadData(downloadCallback);
That way, you're guaranteed that the data is finished processing before you attempt to access it.
If you're getting an undefined in some other part of your code, of course, then there may be something else wrong. Throw a debugger on it and step through the execution. Just bear in mind that interactive debugging has many of the same problems as interactively calling the script; the script has time to complete its download in the background before you start inspecting the variables, which makes it look like everything's hunky dory, when in fact a non-interactive execution would have different timing.
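The same completion signal can also be exposed as a promise instead of a raw callback. A sketch of the idea, where downloadDataAsync is a hypothetical wrapper (not part of the original Site object) and fakeSite stands in for a real Site whose downloadData takes a completion callback:

```javascript
// Hypothetical promise wrapper around a callback-style downloadData.
function downloadDataAsync(site) {
  return new Promise(function (resolve) {
    site.downloadData(function () {
      resolve(site); // fulfilled only after the data has been processed
    });
  });
}

// stand-in for a Site object; setTimeout simulates the async $.getJSON call
var fakeSite = {
  timeSeriesList: [],
  downloadData: function (done) {
    var self = this;
    setTimeout(function () {
      self.timeSeriesList.push({ variableCode: '00060' });
      done();
    }, 0);
  }
};

downloadDataAsync(fakeSite).then(function (site) {
  console.log(site.timeSeriesList.length); // 1: the data is guaranteed loaded here
});
```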
The real issue, I discovered by just starting over from scratch on this function, was something wrong with my use of jQuery's .each(). On my second stab at the issue, I successfully used a standard for...in loop. Here is the working code.
function getCfs() {
    for (var index in this.timeSeriesList) {
        if (this.timeSeriesList[index].variableCode === '00060') {
            return JSON.stringify(this.timeSeriesList[index].observations);
        }
    }
}
Also, some of the stuff you are talking about, @Palpatim, I definitely will have to look into. Thank you for pointing out these considerations. This looks like a good time to further investigate these promise things.
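The underlying gotcha generalizes beyond jQuery: a return inside an each-style callback only exits the callback, never the enclosing function. A standalone illustration (using Array.prototype.forEach, which behaves like $.each here):

```javascript
// why the original getCfs() yielded undefined: the inner return
// exits the each-callback, not the outer function
function findWithEach(list, code) {
  list.forEach(function (item) {
    if (item.variableCode === code) return item; // returns from the callback only
  });
  // falls through: nothing is returned from findWithEach itself
}

function findWithLoop(list, code) {
  for (var i = 0; i < list.length; i++) {
    if (list[i].variableCode === code) return list[i]; // returns from the function
  }
}

var series = [{ variableCode: '00065' }, { variableCode: '00060' }];
console.log(findWithEach(series, '00060')); // undefined
console.log(findWithLoop(series, '00060')); // { variableCode: '00060' }
```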

Do I need Web Workers for looping AJAX-requests?

Given: a PHP script for parsing portions of data on a website. It parses about 10k products and is hence rather slow.
I need to make a web frontend with HTML/CSS/JS for it. I made a loop which makes ajax requests and shows progress information. It uses synchronous ajax because it needs to wait until one request is done before performing another.
do {
    var parseProductsActive = true;
    var counter = 0;
    myAjax('parseProducts.php?start=' + counter, false, function(resp) {
        if (resp[0] == 's') {
            counter += Number(resp.substring(1));
            parseProductsActive = false;
        } else {
            counter += Number(resp);
        }
        self.postMessage(counter);
    });
} while (parseProductsActive == true);
I'm doing it in a Web Worker because I'm afraid this endless loop will hang up the interface, and the (a)synchronousness of ajax itself won't help to solve the problem.
But when I tried to use ajax in a Web Worker, I found it's hard, though possible, because jQuery doesn't work in a Web Worker at all: it uses the DOM even for non-DOM operations, and the DOM isn't available in a Web Worker. And many developers doubt using Web Workers at all. I just wanted to ask if I am doing it right or wrong. Is there a more obvious solution that I can't see?
You guessed right: a recursive callback is the way to do a bunch of asynchronous requests in sequence. It might look a bit like this:
var parseProductsActive = true;
var counter = 0;

// define the loop
function doNextAjax(allDone) {
    // Instead of just returning, an async function needs to
    // call the code that comes after it explicitly. Receiving a callback
    // lets us avoid hardcoding what comes after the loop.
    if (!parseProductsActive) {
        allDone();
    } else {
        // use async Ajax:
        myAjax('parseProducts.php?start=' + counter, true, function(resp) {
            if (resp[0] == 's') {
                counter += Number(resp.substring(1));
                parseProductsActive = false;
            } else {
                counter += Number(resp);
            }
            self.postMessage(counter);
            doNextAjax(allDone); // <---
        });
    }
}

// Start the loop
doNextAjax(function() {
    console.log("the code that runs after the loop goes here");
});

// BTW, you might be able to get rid of the "parseProductsActive" flag with a small
// refactoring, but I'm keeping the code as similar as possible for now.
// It would be kind of equivalent to writing your original loop using a break statement.
Yes, it's ugly and verbose, but it's the only way to do it in raw JavaScript. If you want to write a more structured version that looks like a loop instead of something with tons of gotos, have a look at one of the async control-flow libraries, or at one of the compilers that compile extensions of JavaScript with async support back into regular JS with callbacks.
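For comparison, the same sequential "loop" can be expressed as a promise chain instead of a named recursive callback. In this sketch, fetchPage is a stand-in for myAjax: it pretends the server returns "2" per batch and an "s"-prefixed final response once start reaches 4, mirroring the protocol in the question:

```javascript
// stand-in for myAjax, using the question's "s"-prefix convention
function fetchPage(start) {
  return Promise.resolve(start >= 4 ? "s2" : "2");
}

function fetchAll() {
  var counter = 0;
  function next() {
    return fetchPage(counter).then(function (resp) {
      if (resp[0] === "s") {
        counter += Number(resp.substring(1));
        return counter; // done: resolve the whole chain with the total
      }
      counter += Number(resp);
      return next(); // returning a promise keeps the chain going
    });
  }
  return next();
}

fetchAll().then(function (total) {
  console.log(total); // 6
});
```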

mysql/node.js how to make/fake synchronous?

My problem is as follows :
I have many Mysql requests to do in Node, and it's done asynchronously.
In the following example, I would like to wait for the checkExists function to finish one way or another (and populate my input variable) before the function doStuffWithInput starts. I don't see any other way than pasting doStuffWithInput multiple times in the various possible callbacks (after each 'input=keys;') ... I'm sure there is a better way though. Any ideas?
var input;
db.checkExists_weekParents(id, function(count) { // check table existence/number of rows
    if (count !== 'err') { // if doesn't exist, create table
        db.create_weekParents(id, function(info) {
            if (info !== 'err') { // as table is empty, create input from a full dataset
                db.makeFull_weekParents(id, function(keys) {
                    input = keys;
                });
            }
        });
    } else { // if exists, check number of entries and create input keys as a subset of the full dataset
        db.makeDiff_weekParents(id, function(keys) {
            if (keys.length !== 0) {
                input = keys;
            } else { // if the table already has the full dataset, we need to export and start again
                db.export_weekParents(id, function(info) {
                    db.create_weekParents(id, function(info) {
                        if (info !== 'err') {
                            db.makeFull_weekParents(id, function(keys) {
                                input = keys;
                            });
                        }
                    });
                });
            }
        });
    }
});
Once all this is done, we have lots of stuff to do (spawn child processes, more db operations, etc...)
doStuffWithInput(input, function(output) {
    // Tons of stuff here
    console.log(output);
});
I really hope this is clear enough, I'll clarify if needed.
EDIT
Trying to rewrite this using promises seems the best way to go, and I imagine it can be a great example for others like me struggling with the pyramid of doom.
So far I have :
var Q = require('q');

function getInput() {
    var dfd = Q.defer();
    db.check_weekParents(id, function(count) {
        console.log('count ' + count);
        if (count === 'err') {
            db.create_weekParents(id, function(info) {
                if (info !== 'err') {
                    console.log('created table');
                    db.makeDiff_weekParents(id, function(keys) {
                        input = keys;
                        dfd.resolve(input);
                    });
                }
            });
        } else {
            db.makeDiff_weekParents(id, function(keys) {
                input = keys;
                dfd.resolve(input);
            });
        }
    });
    return dfd.promise;
}

getInput().then(function (input) {
    console.log(input);
});
It is magic!!
You can use promises rather than callbacks. There are many possibilities in Node, and the mysql library you are using may even support them. For example, with Q:
function getInput() {
    var dfd = Q.defer();
    if (count !== 'err') {
        db.create_weekParents(id, function(info) {
            /* after everything completes */
            dfd.resolve(input);
    /* snip */
    return dfd.promise;
}
Then you can do
getInput().then(function (input) {
    doStuffWithInput(input ...
});
You should look into using the async library.
For your case you may want to look at using the waterfall pattern. The functions will be executed in series, with the result of each being passed as input to the next. From here, you can check the results of previous functions, etc.
You are also able to combine the different control-flow structures in any way you want (i.e., parallel operations at one stage of a waterfall flow).
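A hand-rolled sketch shows the shape of the waterfall pattern the async library provides: each task receives the previous task's results plus a callback, and the final callback sees the last result. The runner below is a minimal illustration (async.waterfall itself is more robust), and the db-style step names are only placeholders:

```javascript
// Minimal waterfall runner: each task's results become the next
// task's arguments; errors short-circuit to the final callback.
function waterfall(tasks, done) {
  function step(i, args) {
    if (i === tasks.length) {
      return done.apply(null, [null].concat(args));
    }
    tasks[i].apply(null, args.concat(function (err) {
      if (err) return done(err);
      step(i + 1, Array.prototype.slice.call(arguments, 1));
    }));
  }
  step(0, []);
}

waterfall([
  function (cb) { cb(null, 3); },                 // e.g. a checkExists-style step
  function (count, cb) { cb(null, count * 10); }  // e.g. a makeFull-style step, sees count
], function (err, input) {
  console.log(input); // 30
});
```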
