I am trying to make the following 'where will I get in x miles driving towards these target places' asynchronous function (which works well) also wait between executions. The Google Maps service has a rate limit, and I have a few more than the 10 routes that can be plotted before OVER_QUERY_LIMIT kicks in.
I know the terms of service (2500/day), I'm not breaking them.
It sits in a loop over the array of desired destinations (endPoints) from a central point (pt).
What is the syntax to make this happen please?
I have read a lot on this and other sites and can see that the function should be put in quotes, but with an asynchronous call I can't see how. You can see my poor attempt (commented out):
var delay=100;
for (var i = 0; i < endPoints.length; i++) {
//setTimeout(function() {
howfar(pt,endPoints[i],i,function(i,status,endPoint) {
//process results
});
//}, delay);
}
function howfar(from,to,i,callback) {
//get the endpoint from the directions service
callback.call({},i,status,endPoint);
}
As always, thank you for looking and helping.
If I understood your question correctly, you need to wait until the howfar function returns plus a fixed delay, and only then process the next endPoint in the array?
I typically set up an iterator function which schedules itself until there are no more items to be processed. Something like:
var delay = 100;
var i = 0;
//define a helper function
var measureNext = function() {
howfar(pt, endPoints[i], i, function(i,status,endPoint) {
//process results
//if there are still unprocessed items in the array, schedule
//the next after {delay} milliseconds
if (++i < endPoints.length) {
setTimeout(measureNext, delay);
}
});
};
//start with the first item
measureNext();
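If you are in an environment with Promise support, the same one-at-a-time-with-a-delay idea can be sketched with async/await; note that `howfarAsync` here is a hypothetical promisified version of `howfar`, not part of the question's code:

```javascript
// sleep: resolves after ms milliseconds
function sleep(ms) {
  return new Promise(function (resolve) { setTimeout(resolve, ms); });
}

// Process each endpoint in order, pausing between requests
async function measureAll(pt, endPoints, delay) {
  var results = [];
  for (var i = 0; i < endPoints.length; i++) {
    // howfarAsync is assumed to return a Promise for the result
    results.push(await howfarAsync(pt, endPoints[i], i));
    await sleep(delay); // wait before firing the next request
  }
  return results;
}
```

The loop body does not continue until both the request and the sleep have resolved, which gives the same pacing as the self-scheduling iterator above.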
The exact syntax looks like this:
var delay = 100; // in milliseconds, 100 is a tenth of a second
setTimeout(function() {
howfar(pt, endPoints[i], i, function(i, status, endPoint) {
//process results
});
}, delay);
Note that inside your for loop you also need to capture i (with let, or an IIFE) and stagger the timeouts (for example delay * i), otherwise every timeout fires after the same 100 ms with the loop's final value of i.
A quick Google would have turned this up, though.
I've spent the last few days trying to tackle this issue and have read all sorts of solutions on StackOverflow and other sites.
I'm building a site that grabs XML data from an outside source and then gets more XML depending on the results to build a network graph. The problem is that I have to essentially wait until this loop of AJAX calls (which may loop into more AJAX calls) is finished before drawing.
I don't know if this just has an especially high cognitive load, but it really has me stumped.
My Code:
function cont(outerNodes) {
for (var i = 0; i < outerNodes.length; i++) {
var node = outerNodes.pop();
getXML(node["label"], node["id"]);
}
// I want the code to wait until loop is done, and then draw.
draw(nodes, edges);
}
function getXML(term, fromId) {
var url = someURL;
$.ajax({
url: url,
dataType: "xml",
success: function(result) {
var outerNodes = process(result, fromId, term);
cont(outerNodes);
}
});
}
Note: I understand I may be completely misunderstanding JavaScript synchronicity here, and I very likely am. I have used callbacks and promises successfully in the past, I just can't seem to wrap my head around this one.
If I have not been totally clear, please let me know.
I did try implementing a counter of sorts that is incremented in the process() function, like so:
if (processCount < 15) {
for (var i = 0; i < outerNodes.length; i++) {
var node = outerNodes.pop();
getXML(node["label"], node["id"]);
}
} else {
draw(nodes, edges);
}
However, this ended up with several draw() calls which made my performance abysmal.
There are nice new well-supported APIs and language constructs we can use. The Fetch API, await, and for...of loops.
The Fetch API uses Promises. Promises can be awaited. The for...of loop is aware of await and won't continue the loop until the await has passed.
// Loop through, one at a time (await only works inside an async function)
async function processNodes(outerNodes) {
for (const node of outerNodes) {
// Make the HTTP request
const res = await fetch(someUrl);
// Do something with the response here...
}
}
Don't forget a try/catch (which also works with await), and check res.ok.
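As a sketch of that error handling (the `url` parameter and the shape of `outerNodes` are assumptions, not from the question):

```javascript
// Fetch each node's data sequentially, with basic error handling
async function fetchAll(outerNodes, url) {
  const texts = [];
  for (const node of outerNodes) {
    try {
      const res = await fetch(url);
      if (!res.ok) {
        // Treat HTTP error statuses as failures too
        throw new Error("HTTP " + res.status);
      }
      texts.push(await res.text());
    } catch (err) {
      console.error("Request failed for node", node, err);
    }
  }
  return texts;
}
```

A failed request is logged and skipped rather than aborting the whole loop; rethrow inside the catch if you would rather stop on the first error.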
Brad's answer changes the code to be sequential, and to me that defeats the purpose. If you constantly wait for every request to finish, the whole thing can take a while, while normal browsers can happily handle multiple requests in parallel.
The problem in your original question is one of scope. Since each call to cont(outerNodes) triggers its own scope, it has no idea what the other calls are doing. So if you call cont(outerNodes) twice, each call handles its own list of outerNodes and then calls draw.
The solution is to share information between the different scopes, which can be done with one, but preferably two, global variables: one to track active processes and one to track errors.
var inProcess = 0;
var nrErrors = 0;
function cont(outerNodes) {
//make sure you have outerNodes before you call outerNodes.length
if (outerNodes) {
//pop() shrinks the array, so loop while items remain
while (outerNodes.length > 0) {
var node = outerNodes.pop();
inProcess++; //add one more in process
getXML(node["label"], node["id"]);
}
}
//only trigger the draw when nothing is in process.
if (inProcess==0) {
// I want the code to wait until loop is done, and then draw.
draw(nodes, edges);
}
}
function getXML(term, fromId) {
var url = someURL;
$.ajax({
url: url,
dataType: "xml",
success: function(result) {
var outerNodes = process(result, fromId, term);
inProcess--; //one is done
cont(outerNodes);
},
error: function() {
inProcess--; //one is done
nrErrors++; //one more error
cont(null); //run without new outerNodes, to trigger a possible draw
}
});
}
Please note that I track nrErrors but don't do anything with it. You could use it to stop further processing altogether, or to warn the user that the draw is incomplete.
[important] Keep in mind that this works in JavaScript only because the event loop merely mimics multithreading: the call to inProcess--; and the cont(outerNodes); right after it always execute directly after each other.
If you ported this to a truly multithreaded environment, another thread's cont(null); could very well cut in between those two lines, and you could still end up with multiple draws.
The best way to solve this is with a promise or a callback.
If you really want to avoid promises and callbacks (although I don't know why...),
you can try a counter.
let processCount = 0;
// Increasing the processCount in getXML callback method
function getXML(term, fromId) {
var url = someURL;
$.ajax({
url: url,
dataType: "xml",
success: function(result) {
processCount++;
var outerNodes = process(result, fromId, term);
cont(outerNodes);
}
});
}
while (outerNodes.length > 0) {
var node = outerNodes.pop();
getXML(node["label"], node["id"]);
}
while (processCount < outerNodes.length) {
// do nothing, just wait
}
draw(nodes, edges);
Be aware, though, that a busy-wait like this blocks JavaScript's single thread, so the AJAX success callbacks can never run and the loop will never exit; in practice the counter check has to live inside the callback itself.
If, after testing it many times, you know that it will never take more than, say, 5 seconds, you can use a setTimeout.
function cont(outerNodes) {
while (outerNodes.length > 0) {
var node = outerNodes.pop();
getXML(node["label"], node["id"]);
}
// Display a 5 second progress bar here
setTimeout(function(){ draw(nodes, edges); },5000);
}
There's an async call I'm making that queries a database on a service, but the service limits how many records it can return at once, so I need to check the result to see whether it hit that limit and repeat the query until it doesn't.
Synchronous mockup:
var query_results = [];
var limit_hit = true; //while this is true, the query hit the record limit
var start_from = 0; //pagination parameter
while (limit_hit) {
Server.Query(params={start_from : start_from}, callback=function(result){
limit_hit = result.limit_hit;
start_from = result.results.length;
query_results.push(result.results);
});
}
Obviously the above does not work. I've seen other questions here about this issue, but they don't cover what to do when each iteration has to wait for the previous one to finish and you don't know the number of iterations beforehand.
How can I make the above asynchronous? I'm open to answers using promise/deferred-like logic, but preferably something clean.
I can probably think of a monstrous and horrible way of doing this using waits/timeouts, but there has to be a clean, clever and modern way to solve it.
Another way is to make a "pre-query" to know the number of features before hand so you know the number of loops, I'm not sure if this is the correct way.
Here we sometimes use Dojo, but the examples I found don't explain what to do when you have an unknown number of loops: https://www.sitepen.com/blog/2015/06/10/dojo-faq-how-can-i-sequence-asynchronous-operations/
Although there are many answers already, I still believe async/await is the cleanest way.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
and you might need babel
https://babeljs.io/
JS async syntax has evolved from callbacks to promises to async/await, and they all do the same thing. When callbacks nest deeply we need something like a chain, so promises came along; when promises go into a loop, we need something to make the chain flatter and simpler, so async/await came along. Not all browsers support the new syntax, so Babel compiles the new syntax down to the old one, letting you always code in the new style.
getData().then((data) => {
//do something with final data
})
async function getData() {
var query_results = [];
var limit_hit = true;
var start_from = 0;
//when you use await, handle error with try/catch
try {
while (limit_hit) {
const result = await loadPage(start_from)
limit_hit = result.limit_hit;
start_from = result.results.length;
query_results.push(result.results);
}
} catch (e) {
//when loadPage rejects
console.log(e)
return null
}
return query_results
}
async function loadPage(start_from) {
//when you use promise, handle error with reject
return new Promise((resolve, reject) => Server.Query({
start_from
}, (result, err) => {
//error reject
if (err) {
reject(err)
return
}
resolve(result)
}))
}
If you want to use a loop then I think there is no (clean) way to do it without Promises.
A different approach would be the following:
var query_results = [];
var start_from = 0;
function myCallback(result) {
if(!result) {
//first call
Server.Query({ start_from: start_from}, myCallback);
} else {
//repeated call
start_from = result.results.length
query_result.push(result.results);
if(!result.limit_hit) {
//limit has not been hit yet
//repeat the query with new start value
Server.Query({ start_from: start_from}, myCallback);
} else {
//call some callback function here
}
}
}
myCallback(null);
You could call this recursive, but since the Query is asynchronous you shouldn't run into call-stack limits.
Using promises in an ES6 environment you could make use of async/await; I'm not sure whether this is possible with Dojo.
You don't understand callbacks until you have written a rate limiter or queue ;) The trick is to use a counter: increment it before the async request and decrement it when the response arrives; then you know how many requests are "in flight".
If the server is choked you want to put the item back in the queue.
There are many things you need to take into account:
What will happen to the queue if the process is killed?
How long to wait before sending another request?
Make sure the callback is not called many times!
How many times should you retry?
How long to wait before giving up?
Make sure there are no loose ends! (callback is never called)
When all edge cases are taken into account you will have a rather long and not so elegant solution. But you can abstract it into one function! (that returns a Promise or whatever you fancy).
If you have a user interface you also want to show a loading bar and some statistics!
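A minimal sketch of that counter idea, limiting how many requests are in flight at once (all names here are illustrative, and retries, timeouts and the other edge cases listed above are deliberately left out):

```javascript
// Run callback-style tasks with at most `limit` in flight at once.
// Each task is a function taking a single completion callback.
function runLimited(tasks, limit, done) {
  var inFlight = 0, index = 0, results = [];
  function next() {
    // Start tasks while we are under the limit and items remain
    while (inFlight < limit && index < tasks.length) {
      inFlight++; // one more request in flight
      var i = index++;
      tasks[i](function (result) {
        results[i] = result;
        inFlight--; // one response came back
        if (index >= tasks.length && inFlight === 0) {
          done(results); // everything finished
        } else {
          next(); // start the next queued task
        }
      });
    }
  }
  next();
}
```

Wrapping this in a function (or having it return a Promise) is exactly the kind of abstraction the paragraph above recommends.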
You must wait for the server response every time. Here is an encapsulated method:
var query = (function(){
var results = [];
var count = 0;
return function check(fun){
Server.Query({ start_from: count}, function(d){
count = d.results.length;
results.push(d.results);
if (d.limit_hit) check(fun); //limit hit, so fetch the next page
else if (fun) fun(results); //done, hand back all results
});
};
})();
// Call here
query(function(d){
// --> d holds all the results once the limit is no longer hit
});
You can use a generator function (Generators) to achieve this.
For a POC, some basics:
- You define a generator with an asterisk *
- It exposes a next function which returns the next value
- Generators can pause internally with a yield statement and be resumed externally by calling next()
- while (true) ensures the generator is not done until the limit has been reached
function *limitQueries() {
let limit_hit = false;
let start_from = 0;
const query_result = [];
while (true) {
if (limit_hit) {break;}
yield Server.Query(params={start_from : start_from},
callback=function* (result) {
limit_hit = result.limit_hit;
start_from = result.results.length;
yield query_result.push(result.results);
});
}
}
So apparently, the generator function maintains its own state. Each call to next() returns an object with two properties { value, done }, and you can call it like this:
const gen = limitQueries();
let results = [];
let next = gen.next();
while(!next.done) {
next = gen.next();
}
results = next.value;
You might have to touch your Server.Query method to handle generator callback. Hope this helps! Cheers!
I have an array, and I need to send the values of the array to a web service through HTTP POST requests, one by one. In Node.js I use the "async" package for this; for example, async.eachSeries does it well. How can I do the same thing in AngularJS? My current code:
//this code sends all the queries in the array at the same time (maybe 5,000 requests at once,
//which is hard on the web service) and waits for all the responses. It works, but the
//requests should run one by one, each waiting for the previous response, like async.eachSeries!
for (var i = 0; i < myArr.length; i++) {
(function (i) {
var data = {
"myQuery": myArr[i].query
};
$http.post("/myServiceUrl", data).success(function (result) {
console.log(result);
});
})(i);
}
Both Matt Way's and Chris L's answers are correct; you can study Chris's answer to understand how to sequence async functions in for loops.
You can use $q to create a similar requirement by chaining promises together. For example:
var chain = $q.when();
angular.forEach(myArr, function(item){
chain = chain.then(function(){
var data = {
myQuery: item.query
};
return $http.post('/myServiceUrl', data).success(function(result){
console.log(result);
});
});
});
// the final chain object will resolve once all the posts have completed.
chain.then(function(){
console.log('all done!');
});
Essentially you are just running the next promise once the previous one has completed. Emphasis here on the fact that each request will wait until the previous one has completed, as per your question.
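The same chaining idea works with plain Promises outside Angular; here is a hedged sketch where a generic asyncTask placeholder stands in for the $http.post call:

```javascript
// Run asyncTask on each item strictly one after another,
// collecting the results in order
function runSequentially(items, asyncTask) {
  return items.reduce(function (chain, item) {
    return chain.then(function (results) {
      // Only start this item once all previous ones have resolved
      return asyncTask(item).then(function (result) {
        results.push(result);
        return results;
      });
    });
  }, Promise.resolve([]));
}
```

Each .then extends the chain, so the final promise resolves with all results once the last item has completed, just like the $q version above.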
function logResultFromWebService(value)
{
$http.post("/myServiceUrl", value).success(console.log);
}
angular.forEach(myArray, logResultFromWebService);
If I understand your question correctly, you want to run a for loop in a synchronized manner, such that the next iteration only occurs once the previous iteration has completed. For that, you can use a synchronized loop with callbacks, especially if the order matters.
var syncLoop = function (iterations, process, exit) {
var index = 0,
done = false,
shouldExit = false;
var loop = {
next: function () {
if (done) {
if (shouldExit && exit) {
return exit(); // Exit if we're done
}
return; // once the loop has been ended, never run process again
}
// If we're not finished
if (index < iterations) {
index++; // Increment our index
process(loop); // Run our process, pass in the loop
// Otherwise we're done
} else {
done = true; // Make sure we say we're done
if (exit) exit(); // Call the callback on exit
}
},
iteration: function () {
return index - 1; // Return the loop number we're on
},
break: function (end) {
done = true; // End the loop
shouldExit = end; // Passing end as true means we still call the exit callback
}
};
console.log('running first time');
loop.next();
return loop;
}
For your particular implementation:
syncLoop(myArray.length, function (loop) {
var index = loop.iteration();
var data = {
"myQuery": myArray[index].query
};
$http.post("/myServiceUrl", data).success(function (result) {
console.log(result);
loop.next();
});
}, function () {
console.log('done');
});
If you intend on doing something with the data once returned (such as perform calculations) you can do so with this method because you will return the data in a specified order.
I implemented something similar in a statistical calculation web app I built.
EDIT:
To illustrate the problem I had when using $q.when I have set up a fiddle. Hopefully this will help illustrate why I did this the way I did.
https://jsfiddle.net/chrislewispac/6atp3w8o/
Using the following code from Matt's answer:
var chain = $q.when(promise.getResult());
angular.forEach(myArr, function (item) {
chain = chain.then(function () {
$rootScope.status = item;
console.log(item);
});
});
// the final chain object will resolve once all the posts have completed.
chain.then(function () {
console.log('all done!');
});
And this fiddle is an example of my solution:
https://jsfiddle.net/chrislewispac/Lgwteone/3/
Compare the $q version to my version. View the console and imagine those being delivered to the user interface for user intervention in the process and/or performing statistical operations on the sequential returns.
You will see that Matt's answer does not sequentially give the numbers 1, 2, 3, 4, etc. in either the console or the view: it 'batches' the responses and then returns them. So if step 3 should not run depending on the response in step 2, there is no way (at least in the answer provided) to break out or explicitly control the sequence. This is a significant problem when you want to perform sequential calculations and/or let the user control break points.
Now, I am digging through both the $q and Q libraries to see if there is a more elegant solution for this problem. However, my solution works as requested and is very explicit, which lets me put the function in a service and adapt it to particular use cases at will, because I completely understand what it is doing. For me, that is more important than using a library, at least at this stage in my development as a programmer, and I am sure plenty of other people on StackOverflow are at the same stage.
If the order in which they are sent doesn't matter:
var items = [/* your array */];
var promises = [];
angular.forEach(items, function(value, key){
var promise = $http.post("/myServiceUrl", { "myQuery": value.query });
promises.push(promise);
});
return $q.all(promises);
I have an array which is filled asynchronously and contains 28 items. I want to wait until the array is filled with all the items.
function checkIfFinished(){
return(Results.length >= 28);
}
var isfinished = false;
while(!isfinished){
if(checkIfFinished()){
returnResults();
isfinished = true;
}
else
//Wait 100ms
}
Well, but in JavaScript there is no wait function! I tried it with setTimeout, but I don't know where to insert it; I just get errors about too much recursion and the like. :D
Thank you!
Try:
var timeout = setInterval(function() {
if(checkIfFinished()) {
clearInterval(timeout);
isFinished = true;
}
}, 100);
This will call your check function every 100 ms until checkIfFinished() returns true.
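The same polling idea can be wrapped in a Promise so callers can simply .then (or await) the condition; the name waitFor is purely illustrative:

```javascript
// Resolve once condition() returns true, checking every pollMs milliseconds
function waitFor(condition, pollMs) {
  return new Promise(function (resolve) {
    var timer = setInterval(function () {
      if (condition()) {
        clearInterval(timer); // stop polling once satisfied
        resolve();
      }
    }, pollMs);
  });
}
```

With this helper, `waitFor(checkIfFinished, 100).then(returnResults)` replaces the manual isFinished flag entirely.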
If you're using jQuery 1.5+, this sounds like a perfect opportunity to use deferred objects and promises in your code. I'm assuming that you're using AJAX calls to populate your array.
In a nutshell, something like this should work for you:
$(function() {
var $ajaxcalls = [],
myArray = [];
// set up all the ajax calls that will populate my array
for(var i=0; i < 28; i++) {
$ajaxcalls[i] = $.ajax({
url : 'http://your.domain.com/blah',
data : i
}).success(function(m) {
myArray.push(m);
});
}
// this will setup the promise ---
// what will run when all 28 AJAX calls complete?
$.when.apply(null, $ajaxcalls).then(function() {
returnResults();
});
});
I've written about this some time back as well. I really think it's a nifty concept that can be powerful when used correctly. JavaScript timers and schedulers would work too, but they can be unwieldy and may add needless wait time before the completion logic fires.
I have a LoadingStatus function with two options, SHOW and HIDE.
SHOW is triggered when the jQuery POST is made; HIDE happens after the response comes back.
The issue I'm having is that sometimes this happens so fast that it makes for a bad experience. I thought about inserting a JavaScript pause, but if the POST takes a while to respond, it would take even longer because of the pause.
How can I have my SHOW HIDE function work together, to make sure at minimum the SHOW was displayed to the user for at least 1/2 second?
function saveBanner (action) {
if (action == 'show') {
// Display the AJAX Status MSG
$("#ajaxstatus").css("display","block");
$("#msg").text('Saving...');
}
else if (action == 'hide') {
$("#ajaxstatus").css("display","none");
$("#msg").text('');
}
};
Thanks
In your ajax success callback, you can put the hide command in a setTimeout() for 1500 milliseconds:
success: function(results) {
setTimeout(function(){
saveBanner("hide");
}, 1500);
}
Of course that would merely add 1.5 seconds onto however long the process itself took. Another solution is to record the time the process started with the Date object. Then, when the callback fires, record that time and find the difference. If it's less than a second and a half, set the timeout for the remainder.
/* untested */
var start = new Date();
success: function(results) {
var stop = new Date();
var elapsed = stop.getTime() - start.getTime();
var remaining = (elapsed < 1500) ? 1500 - elapsed : 0;
setTimeout(function(){
saveBanner("hide");
}, remaining);
}
You can perform this math either inside your callback or within the saveBanner() function itself: in the show portion you set the starting time, and in the hide portion you compute the remainder and set the setTimeout().
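The timing math above can be isolated into a tiny helper (the name remainingDelay is illustrative; minMs is the minimum display time, all values in milliseconds):

```javascript
// How much longer the banner must stay visible before hiding it
function remainingDelay(shownAt, now, minMs) {
  var elapsed = now - shownAt;
  return elapsed < minMs ? minMs - elapsed : 0;
}
```

The success callback would then just do setTimeout(hide, remainingDelay(start, Date.now(), 1500)).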
You can use setTimeout/clearTimeout to only show the status when the response takes longer than a set amount of time to load.
Edit:
Some untested code:
var t_id = 0;
function on_request_start()
{
t_id = setTimeout(show_message, 1000);
}
function on_request_completed()
{
clearTimeout(t_id);
hide_message();
}
The jQuery handlers should look something like the above. The message will not be shown at all if the reply arrives in less than a second.
var shownTime;
function saveBanner (action) {
if (action == 'show') {
// Display the AJAX Status MSG
$("#ajaxstatus").css("display","block");
$("#msg").text('Saving...');
shownTime = new Date().getTime();
}
else if (action == 'hide') {
var hideIt = function() {
$("#ajaxstatus").css("display","none");
$("#msg").text('');
};
var timeRemaining = 1500 - (new Date().getTime() - shownTime);
if (timeRemaining > 0) {
setTimeout(hideIt, timeRemaining);
} else {
hideIt();
}
}
};
As of jQuery 1.5, you are able to extend the $.ajax functionality by using prefilters. I wanted a similar experience where a message is shown for a minimum amount of time when an ajax call is made.
By using prefilters, I can now add a property to the ajax call named "delaySuccess" and pass it a time in milliseconds. The time passed in is the minimum amount of time the ajax call will wait before calling the success function. For instance, if you passed in 3000 (3 seconds) and the actual ajax call took 1.3 seconds, the success function would be delayed 1.7 seconds. If the original ajax call lasted more than 3 seconds, the success function would be called immediately.
Here is how I achieved that with an ajax prefilter.
$.ajaxPrefilter(function (options, originalOptions, jqXHR) {
if (originalOptions.delaySuccess && $.isFunction(originalOptions.success)) {
var start, stop;
options.beforeSend = function () {
start = new Date().getTime();
if ($.isFunction(originalOptions.beforeSend))
originalOptions.beforeSend();
};
options.success = function (response) {
var that = this, args = arguments;
stop = new Date().getTime();
function applySuccess() {
originalOptions.success.apply(that, args);
}
var difference = originalOptions.delaySuccess - (stop - start);
if (difference > 0)
setTimeout(applySuccess, difference);
else
applySuccess();
};
}
});
I first check whether the delaySuccess and success options are set. If they are, I override the beforeSend callback in order to set the start variable to the current time. I then override the success function to grab the time after the ajax call has finished and subtract the elapsed time from the original delaySuccess value. Finally, a timeout is set for the computed time, which then calls the original success function.
I found this to be a nice way to achieve this effect and it can easily be used multiple times throughout a site.