I have a page with many HTML forms, and I need to go through all of them, submit each one, wait for the response, and then grab the data.
The natural thing to do is to write a loop like this:
for (var i = 0; i < win.document.forms.length; i++)
{
// Submit form i and wait for the response
}
But then the problem dawns on you: how do you wait for the response inside the loop? You can check whether the data is available, but what if it isn't? How do you kill time inside the loop? There is no sleep function in JS, right?
Spinning in a loop and checking for the data constantly would suck up all the system resources, so you can't do that either. My current thinking is that you need to exit the loop and return, but schedule the containing function for execution later on. When the function runs at some later time, you must re-enter the loop at the point where you left off and once again check whether the data is available.
It's awkward to say the least. Am I missing something? Is there a better solution?
You can just use asynchronous AJAX calls with callbacks, so you are not blocking the thread.
$("form").each(function (index, element) {
var form = $(element);
$.ajax({
url: form.attr('action'),
type: 'POST',
data: form.serialize(),
success: function (result) {
// ... Process the result ...
}
});
});
If you want to use a for loop instead, you will also need to create an outer function, because the loop variable is not scoped per iteration; otherwise you can end up submitting the same form multiple times. A sketch of that pattern follows.
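A minimal sketch of that outer-function pattern, assuming the same win.document.forms collection from the question:

var forms = win.document.forms;
for (var i = 0; i < forms.length; i++) {
    (function (form) {
        // "form" is fixed per iteration, so each callback sees its own form
        $.ajax({
            url: form.action,
            type: 'POST',
            data: $(form).serialize(),
            success: function (result) {
                // ... Process the result for this particular form ...
            }
        });
    })(forms[i]);
}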
Related
I've got a tool in place which is splitting a large query into manageable chunks, then using a simple AJAX method to spit this out. The destination for the AJAX form is just a script which delegates some form data to a function, including which 'chunk' to process.
<script>
var passes = Math.ceil($max / $offset);
for (i = 0; i < passes; i++)
{
    $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form" . $i . "').serialize(),
        success: function(data){
            $('#update" . $i . "').append(data);
        }
    });
}
</script>
As this can iterate a few times, I was looking to execute some code once the looping (i.e. the function itself) has finished.
As this isn't anything too snazzy, I thought it would be a simple case of adding if(i == passes - 1) { alert('test'); } to the end of the loop, like this:
for (i = 0; i < passes; i++) {
    $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form" . $i . "').serialize(),
        success: function(data){
            $('#update" . $i . "').append(data);
        }
    });
    if (i == passes - 1) { alert('test'); }
}
...but it fires as soon as the page loads, before the loop's requests have completed.
Likewise, adding a simple function call after the loop achieves the same result.
I would have thought (though I'm quite fresh at JS) that it would complete one iteration before attempting the next, but that's not what happens: the page acts as if all of the requests are sent instantly, completing the loop and executing the code after it, then letting the functions within success echo back in their own time. This seems even more evident in that it sometimes appends the results of the second iteration before the first.
Questions...
1) Have I made an error in how I've constructed the loop, considering the purpose?
2) Why does the code placed after the loop seem to execute while the loop itself is still processing?
What I'm trying to achieve
Each loop should perform a MySQL query, return the function's HTML output, then print it before moving on to the next. It does this 99% correctly, just with the occasional problem of not always appending in order.
After all loops have completed and appended to the container, I would like to run some code to confirm that the operation is complete.
Many thanks in advance; I hope this is clear enough.
This is a "Promise" based solution to your problem.
First, decompose each pass into a function that does one unit of work:
function makePass(i) {
    return $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form' + i).serialize()
    }).then(function(data) {
        $('#update' + i).append(data);
    });
}
Then you can make a general purpose function that pseudo-recursively makes the desired number of calls to the supplied function, calling it each time with the current pass number, and finally returning a new "resolved" promise once every pass has been completed:
function makeNPasses(f, n) {
    var i = 0;
    return (function loop() {
        if (i < n) {
            return f(i++).then(loop);
        } else {
            return Promise.resolve();
        }
    })();
}
You can then register a handler to be invoked once everything is done:
var passes = Math.ceil($max / $offset);

makeNPasses(makePass, passes).then(function() {
    console.log("All done!");
});
JavaScript has an asynchronous flow: it doesn't wait for the request above to bring back data from somewhere; it just keeps firing statements in a row.
To escape this, there are three options.
The ideal approach is to make one single HTTP request to the server and get all the data back as a single JSON array. This is more efficient, simpler, and follows best practice; a sketch follows below.
Make the AJAX calls asynchronously with callbacks. There is good information about this in jQuery ajax success anonymous function scope. Again, callbacks are recommended rather than setting async: false; .when() or .then() make them easier to manage, because ultimately they too are callbacks.
A recursive function can also help with this kind of task. It is a dirty approach: deep recursion can run into the engine's call-stack limit (ES6 specifies tail-call optimization, but ES5 engines do not), it increases overhead, and it can bottleneck your page load.
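For the first option, a rough sketch; the do_all.php endpoint and the shape of its JSON response are assumptions for illustration:

// one request fetches every chunk; the server is assumed to return
// a JSON array with one HTML string per chunk
$.ajax({
    type: 'POST',
    url: 'do_all.php',
    dataType: 'json',
    success: function (results) {
        $.each(results, function (i, html) {
            $('#update' + i).append(html);
        });
        alert('test'); // everything has arrived at this point
    }
});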
AJAX calls are asynchronous, so their success handlers cannot run before the loop ends (JavaScript is non-blocking). What you actually want is to perform an action after all the AJAX calls complete, not just after the loop (which is synchronous), so I would suggest either using a Promise, chaining the calls, or aggregating the success events as below:
var passes = Math.ceil($max / $offset);

// "let" gives each iteration its own copy of i, so each success
// callback appends to the matching container
for (let i = 0; i < passes; i++) {
    $.ajax({
        type: 'POST',
        url: 'do.php?p=' + i,
        data: $('#form' + i).serialize(),
        success: function(data){
            $('#update' + i).append(data);
            resolve();
        },
        error: reject
    });
}

var resolved = 0;

function resolve() {
    if (++resolved >= passes) {
        alert('test');
    }
}

function reject() {
    alert('One of the AJAX requests failed');
}
function reject() {
alert('One of AJAX requests failed');
}
I'm not sure if this will actually be possible, since load() is an asynchronous method, but I need some way to load several little bits of pages, one at a time, get some data included in them via JavaScript, and then send that over via AJAX so I can put it in a database I made.
Basically I get this from my page, where all the links I'll have to iterate through are located:
var digiList = $('.2u');
var link;
for(var i=0;i<digiList.length;i++){
link = "http://www.digimon-heroes.com" + $(digiList).eq(i).find('map').children().attr('href');
So far so good.
Now I'm going to have to load each link (only a specific div of the full page, not the whole thing) into a div I have somewhere on my page, so that I can get some data via jQuery:
var contentURI = link + ' div.row:nth-child(2)';

$('#single').load('grabber.php?url=' + contentURI, function(){
    ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
    ///////////// Aaaand then I call up an ajax request.
    $.ajax({
        url: 'insertDigi.php',
        type: 'POST',
        data: {digimon: JSON.stringify(digimon)},
        dataType: 'json',
        success: function(msg){
            console.log(msg);
        }
        //////// This calls up a script that handles everything and makes an insert into my database.
    }); //END ajax
}); //END load callback Function
} //END 'for' Statement.
alert('Inserted!');
Naturally, as you'd expect, the loading takes too long, and the rest of the for statement just keeps going without letting the load finish its business, since load is asynchronous. The alert('Inserted!') fires before I even get the chance to load the very first page. This, in turn, means I only get as far as loading the content into my div, without ever processing its information and sending it over to my script.
So my question is: is there some creative way to do this in such a manner that I can iterate through multiple links, load them, do my business with them, and be done with it? And if not, is there a synchronous alternative to load that would produce roughly the same effect? I know it would probably block up my page completely, but I'd be fine with that, since the page doesn't require any input from me.
Hopefully I explained everything with the necessary detail, and hopefully you guys can help me out with this. Thanks!
You probably want a recursive function that waits for one iteration to finish before going on to the next:
(function recursive(i) {
    var digiList = $('.2u');
    var link = digiList.eq(i).find('map').children().attr('href') + ' div.row:nth-child(2)';
    $.ajax({
        url: 'grabber.php',
        data: {
            url: link
        }
    }).done(function(data) {
        // do stuff with "data"
        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: {
                digimon: digimon
            },
            dataType: 'json'
        }).done(function(msg) {
            console.log(msg);
            // increment first, then check, so we never run past the last element
            if (++i < digiList.length) {
                recursive(i); // do the next one ... when this one is done
            }
        });
    });
})(0);
Just in case you want them to run together, you can use a closure to preserve each number in the loop:
for (var i = 0; i < digiList.length; i++) {
    (function(num) { // num here as the argument is actually i
        var link = "http://www.digimon-heroes.com" + $(digiList).eq(num).find('map').children().attr('href');
        var contentURI = link + ' div.row:nth-child(2)';
        $('#single').load('grabber.php?url=' + contentURI, function() {
            ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
            ///////////// Aaaand then I call up an ajax request.
            $.ajax({
                url: 'insertDigi.php',
                type: 'POST',
                data: {
                    digimon: JSON.stringify(digimon)
                },
                dataType: 'json',
                success: function(msg) {
                    console.log(msg);
                }
                //////// This calls up a script that handles everything and makes an insert into my database.
            }); //END ajax
        }); //END load callback Function
    })(i); // <-- pass in the number from the loop
}
You can always use synchronous ajax, but there's no good reason for it.
If you know the number of documents you need to download (you can count them, or just hardcode it if it's constant), you can run a callback on each success and, once everything is done, proceed with the logic that needs all the documents.
To make it even cleaner, you can trigger an event (on document or any other object) when everything has downloaded (e.g. "downloads_done") and listen for that event to run what you need; a sketch follows.
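For example (a sketch; the count and the downloads_done event name are assumptions based on the question):

var remaining = digiList.length;

$(document).on('downloads_done', function () {
    alert('Inserted!'); // only runs once every page has been processed
});

// call this at the end of each load/ajax success handler
function onePageDone() {
    if (--remaining === 0) {
        $(document).trigger('downloads_done');
    }
}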
But all of the above is for the case where you need to do something once everything is done. However, I'm not sure I understood your question correctly (I just read it again).
If you want to download something -> do something with data -> download another thing -> do something again...
Then you can also use a JavaScript waterfall (a library, or build your own) to keep it simple and easy to use. With a waterfall you define what should happen when each async function is done, one by one.
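A hand-rolled waterfall can be very small: each step receives a done callback, and the next step only starts when the previous one calls it. A sketch (the two steps shown are placeholders):

function waterfall(steps) {
    (function next(i) {
        if (i < steps.length) {
            steps[i](function () { next(i + 1); });
        }
    })(0);
}

// usage: pass "done" as the ajax success callback so the next step
// starts only after the response arrives
waterfall([
    function (done) { $.get('grabber.php', done); },
    function (done) { $.post('insertDigi.php', done); }
]);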
Here's my issue. I have a JS function that performs an $.ajax call to fetch some data from a server. When it gets that data back, it needs to hand control back to the browser in order to show an update to a div.
The JS function is itself called within a for loop, and I need to ensure that the loop does not advance until the function has updated the div and the browser has displayed that update; at that point the loop advances and the function (with its AJAX call) is called again, continuing until the loop's test causes it to end.
I've tried many different approaches - callbacks, promises, etc. - but so far I can't ensure that the loop waits until the function has received its server data, updated the div, let the browser display that update, and fully completed.
Here's a simple stripped-down version of the function:
function myFunction(email) {
    var request = $.ajax({
        url: 'getit.php',
        cache: false,
        async: false,
        method: "post",
        timeout: 1000,
        data: "requesttype=getemailname&email=" + encodeURIComponent(email)
    });
    request.done(function(response) {
        $("#myDiv").html(response);
    });
}
and here's part of the JS that calls it:
.....
var emailscount = emails.length;
for (var i = 0; i < emailscount; i++) {
    myFunction(emails[i]);
}
.....
So, my issues are:
1) myFunction must allow the browser to display the updated div HTML - I'm not sure how to achieve that.
2) The for loop should only proceed when myFunction has received the data back from the server, updated the div HTML, AND allowed the browser to display that div.
At the moment I have the $.ajax call's async flag set to false to stop execution until the data comes back, but how do I ensure that the browser displays the new div content, and that the for loop does not call myFunction again until the previous call has fully completed?
Any help you can give me would be very welcome, as right now I can't get this all to work!
Sounds like you need a recursive function, not a for loop with synchronous AJAX calls:
(function myFunction(i) {
    $.ajax({
        url: 'getit.php',
        method: "post",
        timeout: 1000,
        data: {
            requesttype: 'getemailname',
            email: emails[i]
        }
    }).done(function(response) {
        $("#myDiv").html(response);
        if (emails[++i]) myFunction(i); // continue when this one is done
    });
})(0);
Thanks for everyone's help! I'm making good progress (including taking care of jQuery deprecations!) but have run into a further problem. As I need to hand control back to the browser to show the refreshed div as I recurse, I'm calling setTimeout as follows:
var nextBitOfWork = function () {
    return myFunction(email);
};

setTimeout(nextBitOfWork, 0);
where myFunction (which recurses) now returns a promise when it's done with its $.ajax call.
If I simply call:
return myFunction(email);
without the setTimeout construct above, the promise is passed through, all my promises are captured, I can build the array output I need, and everything works great - but I don't get the browser refresh. With the setTimeout as above I do get the div refresh displaying, but I seem to lose the promise, so the script continues and I never fill the array I use to capture values as I recurse.
Any thoughts on how to make sure the setTimeout passes on the promise?
Thanks
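One pattern that should keep the chain intact is to wrap the timeout itself in a promise and return it, so the caller waits on the promise instead of on the bare setTimeout. A sketch, reusing myFunction and email from above:

function yieldToBrowser() {
    // resolves on a later tick, after the browser has had a chance to repaint
    return new Promise(function (resolve) {
        setTimeout(resolve, 0);
    });
}

var nextBitOfWork = function () {
    // the promise from myFunction is now returned rather than lost
    return yieldToBrowser().then(function () {
        return myFunction(email);
    });
};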
I frequently need to load a function on a webpage after between two and five modest-sized data files have loaded. Let's say a maximum of 3MB of data split over a maximum of five files.
I try to optimize the load time by making all the AJAX calls at the same time and loading an initialize() function after they have all loaded, like so:
var data1, data2;

$(document).ajaxStop(function() {
    $(this).unbind("ajaxStop"); //prevent running again when other calls finish
    initialize();
});

$.ajax({
    url: url_to_data_1,
    success: function (d) {
        data1 = d;
    }
});

$.ajax({
    url: url_to_data_2,
    success: function (d) {
        data2 = d;
    }
});

function initialize() { /* do something with data1 and data2 */ }
But I'm tired of pasting this in every time, so I want a function like this:
function multi_Ajax(urls, callback) {
    var data = {};
    //call init() after both the list of prayers and the word concordance index load
    $(document).ajaxStop(function() {
        $(this).unbind("ajaxStop"); //prevent running again when other calls finish
        callback(data);
    });
    for (var c = 0; c < urls.length; c += 1) {
        //data files
        $.ajax({
            url: urls[c],
            dataType: "json",
            success: function (d) {
                data[urls[c]] = d;
                console.log("Loaded " + urls[c]);
            }
        });
    }
}
This does not work, of course, since the ajax calls do not exist for ajaxStop to catch. But I do not understand how ajaxStop works well enough to get much further. Thank you!
I'm not sure why your second attempt wouldn't work. According to the documentation, whenever an AJAX request completes, jQuery checks whether any other requests are still outstanding and fires ajaxStop if there are none. Calling $.ajax in a loop shouldn't be any different from hardcoding each call, as far as I can tell.
However, as Barmar suggested in the comments, $.when seems like a cleaner way to do what you want to do. Here's an example of how to pass an array (which you can populate in a loop) to $.when: Pass in an array of Deferreds to $.when()
Using $.when seems cleaner than $.ajaxStop because if someone later comes along and adds an unrelated ajax request somewhere before or after your loop, that would interfere with when ajaxStop triggers. $.when allows you to explicitly say which ajax requests you want to wait for.
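The wrinkle is that $.when expects each deferred as a separate argument rather than an array, so the usual trick from that linked answer is Function.prototype.apply. A sketch using the urls and callback from the question:

function multi_Ajax(urls, callback) {
    var deferreds = $.map(urls, function (url) {
        return $.ajax({ url: url, dataType: "json" });
    });
    $.when.apply($, deferreds).then(function () {
        // one [data, statusText, jqXHR] argument per request, in order
        // (with a single request, the arguments are not nested this way)
        var data = {};
        $.each(arguments, function (i, args) {
            data[urls[i]] = args[0];
        });
        callback(data);
    });
}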
EDIT: Here's a fiddle showing ajaxStop working for multiple calls issued in a loop: http://jsfiddle.net/zYk5W/
It looks like the reason this wasn't working for you has nothing to do with .ajax or .ajaxStop; instead it's a scope issue in your success callback. Your success callback closes over c, but this is the same c that the outer (loop) scope uses. By the time any of the success callbacks runs, the for loop has completed and c has been incremented to urls.length. Thus every time your success callback runs, urls[c] is undefined. See the fiddle I linked or JavaScript closure inside loops – simple practical example for an example of how to give each success callback its own c in a scope separate from the loop's c.
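A sketch of that fix, using an immediately-invoked function to give each success callback its own copy of the URL:

for (var c = 0; c < urls.length; c += 1) {
    (function (url) {
        $.ajax({
            url: url,
            dataType: "json",
            success: function (d) {
                data[url] = d;
                console.log("Loaded " + url);
            }
        });
    })(urls[c]);
}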
I'm creating a script that performs several functions and I want to update the user as the functions are completed. I have nested $.ajax() calls with each subsequent call in the previous call's success block.
There are a total of 4 calls made for each loop. Let's call them scan_1 through scan_4. The success block of scan_1 calls scan_2 and so on down the chain.
For example, let's say I'm looping over 3 objects. I want the process to go like this:
Loop 1
    scan_1
    scan_2
    scan_3
    scan_4
Loop 2
    scan_1
    scan_2
    scan_3
    scan_4
Loop 3
    scan_1
    scan_2
    scan_3
    scan_4
The problem is that it's running through all the scan_1 calls first. I must be missing something, but I can't seem to figure it out. Any advice would be much appreciated.
For reference, here is a snippet of scan_1 (irrelevant stuff snipped):
for (var i = 1; i <= 3; i++)
{
    $.ajax({
        type: 'GET',
        url: url,
        data: 'do=scan&step=1&' + string,
        dataType: 'json',
        success: function (result)
        {
            if (result.proceed == 'true')
            {
                $('#scan_progress').append(result.message);
                scan_2();
            }
            else
            {
                $('#scan_progress').append(result.message);
            }
        }
    });
}
Thoughts?
Thanks in advance.
Sounds like you need to use jQuery deferred. It basically allows you to chain multiple event handlers to the jQuery Ajax object and gives you finer control over when the callbacks are invoked.
Further reading:
http://msdn.microsoft.com/en-us/scriptjunkie/gg723713
http://www.erichynds.com/jquery/using-deferreds-in-jquery/
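For instance, a minimal sketch of running the scans strictly in sequence with deferreds, assuming jQuery 1.8+ (where .then returns a new promise) and the url and string variables from the question:

function scanStep(step) {
    return $.ajax({
        type: 'GET',
        url: url,
        data: 'do=scan&step=' + step + '&' + string,
        dataType: 'json'
    });
}

scanStep(1)
    .then(function () { return scanStep(2); })
    .then(function () { return scanStep(3); })
    .then(function () { return scanStep(4); })
    .done(function () {
        $('#scan_progress').append('All scans complete');
    });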
It's asynchronous - the "success" fires sometime in the future. The script does not wait for it to respond. Since you're firing off three requests in your loop, they will all be "scan1".
"scan_2" will be called as each request completes.
Change the request to synchronous if you want to control the order of events.
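For completeness, that means setting async: false on the call; a sketch, with the caveat that it blocks the UI while each request is in flight:

$.ajax({
    type: 'GET',
    url: url,
    data: 'do=scan&step=1&' + string,
    dataType: 'json',
    async: false, // blocks here until the response arrives, so the loop runs in order
    success: function (result) {
        // ... same handling as before ...
    }
});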
You are starting by sending off three ajax calls at once.
Scan1 (loop 1)
Scan1 (loop 2)
Scan1 (loop 3)
When each Scan 1 completes, its subsequent Scan 2, and then Scan 3, are called.
What did you actually want to happen? Scans 1, 2 and 3 of loop 1, then 1, 2 and 3 of loop 2, and then 1, 2 and 3 of loop 3? That would require more nesting, or possibly deferred objects.
Instead of using the success callback for each $.ajax() call, you can store each set of AJAX requests (their jqXHR objects) in an array and wait for all of them to resolve:
function scan_1() {
    // set up an array to store the jqXHR (deferred) objects
    var jqXHRs = [];
    for (var i = 1; i <= 3; i++) {
        // $.ajax() returns an object that will resolve when the response is returned
        jqXHRs.push($.ajax({
            type: 'GET',
            url: url,
            data: 'do=scan&step=1&' + string,
            dataType: 'json'
        }));
    }
    // wait for all three AJAX requests to resolve before running scan_2();
    // $.when takes separate arguments rather than an array, hence .apply
    $.when.apply($, jqXHRs).then(function () {
        // each argument is a [data, statusText, jqXHR] array, one per request
        var allProceed = $.makeArray(arguments).every(function (args) {
            return args[0].proceed == 'true';
        });
        if (allProceed) {
            scan_2();
        }
    });
}
I've had similar problems working heavily with SharePoint web services - you often need to pull data from multiple sources to generate input for a single process.
To solve it I embedded this kind of functionality into my AJAX abstraction library. You can easily define a request which will trigger a set of handlers when complete. However each request can be defined with multiple http calls. Here's the component (and detailed documentation):
DPAJAX at DepressedPress.com
This simple example creates one request with three calls and then passes that information, in the call order, to a single handler:
// The handler function
function AddUp(Nums) { alert(Nums[1] + Nums[2] + Nums[3]) };
// Create the pool
myPool = DP_AJAX.createPool();
// Create the request
myRequest = DP_AJAX.createRequest(AddUp);
// Add the calls to the request
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [5,10]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [4,6]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [7,13]);
// Add the request to the pool
myPool.addRequest(myRequest);
Note that, unlike many of the other solutions provided, this method does not force single-threading of the calls being made - each will still run as quickly (or as slowly) as the environment allows, but the single handler will only be called when all are complete. It also supports the setting of timeout values and retry attempts if your service is a little flaky.
I've found it insanely useful (and incredibly simple to understand from a code perspective). No more chaining, no more counting calls and saving output. Just "set it and forget it".