I frequently need to load a function on a webpage after between two and five modest-sized data files have loaded. Let's say a maximum of 3MB of data split over a maximum of five files.
I try to optimize the load time by firing all the AJAX calls at the same time and calling an initialize() function after they have all completed, like so:
var data1, data2;

$(document).ajaxStop(function() {
    $(this).unbind("ajaxStop"); // prevent running again when other calls finish
    initialize();
});

$.ajax({
    url: url_to_data_1,
    success: function (d) {
        data1 = d;
    }
});

$.ajax({
    url: url_to_data_2,
    success: function (d) {
        data2 = d;
    }
});

function initialize() { /* do something with data1 and data2 */ }
But I'm tired of pasting this in every time, so I want a function like this:
function multi_Ajax(urls, callback) {
    var data = {};

    // call init() after both the list of prayers and the word concordance index load
    $(document).ajaxStop(function() {
        $(this).unbind("ajaxStop"); // prevent running again when other calls finish
        callback(data);
    });

    // data files
    for (var c = 0; c < urls.length; c += 1) {
        $.ajax({
            url: urls[c],
            dataType: "json",
            success: function (d) {
                data[urls[c]] = d;
                console.log("Loaded " + urls[c]);
            }
        });
    }
}
This does not work, of course, since the ajax calls do not exist for ajaxStop to catch. But I do not understand how ajaxStop works well enough to get much further. Thank you!
I'm not sure why your second attempt wouldn't work. According to the documentation, whenever an AJAX request completes, jQuery checks whether there are any other requests still outstanding and fires ajaxStop if there are none. Calling $.ajax in a loop shouldn't be any different from hardcoding each call, as far as I can tell.
However, as Barmar suggested in the comments, $.when seems like a cleaner way to do what you want to do. Here's an example of how to pass an array (which you can populate in a loop) to $.when: Pass in an array of Deferreds to $.when()
Using $.when seems cleaner than $.ajaxStop because if someone later comes along and adds an unrelated ajax request somewhere before or after your loop, that would interfere with when ajaxStop triggers. $.when allows you to explicitly say which ajax requests you want to wait for.
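For instance, a version of multi_Ajax along those lines might look like this. This is just a sketch, assuming jQuery 1.5+ so that $.ajax returns a Deferred-compatible jqXHR; closing over $.map's url parameter also sidesteps the loop-variable problem discussed below:

function multi_Ajax(urls, callback) {
    var data = {};

    // One request per URL; each one records its result under its own URL
    var requests = $.map(urls, function (url) {
        return $.ajax({ url: url, dataType: "json" }).done(function (d) {
            data[url] = d;
        });
    });

    // $.when takes individual arguments, so spread the array with .apply;
    // the .done callback fires once, after every request has succeeded
    $.when.apply($, requests).done(function () {
        callback(data);
    });
}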
EDIT: Here's a fiddle showing ajaxStop working for multiple calls issued in a loop: http://jsfiddle.net/zYk5W/
It looks like the reason this wasn't working for you has nothing to do with .ajax or .ajaxStop; instead it's a scope issue in your success callback. Your success callback closes over c, but this is the same c that the outer (loop) scope uses. By the time any of the success callbacks runs, the for loop has completed and c has been incremented to urls.length. Thus every time your success callback runs, urls[c] is undefined. See the fiddle I linked or JavaScript closure inside loops – simple practical example for an example of how to give each success callback its own c in a scope separate from the loop's c.
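For reference, here is a minimal sketch of that fix applied to your loop: an immediately-invoked function gives each success callback its own copy of the URL.

for (var c = 0; c < urls.length; c += 1) {
    (function (url) { // "url" is this iteration's private copy of urls[c]
        $.ajax({
            url: url,
            dataType: "json",
            success: function (d) {
                data[url] = d;
                console.log("Loaded " + url);
            }
        });
    })(urls[c]);
}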
Related
I have a database with various links, and I want to fetch those links and put them inside an array.
I tried with the following code:
var amz = new Array();

function CreaArrayAmazon() {
    $.ajax({
        url: "php/amazon_affiliate.php",
        success: function(data) {
            var leanamazon = JSON.parse(data);
            for (i = 0; i < leanamazon.length; i++) {
                amz[i] = leanamazon[i].Link;
            }
        }
    });
}
I expect to find all the links in the amz array because it is a global variable, but instead the links are only there inside the AJAX callback.
If I insert an alert inside the AJAX function (e.g. alert(amz[i])) I can see the data correctly, but if I put the alert outside it I can't see anything; in fact, the amz array turns out to be empty.
Can someone tell me how to get that data out of there?
You might be misunderstanding what is going on here.
AJAX stands for Asynchronous JavaScript and XML. Asynchronous means that your code doesn't always run in order.
In this case, your program functions like so:
function CreaArrayAmazon() {
    // Step 1: Make the call
    $.ajax({
        url: "php/amazon_affiliate.php",
        success: function(data) {
            // Step 3: When the call succeeds, execute the rest of this inner function.
            var leanamazon = JSON.parse(data);
            for (i = 0; i < leanamazon.length; i++) {
                amz[i] = leanamazon[i].Link;
            }
        }
    });
    // Step 2: Continue processing....
}
Step 2 happens far before Step 3. By the time your AJAX call finishes, JavaScript has already finished executing your CreaArrayAmazon call.
Instead, you need to have your inner function (Step 3) call an outside function to react to the new data you've received.
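As a sketch of that approach (the onReady callback name is just for illustration), you can pass a function into CreaArrayAmazon and call it once the array has been filled:

var amz = new Array();

function CreaArrayAmazon(onReady) {
    $.ajax({
        url: "php/amazon_affiliate.php",
        success: function (data) {
            var leanamazon = JSON.parse(data);
            for (var i = 0; i < leanamazon.length; i++) {
                amz[i] = leanamazon[i].Link;
            }
            onReady(amz); // hand the filled array onward, *after* it exists
        }
    });
}

// Usage: anything that needs amz goes inside the callback
CreaArrayAmazon(function (links) {
    alert(links[0]); // runs only once the data has arrived
});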
I'm not sure if this will actually be possible, since load() is an asynchronous method, but I need some way to load several little bits of pages, one at a time, get some data included in them via JavaScript, and then send that data over via AJAX so I can put it in a database I made.
Basically I get this from my page, where all the links I'll be having to iterate through are located:
var digiList = $('.2u');
var link;

for (var i = 0; i < digiList.length; i++) {
    link = "http://www.digimon-heroes.com" + $(digiList).eq(i).find('map').children().attr('href');
So far so good.
Now, I'm going to have to load each link (only a specific div of the full page, not the whole thing) into a div I have somewhere on my page, so that I can get some data via jQuery:
    var contentURI = link + ' div.row:nth-child(2)';

    $('#single').load('grabber.php?url=' + contentURI, function() {
        ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
        ///////////// Aaaand then I call up an AJAX request.
        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: { digimon: JSON.stringify(digimon) },
            dataType: 'json',
            success: function(msg) {
                console.log(msg);
            }
            //////// This calls up a script that handles everything and makes an insert into my database.
        }); // END ajax
    }); // END load callback function
} // END 'for' statement

alert('Inserted!');
Naturally, as would be expected, the loading takes too long, and the rest of the for statement just keeps going, not waiting for the load to finish its business, since the load is asynchronous. The alert('Inserted!') is called before I even get the chance to load the very first page. This, in turn, means the content only gets loaded into my div; I never get the chance to process its information and send it over to my script.
So my question is: Is there some creative way to do this such that I could iterate through multiple links, load them, do my business with them, and be done with it? And if not, is there a synchronous alternative to load that could produce roughly the same effect? I know it would probably block up my page completely, but I'd be fine with that, since the page does not require any input from me.
Hopefully I explained everything with the necessary detail, and hopefully you guys can help me out with this. Thanks!
You probably want a recursive function that waits for one iteration to finish before going on to the next, etc.
(function recursive(i) {
    var digiList = $('.2u');
    var link = "http://www.digimon-heroes.com" + digiList.eq(i).find('map').children().attr('href') + ' div.row:nth-child(2)';

    $.ajax({
        url: 'grabber.php',
        data: {
            url: link
        }
    }).done(function(data) {
        // do stuff with "data"
        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: {
                digimon: digimon
            },
            dataType: 'json'
        }).done(function(msg) {
            console.log(msg);
            if (++i < digiList.length) { // increment first, so we stop at the last element
                recursive(i); // do the next one ... when this one is done
            }
        });
    });
})(0);
Just in case you want them all to run together, you can use a closure to preserve each number in the loop:
for (var i = 0; i < digiList.length; i++) {
    (function(num) { // num here as the argument is actually i
        var link = "http://www.digimon-heroes.com" + $(digiList).eq(num).find('map').children().attr('href');
        var contentURI = link + ' div.row:nth-child(2)';

        $('#single').load('grabber.php?url=' + contentURI, function() {
            ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
            ///////////// Aaaand then I call up an AJAX request.
            $.ajax({
                url: 'insertDigi.php',
                type: 'POST',
                data: {
                    digimon: JSON.stringify(digimon)
                },
                dataType: 'json',
                success: function(msg) {
                    console.log(msg);
                }
                //////// This calls up a script that handles everything and makes an insert into my database.
            }); // END ajax
        }); // END load callback function
    })(i); // <-- pass in the number from the loop
}
You can always use synchronous AJAX, but there's no good reason for it.
If you know the number of documents you need to download (you can count them, or just hardcode it if it's constant), you can run a callback on each success, and once everything is done, proceed with the logic that needs all the documents.
To make it even cleaner, you could trigger an event (on document or any other object) when everything is downloaded (e.g. "downloads_done") and listen for that event to do what you need to do, as in the sketch below.
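A minimal sketch of that event idea, assuming jQuery 1.7+ for .on()/.always() and a known number of downloads (the "downloads_done" event name is arbitrary):

var remaining = 5; // however many downloads you expect

$(document).on("downloads_done", function () {
    // everything that needs all of the data goes here
});

function oneDone() {
    remaining -= 1;
    if (remaining === 0) {
        $(document).trigger("downloads_done");
    }
}

// call oneDone() from each request, success or failure:
$.get('grabber.php').always(oneDone);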
But all of the above is for the case where you need to do something once everything is done. However, I'm not sure I understood your question correctly (I just read it again).
If you want to download something -> do something with the data -> download another thing -> do something again...
Then you can also use a JavaScript waterfall (a library, or build your own) to make it simple and easy to use. In a waterfall you define what should happen when each async function is done, one by one.
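If you go the waterfall route, a hand-rolled version can be tiny. This is only a sketch (not the API of any particular library): each task receives a next callback and decides when the following task may start.

function waterfall(tasks) {
    (function run(i) {
        if (i < tasks.length) {
            tasks[i](function next() {
                run(i + 1);
            });
        }
    })(0);
}

// Usage: each step starts only when the previous one calls next()
waterfall([
    function (next) { $.get('grabber.php', next); },
    function (next) { $.post('insertDigi.php', {}, next); }
]);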
Here's my issue. I have a js function that performs an $.ajax call to fetch some data from a server. When it gets that data back, I need to pass control back to the browser in order to show an update to a div.
The js function is itself within a for loop, and I need to ensure that the for loop does not advance until the js function has updated the div and allowed the browser to display that update. At that point the for loop advances and the js function (with its ajax call) is called again, continuing until the for loop's test causes the loop to end.
I've tried many different approaches - callbacks, promises, etc. - but to date I can't seem to find a way of ensuring that the loop doesn't advance until the js function gets its server data, updates the div, causes the browser to display that update, and fully completes.
Here's a simple stripped-down version of the function:
function myFunction(email) {
    var request = $.ajax({
        url: 'getit.php',
        cache: false,
        async: false,
        method: "post",
        timeout: 1000,
        data: "requesttype=getemailname&email=" + encodeURIComponent(email)
    });

    request.done(function(response) {
        $("#myDiv").html(response);
    });
}
and here's part of the js that calls it:
.....
var emailscount = emails.length;

for (var i = 0; i < emailscount; i++) {
    myFunction(emails[i]);
}
.....
So, my issues are:
1) myFunction must allow the browser to display the updated div html - I'm not sure how to achieve that.
2) the for loop should only proceed when myFunction has received the data back from the server, updated the div html, AND allowed the browser to display that div.
At the moment I have the $.ajax call's async flag set to false to stop execution until the data comes back, but how do I ensure that the browser displays the new div content, and that the for loop does not call myFunction again until the previous myFunction call fully completes?
Any help you can give me would be very welcome, as right now I can't get this all to work!
Sounds like you need a recursive function, not a for loop with synchronous AJAX calls:
(function myFunction(i) {
    $.ajax({
        url: 'getit.php',
        method: "post",
        timeout: 1000,
        data: {
            requesttype: 'getemailname',
            email: emails[i]
        }
    }).done(function(response) {
        $("#myDiv").html(response);
        if (emails[++i]) myFunction(i); // continue when this one is done
    });
})(0);
Thanks for everyone's help! I'm making good progress (including taking care of jQuery deprecations!) but have run into a further problem. Since I need to hand control back to the browser in order to show the refreshed div as I recurse, I'm calling a setTimeout as follows:
var nextBitOfWork = function() {
    return myFunction(email);
};

setTimeout(nextBitOfWork, 0);
where myFunction (which recurses) now returns a promise when it's done with its $.ajax call.
If I simply call:
return myFunction(email);
without the setTimeout construct above, the promise is passed through, all my promises are captured, I get the array output I need, and everything works great - but I don't get the browser refresh. Using the setTimeout as above, I get the refreshed div displaying, but I seem to lose the promise, so the script continues and I never get to fill the array I use to capture values as I recurse.
Any thoughts on how to make sure the setTimeout passes on the promise?
Thanks
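One way to keep the promise alive across the timeout (a sketch, assuming jQuery 1.5+'s $.Deferred and that myFunction returns a promise) is to return your own deferred and resolve it from inside the timer:

function nextBitOfWork() {
    var dfd = $.Deferred();
    setTimeout(function () {
        // the zero-delay timeout lets the browser repaint the div first;
        // then myFunction's outcome is forwarded to our own deferred
        myFunction(email).then(dfd.resolve, dfd.reject);
    }, 0);
    return dfd.promise(); // callers can chain on this just as before
}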
I have the following code which is included in a keypress function:
$.getJSON('dimensions.json', function(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
});
I'm trying to first get the JSON string, save it in a variable, and then run the each(). I basically want to separate the each() from the getJSON() function, because I don't want it to fetch the JSON file on every keypress.
I've tried this, but it didn't work:
var JSONstr = $.getJSON('dimensions.json');

$.each(JSONstr, function(index) {
    $('#div1').append(index);
});
In your first example, you run $.each in the callback. The callback is executed later, after the result is received, while $.getJSON returns immediately without waiting for the result (since there is no blocking in JavaScript by design).
Therefore the code in your second example can never work: the $.each begins before any result is received from the web server, probably even before the request is sent. Whatever the return value of $.getJSON is, it can't, by the design of JavaScript, be the result of the AJAX request.
UPD: Saw your comment, now I understand what you wanted to do. Here's a simple example of how to do this:
function ActualHandler(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
}

function KeypressHandler() {
    if (window.my_data) { // If we have the data saved, work with it
        ActualHandler(window.my_data);
    }
    else { // Otherwise, send the request, wait for the answer, then do something
        $.getJSON('dimensions.json', function(data) {
            window.my_data = data; // Save the data
            ActualHandler(data);   // And *then* work on it
        });
    }
}
Here, the ActualHandler is not launched before the data is received, and once that happens, all subsequent clicks will be handled immediately.
The downside in this particular case is that if user clicks again while the first request is running, one more will be sent. But to fix that you would need to maintain some queue, which is kind of out of scope here.
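For what it's worth, that queue can be avoided by caching the in-flight request itself rather than the data. A sketch, assuming jQuery 1.5+, where $.getJSON returns a jqXHR that is also a promise:

var pending; // the one shared request, if any

function KeypressHandler() {
    if (!pending) {
        pending = $.getJSON('dimensions.json'); // fire at most one request
    }
    // .done() handlers can be attached any number of times, even after
    // the request has finished, so every keypress lands here safely
    pending.done(ActualHandler);
}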
You fell into the asynchronous trap: your $.each() function doesn't wait for your $.getJSON() call to get the data. You can get around this with the good ol' $.ajax() function, like this:
function processJSON(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
}

$.ajax({
    url: 'dimensions.json',
    dataType: 'json',
    async: false,
    success: processJSON // pass the function itself; don't invoke it here
});
I've got a particular function I want to run once, and only after the completion of several AJAX requests.
My current solution looks a bit like this:
function doWork() {
    // This is the function to be run once after all the requests
}

// some tracking/counting variables
var ajaxDoneCounter = 0;
var numOfAjaxRequests = 5;
var workDone = false;

function doWorkTrigger() {
    ajaxDoneCounter++;
    if (!workDone && ajaxDoneCounter >= numOfAjaxRequests) {
        workDone = true;
        doWork();
    }
}

// ...
// and a number of AJAX requests (some hidden within functions, etc)
// they look something like this:
$.ajax({
    url: "http://www.example.com",
    dataType: "json",
    success: function(data) {
        // load data into variables, etc
        doWorkTrigger();
    }
});
One obvious pitfall in the above is that any AJAX call that is not successful will not increment ajaxDoneCounter, and so doWork() will probably never be called. I can get around that using the error callback inside any $.ajax, so that doesn't worry me too much.
What I want to know is whether the above is safe and/or good practice?
Is there a trick I've missed, or any thing else that might work better?
Update: Since jQuery 1.5, deferred objects [docs] provide a cleaner solution. Have a look at an example here.
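A sketch of the deferred approach for this case (assuming jQuery 1.5+): collect the jqXHR objects returned by $.ajax and let $.when fire the final callback. One caveat: $.when rejects as soon as any single request fails, so the error case still needs its own handling.

var requests = [];

for (var i = 0; i < 5; i++) {
    requests.push($.ajax({
        url: "http://www.example.com",
        dataType: "json"
    }));
}

$.when.apply($, requests)
    .done(doWork) // runs once, after every request has succeeded
    .fail(function () {
        // fires on the *first* failure; retry or report here
    });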
I would use .ajaxComplete(); it will be triggered whenever an Ajax call completes (success or error):
var numOfAjaxRequests = 5;

$(document).ajaxComplete(function() {
    numOfAjaxRequests--;
    if (!numOfAjaxRequests) {
        doWork();
    }
});
Then you don't have to edit every Ajax request.
You could even use .ajaxSend() to get notified of starting Ajax requests, instead of hardcoding the count (but I am not sure whether this really works; maybe you will experience race conditions):
var numOfAjaxRequests = 0;

$(document).ajaxSend(function() {
    numOfAjaxRequests++;
});
I think you should use the complete(XMLHttpRequest, textStatus) Ajax event instead of success(data, textStatus, XMLHttpRequest).
According to the jQuery help:
complete(XMLHttpRequest, textStatus)

A function to be called when the request finishes (after success and error callbacks are executed). The function gets passed two arguments: the XMLHttpRequest object and a string describing the status of the request. This is an Ajax Event.
I don't know enough about JavaScript internals, but there is a danger that the operation ajaxDoneCounter++ is not atomic. If that is the case, then this could be subject to a race condition. (That said, browser JavaScript runs callbacks one at a time on a single thread, so the increment cannot be interrupted by another callback.)