I have a piece of code I want to run after all the ajax is completed.
The function I wish to run is:
function autoContinueCart(){
    $('.nextSection a:visible').click();
}
This click event runs a validation script and moves to the next section. Here's the main ajax call:
$('#SubmitLoginOpc').click(function () {
    $.ajax({
        type: 'POST',
        url: authenticationUrl,
        async: false,
        cache: false,
        dataType: "json",
        data: 'SubmitLogin=true&ajax=true&email=' + encodeURIComponent($('#login_email').val()) + '&passwd=' + encodeURIComponent($('#login_passwd').val()) + '&token=' + static_token,
        success: function (jsonData) {
            if (jsonData.hasError) {
                //error stuff
            }
            else {
                // update token
                static_token = jsonData.token;
                $('#dlv_label, #new_label').removeClass('new-l').addClass('logged-l'); // change label on delivery address section
                updateNewAccountToAddressBlock();
                // RESET ERROR(S) MESSAGE(S)
                $('#opc_account_errors').html('').hide();
                $('#opc_account_errors_invoice').html('').hide();
                // It doesn't work here
                //autoContinueCart();
            }
        },
        // doesn't work here either
        // complete: autoContinueCart
    });
    return false;
});
I have put this function call in the success part, which I thought would work since the call is synchronous. I also put it in complete and in a .done() handler after the ajax call, and it still runs before all the inner code is complete.

The function updateNewAccountToAddressBlock() basically makes another jQuery ajax request (this one with async:true) and returns JSON that is then used in about 10 functions or sub-functions in its success callback. One of these uses that data to fill out all the fields of a form. The function I am trying to call at the end is supposed to validate the info being populated, but no matter what I try, the validation fails because autoContinueCart is run before the fields are populated.

I also tried using a callback, like updateNewAccountToAddressBlock(updateAddressSelection);, and checking for the callback function inside of that, and it also didn't work. Does anyone have an idea what I could be doing wrong?
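For reference, the direction I've been trying to take is to have updateNewAccountToAddressBlock() hand back something I can chain on. This is only a sketch; the inner request is simplified and addressBlockUrl is just a placeholder for the real endpoint:

// Sketch only: assumes updateNewAccountToAddressBlock() can be changed to
// return the jqXHR of the request it makes internally.
function updateNewAccountToAddressBlock() {
    return $.ajax({
        type: 'GET',
        url: addressBlockUrl, // placeholder for the real endpoint
        dataType: 'json'
    }).done(function (json) {
        // ...the ~10 functions that populate the form fields...
    });
}

// then, inside the login success handler:
updateNewAccountToAddressBlock().done(function () {
    autoContinueCart(); // runs only after the request above and its done() handler
});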
Since your call is already synchronous (async:false), is it possible to move the processing code out of the ajax callback function? This would ensure that the ajax portion is complete before moving on to the processing piece.
Example:
$('#SubmitLoginOpc').click(function () {
    $.ajax({
        type: 'POST',
        url: authenticationUrl,
        async: false,
        cache: false,
        dataType: "json",
        data: 'SubmitLogin=true&ajax=true&email=' + encodeURIComponent($('#login_email').val()) + '&passwd=' + encodeURIComponent($('#login_passwd').val()) + '&token=' + static_token,
        success: function (jsonData) {
            // stash the response so it can be processed after the call returns
            $('#SubmitLoginOpc').data("some_key", jsonData);
        }
    });

    // because the call is synchronous (async:false), the data is available here
    var jsonData = $('#SubmitLoginOpc').data("some_key");
    if (jsonData.hasError) {
        //error stuff
    }
    else {
        // update token
        static_token = jsonData.token;
        $('#dlv_label, #new_label').removeClass('new-l').addClass('logged-l'); // change label on delivery address section
        updateNewAccountToAddressBlock();
        // RESET ERROR(S) MESSAGE(S)
        $('#opc_account_errors').html('').hide();
        $('#opc_account_errors_invoice').html('').hide();
        autoContinueCart();
    }
    return false;
});
As the poster above said, perhaps you could move some of the other ajax functions to run from the same js file, or move some of the PHP functions to be run at the same time as the first call.
Ideally you shouldn't have to make another ajax request, because the PHP (or whatever backend) already has the info it needs from the client side. You should be able to pass that data on to the other server-side scripts.
If you do need to make another ajax call, perhaps have the user wait a moment (with a spinner) before you run it.
For instance:
$.ajax({ /* first request */ }).done(function (data) {
    if (success) {
        $('#spinner').show();
        setTimeout(foo, 5000); // pass the function reference, not the string 'foo'
    }
});

function foo() {
    $('#spinner').hide();
    // second ajax request
}
I have what might be a tricky question.
I am working on a form that verifies a couple of things on submit, using event.preventDefault(); to prevent the form from submitting if something went wrong. The issue is that it sends multiple ajax requests at the same time, which seems to stop the PHP (which processes the AJAX calls) from modifying the $_SESSION variable.
I have determined this by changing the jQuery ajax calls to run synchronously, which allows the $_SESSION variable to be changed.
My question is this: is there a way to allow the ajax calls to happen asynchronously while still allowing the $_SESSION variable to be modified during the processing of those calls? I realize that async:false for an AJAX call is deprecated, and obviously not the best solution.
Due to what each call does, it is not possible to combine the functionality of these two calls, although each call does not take long at all to process.
Example jquery code to explain how I am making these AJAX calls (some redaction and simplification, obviously):
$("#form-id").on('submit', function(event) {
$.ajax({
type: 'POST',
url: '/url/to/processing.php',
async:false, //fails without setting to false
...
});
});
...
$("#form-id").on('submit', function(event) {
$.ajax({
type: 'POST',
url: '/url/to/processing2ThatSetsSession.php',
async:false, //fails without setting to false
...
});
});
You have to chain the calls, so that one call runs after the other has finished.
I'd do it this way:
function ajaxPost(url, callback) {
    $.ajax({
        type: 'POST',
        url: url,
        ...
    }).done(callback);
}
$("#form-id").on('submit', function(event) {
event.preventDefault(); // Always stop the event
// Do one ajax call and wait for the data
ajaxPost('/url/to/processing.php', function(data) {
// Do things with returned data and call the next ajax
ajaxPost('/url/to/processing.php', function(moredata) {
// Do something with moredata
// If everything is fine, re-post it but this time do not catch the event
$("#form-id").off("submit").submit();
});
});
});
You can add your own logic to show your error message in any callback and not continue with the next one.
With this, I'd make a special method for multiple-ajax form validation:
// This function will get an array of objects and
// do an ajax call and process the data, one after another
function multiAjax(calls, callback) {
    var call = calls.shift();
    if (call) {
        var url = call.url;
        ajaxPost(url, function(data) {
            var error = call.process(data);
            if (error) {
                callback(error);
            } else {
                multiAjax(calls, callback);
            }
        });
    } else {
        callback();
    }
}
// This is the array of objects that multiAjax will process.
// You can add or remove elements to your liking, without modifying
// the submit event callback
var ajaxArray = [{
    url: '/url/to/processing.php',
    process: function(data) {
        if (data.isWrong()) {
            return "The data is wrong";
        }
    }
}, {
    url: '/url/to/processing2ThatSetsSession.php',
    process: function(data) {
        if (data != "OK") {
            return "The data is not OK";
        }
    }
}];
// Now listen for the submit event
$("#form-id").on('submit', function(event) {
// Always stop the event
event.preventDefault();
// Do multiple ajax calls in one function call.
// Because the array is mutated inside multiAjax() (yeah, bad design but I've
// done this fast as an example), we slice() the array to get a new one
multiAjax(ajaxArray.slice(), function(error) {
if (error) {
// Show the error received
} else {
// submit the form the same way above
$("#form-id").off("submit").submit();
}
});
});
This is all untested code, but you get the point.
If one form submission is making two posts to the same PHP server, you should rethink the architecture instead of building complicated workarounds.
I would POST to a single PHP script that will do everything you need in the backend.
$.ajax({
    type: 'POST',
    url: '/url/to/all-processing.php',
    ... // send all the data needed by all processes
});
On the PHP side: all-processing.php
session_start();
require_once './process1.php';
require_once './process2.php';
I'm new to $.ajax and so far so good, but I've run into an issue: I want to run a function once I've used up all the data.
For example, I have the following, which I run on 'click':
$.ajax({
    url: "url.modal.tothegoods" + (nextPage),
    success: function (data) {
        // keeps appending data on click
    },
    error: function () {
        alert('balls');
    }
});
I've tried ajaxComplete, but it runs every time I load data onto the screen; every time I append data, .ajaxComplete() fires. I guess the question is: how do I run a function once I have no more data to consume, so that I am truly done?
Any tips/tricks would be greatly appreciated.
The answer depends on the current behaviour of your server side.
If you have control over it, the simplest solution is to have it return nothing when there is no more data to send. Then you can do something as simple as this:
$.ajax({
    url: "url.modal.tothegoods" + (nextPage),
    success: function (data) {
        if (!!data) {
            // keeps appending data on click
        }
    },
    error: function () {
        alert('balls');
    }
});
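Building on that, here is a sketch of how you might also fire a one-time "truly done" function and stop further requests once an empty response comes back. allDone() and the #load-more click binding are placeholders for whatever your page actually uses:

// Sketch only: assumes the server returns an empty response when there is no more data.
$('#load-more').on('click', function () {
    $.ajax({
        url: "url.modal.tothegoods" + (nextPage),
        success: function (data) {
            if (data) {
                // keep appending data on click
                nextPage++; // advance to the next page, however your code tracks it
            } else {
                allDone(); // run the "truly done" function exactly once
                $('#load-more').off('click'); // stop making further requests
            }
        }
    });
});

function allDone() {
    // whatever should happen when there is nothing left to load
}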
I need to check for a condition and run an AJAX call before sending other AJAX calls on my web app.
I was thinking about putting this AJAX call in a beforeSend on ajaxSetup with async: false (to prevent my initial call from running before this one has completed).
Something like this:
//I set an event that fires:
$.ajax({
    type: "GET",
    url: my_url,
    beforeSend: function() {
        // do something, like show a spinner loader gif
    }
});

// Somewhere in my app I also have:
$.ajaxSetup({
    beforeSend: function() {
        if (x === 1) {
            $.ajax({
                type: "GET",
                url: my_url + '/fetch_something',
                async: false
            });
        }
    }
});
Will the beforeSend on the first AJAX call override the one in ajaxSetup? Is there a better way to approach this?
To give a better idea of my app:
I have a lot of Ajax calls throughout the app. Each call sends a security hash in the headers to validate the user, and these hashes have a time limit as well (both the hash and the time limit are saved in localStorage).
What I want from ajaxSetup (and the condition in it) is to check the time limit: if time_limit < current_time, then run an ajax call to refresh the user's hash.
This isn't an exercise for 1 or 2 calls; I literally have 20+ (and growing) Ajax calls in my app that make use of the user's hash, and it's very impractical to make this check in every single one of them.
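Roughly, the direction I'm imagining is a single wrapper that every call goes through, something like this (refreshHash() is just a placeholder for whatever performs the refresh request and updates localStorage):

// Sketch only: a hypothetical wrapper around $.ajax that refreshes the hash first when needed.
function secureAjax(options) {
    var limit = parseInt(localStorage.getItem('time_limit'), 10);
    if (limit < Date.now()) {
        // refreshHash() is a placeholder: it would fetch a new hash, store it
        // (and its new time limit) in localStorage, and return the jqXHR
        return refreshHash().then(function () {
            return $.ajax(options);
        });
    }
    return $.ajax(options);
}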
UPDATED:
Have one method on an interval that sets up the 'session'/local-storage
var refreshing = false;
var intervalID;

$(document).ready(function(e){
    var delay = 1234;
    intervalID = window.setInterval(setupInterval, delay);
});

function setupInterval(){
    refreshing = true;
    $.ajax(URL).done(function(r) {
        // do stuff
        setupStorage(r);
        refreshing = false;
    });
}

function setupStorage(info){
    // setup whatever here
}
OLD:
Could you use some logic in your ready function to gate what you need to do?
So basically make one ajax call; if it returns false, just schedule your latter methods, otherwise run the setup one and, on completion, schedule the latter method.
Some pseudo-code:
var refresh = false;

$(document).ready(function(e){
    $.ajax(URL).done(function(r) {
        if (r) {
            routeOne();
        } else {
            latter();
        }
    });
});

function routeOne(){
    $.ajax(URL).done(function(r) {
        // do stuff
        latter();
    });
}

function latter(){
    // All other ajax calls
}
I'll put some more thought into this; let me finish my coffee first...
EDIT:
Based on your updated description, could you schedule a setInterval to run the checking method/hash update on the time interval that you need? And is the time limit on your server static or variable? Facebook does this with a heartbeat; I've used this type of logic for some 'locking' functionality in a web app. If you schedule the interval properly it should not interrupt any other ajax calls.
Try overriding $.ajax to make a "pre-call" before passing in your given query options:
var oldAjax = $.ajax;
$.ajax = function() {
    var args = arguments;
    oldAjax({
        type: "GET",
        url: "/echo/html/",
        success: function(result){
            // do something here to check result
            // if result is good, do the request:
            return oldAjax.apply($, args);
            // if it's bad, handle the error
        }
    });
};
Here's a fiddle to demonstrate: http://jsfiddle.net/NF76U/
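One caveat with that override is that the wrapper itself returns nothing, so any existing code that chains .done()/.fail() onto $.ajax would lose that. A variant sketch (my own, not from the fiddle above) that forwards the original request as a promise:

// Sketch only: same idea, but the wrapper returns a promise so existing code
// that does $.ajax(...).done(...) keeps working (jQuery 1.8+ .then chaining).
var oldAjax = $.ajax;
$.ajax = function() {
    var args = arguments;
    return oldAjax({
        type: "GET",
        url: "/echo/html/"
    }).then(function(result) {
        // check result here; if it is good, forward the original request
        return oldAjax.apply($, args);
        // throwing or returning a rejected deferred here would surface as a failure to the caller
    });
};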
I suggest the use of .done() (a $.Deferred method):
function AjaxCall() {
    return $.ajax(/* your ajax options, without async:false */);
}

function anotherAjaxCall() {
    return $.ajax(/* your other ajax options */);
}

AjaxCall().done(anotherAjaxCall);
Avoid using async:false; it's a deprecated practice and it locks up the browser.
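Applied to the two calls from the question above, the chaining could look roughly like this (the URLs are the ones from the question, the real option objects would carry your data, and the final re-submit mirrors the earlier answer):

// Sketch only: chain the two requests so the session-setting call
// runs strictly after the first one has finished.
$("#form-id").on('submit', function(event) {
    event.preventDefault();
    $.ajax({ type: 'POST', url: '/url/to/processing.php' })
        .then(function() {
            return $.ajax({ type: 'POST', url: '/url/to/processing2ThatSetsSession.php' });
        })
        .done(function() {
            // both calls have finished; submit the form for real
            $("#form-id").off("submit").submit();
        });
});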
In my script I have an ajax call to a PHP file, and I would like to have the call completed before the rest of the script below gets executed.
Here is the script I am referring to:
function game_settings(state){
    if(state == "load"){
        ui.load_game();
        // do ajax call to load user last save
        var dataString = {"user_data": "true"};
        $.ajax({
            type: "POST",
            url: "PHP/class.ajax.php",
            data: dataString,
            dataType: 'JSON',
            success: function(success) {
                player_details_init(success);
            },
            error: function(){
                alert("ERROR in User data");
            }
        });

        console.log(player_level);
        //scene.load("level_"+level);

        // instantiate a heroObject
        player = new Player(20, 20, game_width, game_height);
        // set its image to the proper src URL
        player.image.src = "images/Mage_Sprite.png";
        // once the image has completed loading, render it to the screen
        player.image.onload = function()
        {
            player.render();
        };
        lastUpdate = Date.now();
    }
}
So, for instance, right now when the script runs, I can see in the console that the ajax request gets made, and then the rest of the script gets executed before the ajax has finished, resulting in this call:
console.log(player_level);
returning undefined rather than the value it should be, because the ajax hasn't completed.
To clarify, my question is:
How should I make the ajax call finish before the rest of the script gets processed?
Thanks
Put the "rest of the script" inside the success callback function.
You can either put the rest of the script in the success callback of the AJAX call, or set the async property of the AJAX call config to false, which will make it synchronous.
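A rough sketch of the first suggestion applied to the code above (abbreviated, and only to show the structure; it assumes player_level gets set somewhere inside player_details_init()):

// Sketch only: move everything that depends on the loaded data into the success callback.
function game_settings(state){
    if (state == "load") {
        ui.load_game();
        $.ajax({
            type: "POST",
            url: "PHP/class.ajax.php",
            data: {"user_data": "true"},
            dataType: 'JSON',
            success: function(success) {
                player_details_init(success);
                // everything below now runs only after the ajax has completed
                console.log(player_level);
                player = new Player(20, 20, game_width, game_height);
                player.image.src = "images/Mage_Sprite.png";
                player.image.onload = function() {
                    player.render();
                };
                lastUpdate = Date.now();
            },
            error: function(){
                alert("ERROR in User data");
            }
        });
    }
}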
I have the following code which is included in a keypress function:
$.getJSON('dimensions.json', function(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
});
I'm trying to first get the JSON string, save it in a variable and then run the each(). I basically want to separate the each() so it is not tied to the getJSON() call, because I don't want it to fetch the JSON file on every keypress.
I've tried this, but it didn't work:
var JSONstr = $.getJSON('dimensions.json');
$.each(JSONstr, function(index) {
    $('#div1').append(index);
});
In your first example, you run $.each in the callback. The callback is executed only after the result is received, while $.getJSON itself returns immediately without waiting for the result (since there is no blocking in JavaScript by design).
Therefore the code in your second example can never work: the $.each begins before any result is received from the web server, probably even before the request is sent. Whatever the return value of $.getJSON is, it can't, by the design of JavaScript, be the result of the AJAX request.
UPD: Saw your comment, now I understand what you wanted to do. Here's a simple example of how to do this:
function ActualHandler(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
}

function KeypressHandler() {
    if (window.my_data) { // If we have the data saved, work with it
        ActualHandler(window.my_data);
    }
    else { // Otherwise, send the request, wait for the answer, then do something
        $.getJSON('dimensions.json', function(data) {
            window.my_data = data; // Save the data
            ActualHandler(data); // And *then* work on it
        });
    }
}
Here, the ActualHandler is not launched before the data is received, and once that happens, all subsequent clicks will be handled immediately.
The downside in this particular case is that if the user triggers the handler again while the first request is running, one more request will be sent. To fix that you would need to maintain some queue, which is kind of out of scope here.
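For completeness, here is a sketch (not part of the answer above) of how that duplicate-request issue could be avoided by caching the in-flight request itself rather than the data:

// Sketch only: store the jqXHR promise instead of the data, so concurrent
// keypresses reuse the same request instead of firing new ones.
function KeypressHandler() {
    if (!window.my_request) {
        window.my_request = $.getJSON('dimensions.json');
    }
    window.my_request.done(ActualHandler); // fires immediately if the request already finished
}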
You fell into the asynchronous trap: your $.each() doesn't wait for your $.getJSON() call to get the data. You can get around this by using the good ol' $.ajax() function, like this:
function processJSON(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
}

$.ajax({
    url: 'dimensions.json',
    dataType: 'json',
    async: false,
    success: processJSON // pass the function itself, not processJSON(data)
});