jquery $.get(...) one at a time - javascript

I have a page with many actions on it, each triggered with $.get, but I want to run them one at a time rather than all at once, which adds a lot of load time. So what is the solution for this?
Before you answer: I don't want to trigger based on time. I want each request to start only after the previous AJAX request has completely finished, and then continue the loop with the next one.

Do you want to execute synchronous requests? If so, you need to use jQuery's ajax method instead of get, setting async: false.
Check http://api.jquery.com/jQuery.ajax/
EDIT
As one commenter properly pointed out, making synchronous requests hangs the only thread JavaScript code has. That means no animations or other code run while you wait for the requests to finish.
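For illustration, here is a minimal sketch of that synchronous form (the URL is a placeholder); note that it blocks everything, including repaints, until the response arrives:
// Placeholder URL; with async: false, $.ajax returns only after the response has arrived
var html = $.ajax({
    url: 'ajax/first-call.html',
    async: false
}).responseText;
console.log(html);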
An interesting option would be to "chain" your requests, executing the next request in the previous one's callback, like this:
$.get('ajax/first-call.html', function(data) {
    $.get('ajax/second-call.html', function(data) {
        // etc.
    });
});

You can set up your own custom queue in jQuery.
http://api.jquery.com/queue/
Populate your queue with all the functions you want to execute.
Each function is a single call to $.get().
In the callback for each $.get function, call the dequeue() function to start up the next ajax call.
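A rough sketch of that queue idea, using a custom queue on an empty jQuery object (the queue name and URLs are placeholders):
var $queue = $({}); // empty jQuery object just to hold the custom queue

// Placeholder URLs to fetch one at a time
['ajax/a.html', 'ajax/b.html', 'ajax/c.html'].forEach(function (url) {
    $queue.queue('ajaxQueue', function (next) {
        $.get(url, function (data) {
            // handle data here, then start the next request in the queue
            next();
        });
    });
});

$queue.dequeue('ajaxQueue'); // kick off the first request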

It sounds like you want an ajax queue.
I've used this plugin before, and it's pretty simple.

Most browsers will only make a handful of HTTP requests at once to the same domain (typically four to six). The others are queued up and executed serially, so the browser already implements some queuing on these requests.

Related

What is the best way to show custom message loader when sequence of AJAX calls are executing?

When a set of jQuery AJAX calls is executing synchronously, I would like to show a message indicating which AJAX call is currently executing, so that the user can wait until all the calls have finished.
What I tried, using jQuery:
LoadAllData();
function LoadAllData()
{
    $('#spMsg').text('Wait, fetching data1...');
    ajaxCallForData1();
    $('#spMsg').text('Wait, fetching data2...');
    ajaxCallForData2();
    $('#spMsg').text('Wait, fetching data3...');
    ajaxCallForData3();
}
But the issue is that $('#spMsg') does not update until all the AJAX calls have executed.
I have tried the jQuery BlockUI plugin, but it works well for a single AJAX call, not for a sequence of calls in a function. What is the best way to fix this?
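One way to get the message to update is to keep the calls asynchronous and chain them, so the browser has a chance to repaint between steps. A rough sketch, assuming each ajaxCallForDataN() returns the promise from its $.ajax call (jQuery 1.8+):
function LoadAllData() {
    $('#spMsg').text('Wait, fetching data1...');
    ajaxCallForData1()
        .then(function () {
            $('#spMsg').text('Wait, fetching data2...');
            return ajaxCallForData2();
        })
        .then(function () {
            $('#spMsg').text('Wait, fetching data3...');
            return ajaxCallForData3();
        })
        .then(function () {
            $('#spMsg').text('Done.');
        });
}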

Wrapping AJAX in a function seems to make it lose async?

I have a script I'm using to schedule resources using an API and a list in CSV format. The script currently loops through the CSV and fires off a call to a function that has the API calls in it. The AJAX calls are nested (Create a Reservation->Take the reservation number and add a resource->Validate the reservation->Submit the reservation). The problem is that after the original AJAX call, it seems to hang until all of the AJAX calls have completed. There doesn't seem to be any asynchronicity going on.
for (line in CSV)
{
    makeAPICalls(line);
}
function makeAPICalls(line)
{
    $.ajax("Create Reservation").then(function() {
        $.ajax("Add Resource to Reservation").then(function() {
            $.ajax("Validate Reservation").then(function() {
                $.ajax("Confirm Reservation");
            });
        });
    });
}
The first API call ("Create Reservation") completes, and then waits for all of the other lines in the CSV to make that call, then they ALL move on to the next step ("Add Resource to Reservation"). I was wondering if the system was just moving too quickly, so there wasn't a chance for everything to get "out of sync", so I added a delay before makeAPICalls(), but it still waited. Once the CSV loop finished, all the AJAX calls moved from "Create Reservation" to the next then ("Add Resource to Reservation").
Is this as expected? Ideally I'd like each call to makeAPICalls() to finish as quickly as possible, with no regard for any other calls (which I kind of thought was what async was all about), but it doesn't seem to be happening here.
This is happening because you are chaining the requests. If your requests are not dependent on each other, you can call them without using .then().
The behaviour is quite correct. I don't know how you are putting in the delay but it probably won't help since JavaScript is single-threaded.
If you want all the steps to complete for a specific CSV line, you could have your function process the list one by one. You could even have the last step call back into the function with the next index to process.
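A sketch of that one-by-one approach, reusing the placeholder calls from the question (jQuery 1.8+, where .then chains returned promises):
function processLine(lines, index) {
    if (index >= lines.length) return; // all CSV lines done

    // Run the full chain for this line, then move on to the next line.
    makeAPICalls(lines[index]).then(function () {
        processLine(lines, index + 1);
    });
}

function makeAPICalls(line) {
    // Returning the promise lets the caller know when the whole chain is done.
    return $.ajax("Create Reservation")
        .then(function () { return $.ajax("Add Resource to Reservation"); })
        .then(function () { return $.ajax("Validate Reservation"); })
        .then(function () { return $.ajax("Confirm Reservation"); });
}

processLine(csvLines, 0); // csvLines: array of lines parsed from the CSV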

Dynamic page fetching with AJAX

I have a question regarding partial page loading with AJAX.
Suppose that a user clicks on a button that makes an AJAX call to load part of a page (possibly including dynamically loaded JS and/or CSS), and the HTML content is dropped into some div. Then, before the first load is complete, he clicks on another button that makes another AJAX call that drops other content into the same div. How should I prevent this behaviour from creating any conflicts? A possible conflict might be, for example, the first load executing some JS on content that is not found because the second load has already changed that div.
Thanks in advance.
Edit:
I would appreciate answers based on asynchronous methods. Thanks.
Genesis and Gaurav are right about disabling user interaction. +1 from me to each of them. How you handle the logic is actually quite simple:
$('#my_submit_button').click(function() {
    $.ajax({
        url: '/my_file.php',
        dataType: 'json',
        beforeSend: function() {
            $('#my_submit_button').prop('disabled', true);
        },
        error: function(jqXHR, status, error) {
            // handle status for each: "timeout", "error", "abort", and "parsererror"
            // Show submit button again:
            $('#my_ajax_container').html('Oops we had a hiccup: ' + status);
            $('#my_submit_button').prop('disabled', false);
        },
        success: function(data) {
            $('#my_ajax_container').html(data);
            $('#my_submit_button').prop('disabled', false);
        }
    });
});
make it synchronous (not recommended)
disable the link/button while the AJAX call is in progress
do not worry about it
In your case it won't cause any conflicts anyway, because when the HTML is replaced, the scripts are replaced too.
Just disable the buttons that cause the AJAX calls to start while one has not completed yet.
I'm not sure this would actually be a problem for you, because JavaScript is single-threaded. When the first ajax response comes in and you execute some JavaScript, that JavaScript cannot be interrupted by the second ajax response as long as it is one continuous thread of execution (no timers or other asynchronous ajax calls as part of its processing).
Let's run through a scenario:
User clicks button - first ajax call starts.
User clicks button - second ajax call starts.
The first ajax call finishes, and the completion code starts executing to deal with the new data.
While the code from the first ajax call is executing, the second ajax call completes. At this point, the browser puts the second ajax call's completion into a queue. It cannot trigger any of your completion code yet because the JavaScript engine is still busy with the first load.
Now the first load's code finishes executing and returns from its completion handler.
The browser now goes to its queue and finds the next event to process. It finds the completion of the second ajax call and then starts the completion code for that call.
As you can see from this scenario, which has overlapping ajax calls with the second completing in the middle of processing the first, there is still no conflict because the JavaScript engine is single-threaded.
Now, as the other answers have suggested, you may not want the user experience of launching a new request while one is still processing, but the requests won't technically conflict with each other. You have several tools to choose from if you want to prevent overlapping calls:
You can prevent starting the second call while the first call is unfinished. You can do this both in the UI and in the actual code.
When there are multiple calls outstanding, you can decide to drop/ignore the earlier responses and not process them - waiting only for the last response.
When the second call is initiated, you can cancel the first call.
You can let the second just replace the first as in the above scenario.
The first two options require you to keep track of some cross ajax-call state so one ajax call can know whether there are others and act accordingly.
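For instance, a sketch of the third option, keeping a reference to the in-flight request so a new click aborts it before starting the next one (element IDs and URL are placeholders):
var pendingRequest = null;

$('#load_button').click(function () {
    // Cancel the previous call if it is still in flight.
    if (pendingRequest) {
        pendingRequest.abort();
    }

    pendingRequest = $.get('ajax/partial.html', function (data) {
        $('#content_div').html(data);
    }).always(function () {
        pendingRequest = null; // clear the reference once this request settles
    });
});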

Question about make several request to a server using AJAX?

How can I make several requests to a server all at the same time, while preventing the responses from getting mixed up?
Each of the ajax requests is made separately and you should set them up so that they go to different handlers when the ajax request is finished. The handlers may not be called in the same order since each may take longer than another.
If your code requires that they come back in the same order, you should create a single call that returns all the values you need, or you need to queue the responses until they have all arrived and can be processed in order.
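If the requests can run in parallel and you just want the responses handled together without mixing them up, jQuery's $.when can collect them. A rough sketch with placeholder URLs:
$.when(
    $.get('ajax/users.html'),
    $.get('ajax/orders.html')
).done(function (usersResult, ordersResult) {
    // Each argument is the [data, statusText, jqXHR] array for the matching
    // request, in the order the requests were listed, not completion order.
    $('#users').html(usersResult[0]);
    $('#orders').html(ordersResult[0]);
});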

Any issue with setTimeout calling itself?

I have read a handful of setTimeout questions and none appears to relate to the question I have in mind. I know I could use setInterval(), but it is not my preferred option.
I have a web app that could in theory be running for a day or more without a page refresh, making more than one check a minute. Am I likely to get tripped up if my function calls itself several hundred (or more) times using setTimeout? Will I reach "stack overflow", for example?
If I use setInterval there is a remote possibility of two requests being run at the same time especially on slow connections (where a second one is raised before the first is finished). I know I could create flags to test if a request is already active, but I'm afraid of false positives.
My solution is to have a function call my jQuery ajax code, do its thing, and then, in the complete handler, use setTimeout to call itself again in X seconds. This method also lets me alter the duration between calls, so if my server is busy (slow), one reply can set a flag to increase the time between ajax calls.
Sample code of my idea...
function ServiceOrderSync()
{
    // 1. Sync stage changes from client with the server
    // 2. Sync new orders from server to this client
    $.ajax({
        "data": dataurl,
        "success": function(data)
        {
            // process my data
        },
        "complete": function(data)
        {
            // Queue the next sync
            setTimeout(ServiceOrderSync, 15000);
        }
    });
}
You won't get a stack overflow, since the call isn't truly recursive (I call it "pseudo-recursive").
JavaScript is an event-driven language. When you call setTimeout, it just queues an event in the list of pending events; code execution then continues from where you are, and the call stack is completely unwound before the next event is pulled from that list.
p.s. I'd strongly recommend using Promises if you're using jQuery to handle async code.
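Along those lines, a rough sketch of the same polling loop written against the promise interface (URL and interval are placeholders):
function serviceOrderSync() {
    $.get('ajax/sync.html')
        .done(function (data) {
            // process the data here
        })
        .always(function () {
            // Schedule the next poll only after this one settles, so requests never overlap.
            setTimeout(serviceOrderSync, 15000);
        });
}

serviceOrderSync();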
