JavaScript is said to be single-threaded, and AJAX is said to be asynchronous.
Consider a scenario:
I have a button, and on clicking it I make an AJAX call which takes 5-6 seconds. The UI is not blocked, so the user performs some other action in the meantime (say, clicks another button whose handler is now executing some code while the AJAX response has been returned). In this case, since that other code is being executed, when will the AJAX callback be executed? Does it have to wait, or can it be executed on a parallel thread?
The events are queued, so when the Ajax call completes, the handler for it is queued to run on the event loop. When the single thread is done with your button handler, it then processes the next event in the queue. So yes, you would have to wait for the code kicked off by the button click to finish, unless of course the Ajax request completed before the user clicked the button, in which case the click handler had to wait instead. The best you can do is split your algorithm up so that it runs in discrete chunks; these can be dropped onto the queue using setTimeout, but that is quite tricky.
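As a rough illustration of that chunking approach (the function and parameter names here are invented for the example, not taken from any library):

function processInChunks(items, chunkSize, handleItem) {
    var index = 0;
    function doChunk() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            handleItem(items[index]); // do a small slice of the work
        }
        if (index < items.length) {
            // Yield back to the event loop so queued events (such as the
            // Ajax completion handler) get a chance to run before the next chunk.
            setTimeout(doChunk, 0);
        }
    }
    doChunk();
}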
So I have searched a little about this topic in general. Contrary to what I had imagined, JavaScript is nothing like multi-threaded. Instead, it has a queue of operations that it performs.
The direct answer then is: depending on the exact timing, the AJAX callback might have to wait until the click handler completes. It might also have to wait for any other code that was executed at "the same moment".
This explains why things like while(true) or alert() stop every script on the page.
Related
I have a script I'm using to schedule resources using an API and a list in CSV format. The script currently loops through the CSV and fires off a call to a function that has the API calls in it. The AJAX calls are nested (Create a Reservation->Take the reservation number and add a resource->Validate the reservation->Submit the reservation). The problem is that after the original AJAX call, it seems to hang until all of the AJAX calls have completed. There doesn't seem to be any asynchronicity going on.
for (var line in CSV) {
    makeAPICalls(line);
}

function makeAPICalls(line) {
    $.ajax("Create Reservation").then(function () {
        $.ajax("Add Resource to Reservation").then(function () {
            $.ajax("Validate Reservation").then(function () {
                $.ajax("Confirm Reservation");
            });
        });
    });
}
The first API call ("Create Reservation") completes, and then waits for all of the other lines in the CSV to make that same call, and then they ALL move on to the next step ("Add Resource to Reservation"). I wondered if the system was just moving too quickly, so there wasn't a chance for everything to get "out of sync", so I added a delay before makeAPICalls(), but it still waited. Once the CSV loop finished, all the AJAX calls moved from "Create Reservation" to the next then() step, "Add Resource to Reservation".
Is this as expected? Ideally I'd like each call to makeAPICalls() to finish as quickly as possible, without regard for any other calls (which I kind of thought was what async was all about), but that doesn't seem to be happening here.
This is happening because you are chaining the requests. If your requests are not dependent on each other, you can call them without using .then().
The behaviour is quite correct. I don't know how you are putting in the delay, but it probably won't help, since JavaScript is single-threaded.
If you want all the steps to complete for a specific CSV line before the next line starts, you could have your function process the list one by one. You could even have the last step call back into the function with the next index to process, as sketched below.
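A rough sketch of that one-at-a-time approach, reusing the placeholder URLs from the question and assuming CSV is an array of lines:

function processLine(index) {
    if (index >= CSV.length) return; // finished every line
    $.ajax("Create Reservation")
        .then(function () { return $.ajax("Add Resource to Reservation"); })
        .then(function () { return $.ajax("Validate Reservation"); })
        .then(function () { return $.ajax("Confirm Reservation"); })
        .then(function () {
            processLine(index + 1); // only start the next CSV line once this one is done
        });
}

processLine(0);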
I've noticed that setTimeout(function, milliseconds), when used in the middle of a function, will only be executed once that function has ended, regardless of the timing given to it.
For example:
function doStuff() {
    var begin = (new Date()).getTime();
    console.log(begin);
    setTimeout(function () { console.log("Timeout"); }, 100);
    doWork(4000); // long-running synchronous work
    var end = (new Date()).getTime();
    console.log(end);
    console.log("diff: " + (end - begin));
}

function doWork(num) {
    for (; num > 0; --num) console.log(num);
}

doStuff();
The code above sets the timeout for 100 milliseconds, but it is only invoked after the whole function completes, which takes much more than 100 milliseconds.
My questions are:
Why does this happen?
How can I ensure correct timing?
JavaScript is not pre-emptive: it first finishes what it is doing before looking at the next task posted in the queue (functions submitted for asynchronous execution). So even if a time-out expires, this will not influence the currently running code; it does not get interrupted. Only when the stack of currently running functions has completely unwound and nothing remains to be run will JavaScript check whether there is a time-out request to be fulfilled, or whatever else is first in the queue.
From "Concurrency model and Event Loop" on MDN:
Queue
A JavaScript runtime contains a message queue, which is a list of messages to be processed. A function is associated with each message. When the stack is empty, a message is taken out of the queue and processed. The processing consists of calling the associated function (and thus creating an initial stack frame). The message processing ends when the stack becomes empty again.
Event loop
The event loop got its name because of how it's usually implemented, which usually resembles:
while (queue.waitForMessage()) {
    queue.processNextMessage();
}
queue.waitForMessage waits synchronously for a message to arrive if there is none currently.
"Run-to-completion"
Each message is processed completely before any other message is processed. This offers some nice properties when reasoning about your program, including the fact that whenever a function runs, it cannot be pre-empted and will run entirely before any other code runs (and can modify data the function manipulates). This differs from C, for instance, where if a function runs in a thread, it can be stopped at any point to run some other code in another thread.
A downside of this model is that if a message takes too long to complete, the web application is unable to process user interactions like click or scroll. The browser mitigates this with the "a script is taking too long to run" dialog. A good practice to follow is to make message processing short and if possible cut down one message into several messages.
To get code to run concurrently, you can make use of Web Workers. Web Workers run scripts in different threads.
When are callbacks executed, for example the callback of a setTimeout() or a click event?
Do they pause code that is already running, or do they wait until it has finished?
Example
I have a data structure (incrementalChanges) that records state changes caused by user interactions, for example mouse clicks. If I want to send all changes to another peer, I send him this data structure.
Another possibility is a full synchronisation (makeFullSync()), meaning I send him my complete current state, after which I must empty the incremental changes (deleteIncrementalChanges()). That is what you can see in the code. However, I am not sure what happens if a user clicks something exactly between these two function calls. If that event fires immediately, an item would be added to the incrementalChanges structure but then directly deleted by the second call, so that it would never be sent and the other peer's state would become invalid.
makeFullSync();
/* What if a new change is made between these two calls and saved in the
   changes data structure? It would be deleted by deleteIncrementalChanges(),
   so that change would be lost. Changing the order does not make it better...
*/
deleteIncrementalChanges();
Good links, and, in case the first scenario (callbacks pausing running code) is true, possible solutions are welcome.
JavaScript is single threaded, and keeps a queue of events it needs to get to once it's done running the code it is currently working on. It will not start the next event in the queue until the current one is finished.
If you make multiple asynchronous calls, such as calls for a server to update data on another client, you need to structure your code to handle the case where they don't necessarily reach the second client in the same order.
If you're sending changes one at a time to another user, you can timestamp the changes to track what order they were made in on the first client.
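A rough sketch of what sequencing/timestamping a change might look like (the names here are invented, not taken from the question's code):

var nextSeq = 0;
var incrementalChanges = [];

function recordChange(change) {
    incrementalChanges.push({
        seq: nextSeq++,   // the order the change was made on this client
        at: Date.now(),   // wall-clock timestamp, if that is also useful
        payload: change
    });
}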
Do they pause code that is already running, or do they wait until it has finished?
They wait until it has finished. JavaScript is single threaded, so no two pieces of code can run at once. JS uses an event loop to handle asynchronous work. If an event such as a click handler or a timer firing happens while another piece of code is running, that event is queued up and runs after the currently running code finishes executing.
Assuming makeFullSync() and deleteIncrementalChanges() are called in the same chunk of code, they will be executed one after another without any click events being processed until after they have both run.
One near-exception to the nothing-runs-in-parallel rule in JS is Web Workers. You can send data off to a worker for processing, which will happen in another thread. Even though workers run in parallel, their results are inserted back into the event loop like any other event.
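A minimal sketch of handing work to a Web Worker (the file name worker.js and the messages are made up for the example):

// main script
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('result from worker:', e.data); // delivered back via the main thread's event loop
};
worker.postMessage(4000); // hand the heavy work to the worker thread

// worker.js
onmessage = function (e) {
    var n = e.data, sum = 0;
    for (var i = 0; i < n; i++) sum += i; // long-running loop, off the main thread
    postMessage(sum);
};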
I have a question regarding partial page loading with AJAX.
Suppose that a user clicks on a button that makes an AJAX call to load part of a page (possibly including dynamically loaded JS and/or CSS), and the HTML content is dropped into some div. Then, before the first load is complete, he clicks on another button that makes another AJAX call that drops other content into the same div. How should I prevent this behaviour from creating any conflicts? One possible conflict might be, for example, that the first load executes some JS on content that is not found because the second load has already changed that div.
Thanks in advance.
Edit:
I would appreciate answers based on asynchronous methods. Thanks.
Genesis and Gaurav are right about disabling user interaction. +1 from me to each of them. How you handle the logic is actually quite simple:
$('#my_submit_button').click(function () {
    $.ajax({
        url: '/my_file.php',
        dataType: 'json',
        beforeSend: function () {
            $('#my_submit_button').prop('disabled', true);
        },
        error: function (jqXHR, status, error) {
            // handle status for each: "timeout", "error", "abort", and "parsererror"
            // Show submit button again:
            $('#my_ajax_container').html('Oops we had a hiccup: ' + status);
            $('#my_submit_button').prop('disabled', false);
        },
        success: function (data) {
            $('#my_ajax_container').html(data);
            $('#my_submit_button').prop('disabled', false);
        }
    });
});
Make it synchronous (not recommended).
Disable the link/button while the AJAX request is in flight.
Don't worry about it.
In your case, though, it won't cause any conflicts, because when the HTML is replaced, the scripts are too.
Just disable the buttons that cause the AJAX calls to start while one has not completed yet.
I'm not sure this would actually be a problem for you, because JavaScript is single threaded. When the first ajax response comes in and you execute some JavaScript, that JavaScript cannot be interrupted by the second ajax response as long as it is one continuous thread of execution (no timers or other asynchronous ajax calls as part of its processing).
Let's run through a scenario:
User clicks button - first ajax call starts.
User clicks button - second ajax call starts.
First ajax call finishes and the completion code execution starts for what to do with the new data.
While code is executing from first ajax call, the second ajax call completes. At this point, the browser puts the second ajax call completion into a queue. It cannot trigger any of your completion code yet because the Javascript engine is still running from the first load.
Now the first load completes its execution and returns from its completion handler.
The browser now goes to its queue and finds the next event to process. It finds the completion of the second ajax call and starts the completion code for that ajax call.
As you can see from this scenario, which has overlapping ajax calls with the second completing in the middle of processing the first, there is still no conflict because the Javascript engine is single threaded.
Now, as the other answers have suggested, you may not want the user experience of launching a new request while one is still processing, but the requests won't technically conflict with each other. You have several tools to choose from if you want to prevent overlapping calls:
You can prevent starting the second call while the first call is unfinished. You can do this both in the UI and in the actual code.
When there are multiple calls outstanding, you can decide to drop/ignore the earlier responses and not process them, waiting only for the last response (a sketch of this follows after the list).
When the second call is initiated, you can cancel the first call.
You can let the second just replace the first as in the above scenario.
The first two options require you to keep track of some cross ajax-call state so one ajax call can know whether there are others and act accordingly.
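A rough sketch of the second option, keeping a counter so only the most recent response is acted on (the element ID and URL handling are placeholders):

var latestRequestId = 0;

function loadPanel(url) {
    var requestId = ++latestRequestId; // remember which request this handler belongs to
    $.get(url, function (html) {
        if (requestId !== latestRequestId) return; // a newer request superseded this one
        $('#panel').html(html);
    });
}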
I have a page with many actions on it, each triggered with $.get, but I want to run them one at a time rather than all triggering at once, which causes a lot of load time. So what is the solution for me here?
Before you answer: I don't want to trigger based on time. I want to trigger each request only after the previous AJAX request is completely done, and only then continue the loop with the next AJAX call.
Do you want to execute synchronous requests? If so, you need to use jQuery's ajax method instead of get, setting async: false.
Check http://api.jquery.com/jQuery.ajax/
EDIT
As one commenter properly pointed out, making sync requests hangs the only thread JavaScript code has. That means no animations or other code can run while you wait for the requests to finish.
An interesting option would be to "chain" your requests, executing the next request in the previous one's callback, like this:
$.get('ajax/first-call.html', function (data) {
    $.get('ajax/second-call.html', function (data) {
        // etc.
    });
});
You can set up your own custom queue in jQuery.
http://api.jquery.com/queue/
Populate your queue with all the functions you want to execute.
Each function is a single call to $.get().
In the callback for each $.get, call dequeue() to start up the next ajax call; a rough sketch follows.
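A rough sketch of such a custom queue (the queue name, URLs and holder object are placeholders):

var $q = $({}); // any jQuery-wrapped object can hold a queue

['ajax/one.html', 'ajax/two.html', 'ajax/three.html'].forEach(function (url) {
    $q.queue('requests', function (next) {
        $.get(url, function (data) {
            // ...use data...
            next(); // start the next queued request only after this one finishes
        });
    });
});

$q.dequeue('requests'); // kick off the first request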
It sounds like you want an ajax queue.
I've used this plugin before, and it's pretty simple.
Most browsers will only make a limited number of HTTP requests at once to the same domain (historically around four, typically six in modern browsers). The others are queued up and executed serially, so the browser already implements some queuing on these requests.