Wait for function to finish before starting again - javascript

Good Morning,
I am trying to call the same function every time the user presses a button. Here is what happens at the moment:
User clicks button -> Calls function -> function takes 1000ms+ to finish (due to animation with jQuery and AJAX calls)
What I want to happen is that every time the user presses the button, it adds the function to a queue, waits for the previous call to finish, and then starts.
Is this possible?
Sorry if my explanation is a bit confusing.
Thanks, Matthew

Since functions are objects, they can store properties. You can have your function increment a 'queue' property on itself every time it is called (button pressed). Then, at the end of the function's execution, check whether its queue property is greater than 0; if so, decrement it by one and call the function again using setTimeout.
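A minimal sketch of that idea might look like the following; the handler name and element ids (processClick, #box, #button) are made up for illustration, and a small guard is added so only one run is active at a time:

// Hypothetical click handler; element ids are placeholders.
function processClick() {
    // Functions are objects, so we can hang a counter on this one.
    processClick.queue = (processClick.queue || 0) + 1;
    if (processClick.queue > 1) return; // a run is already in progress

    (function run() {
        // The 1000ms+ animation / AJAX work would go here.
        $('#box').fadeOut(500).fadeIn(500, function () {
            processClick.queue--;       // this run has finished
            if (processClick.queue > 0) {
                setTimeout(run, 0);     // start the next queued run
            }
        });
    })();
}

$('#button').click(processClick);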

You could add a callback function to the one that runs the animation; once the animation finishes, the callback is called, so you can do whatever you want there.
Here is an example from the jQuery docs:
$('#clickme').click(function() {
    $('#book').animate({
        opacity: 0.25,
        left: '+=50',
        height: 'toggle'
    }, 5000, function() {
        // Put your code here, so it will be executed once the animation completes
    });
});
P.S.: Morning? It's almost midnight ;D

You could keep track of a global queue/array, adding to it each time the user presses the button. Then you'll want a callback in your function (such as the success callback of a jQuery AJAX call) that checks the queue and calls the next function on it.
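A rough sketch of that approach, assuming a hypothetical doWork function and placeholder selector/URL:

var workQueue = [];   // pending calls
var running = false;  // is a call currently in flight?

$('#button').click(function () {
    workQueue.push(doWork); // enqueue one unit of work per press
    runNext();
});

function runNext() {
    if (running || workQueue.length === 0) return;
    running = true;
    workQueue.shift()(); // dequeue and start the next function
}

function doWork() {
    $.ajax({
        url: '/some/endpoint', // placeholder URL
        success: function (response) {
            // ...update the page with the response...
            running = false;
            runNext(); // kick off the next queued call, if any
        }
    });
}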

I recently had to do this in my own application. The best suggestion I have for you is (assuming you already have your AJAX calls inside of a JavaScript function) to pass a callback function parameter into your JavaScript function. Then, in the success section of your AJAX call, call your callback function.
Example:
// In HTML <script> tags.
$(document).ready(function () {
    $("button#queue").click(function () {
        GetMembers(GetGroups(), MemberCallback);
    });
});

function GetMembers(groupId, callback) {
    $.ajax({
        // Initialize AJAX properties
        url: "/My/Project/Ajax/Page.extension",
        data: "{ 'groupId':'" + groupId + "'}",
        success: function (result) {
            callback(result.d);
        }
    });
}

function MemberCallback(members) {
    // Do your thing here.
}

Disable the button when they click it. Enable it when the ajax routine is complete.
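For completeness, a minimal sketch of the disable/enable approach (selector and URL are placeholders), using the complete callback so the button is re-enabled even if the request fails:

$('#button').click(function () {
    var $btn = $(this).prop('disabled', true); // lock the button
    $.ajax({
        url: '/some/endpoint', // placeholder URL
        complete: function () {
            $btn.prop('disabled', false); // unlock once the AJAX routine finishes
        }
    });
});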


How is this jQuery code creating an event?

Suppose I have this code:
$('#button').on('click', function () {
    $('#status').text('doing some work....');
    somethingThatTakes20Seconds('#status');
});
Here, somethingThatTakes20Seconds seems to be executed before the "doing some work" statement takes effect. While I understand that the statement itself creates a DOM event which gets placed into the event queue and waits until the stack is clear to execute, what I don't get is how it does this (on a high level). Is the .text method asynchronous in the same way that setTimeout() is (except that with .text you don't pass a callback, or is the callback auto-generated, basically some code that updates the DOM)?
The text() method is not asynchronous. $('#status').text('doing some work....'); will execute before somethingThatTakes20Seconds('#status');.
You can define your somethingThatTakes20Seconds() method to register a callback, which internally adds a listener to the JavaScript engine. When the listener "hears" something, i.e. an AJAX request completes or a user performs an action, it adds an item to the message queue.
This is where the event loop comes in. The event loop takes this message queue item and then calls the callback function associated with it.
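Put differently, a queued callback can only run once the currently executing code has returned and the stack is clear. A small illustration (the busy-wait just simulates long synchronous work):

console.log('start');

setTimeout(function () {
    // This callback sits in the message queue until the stack is clear.
    console.log('queued callback');
}, 0);

// Simulate two seconds of long-running synchronous work.
var end = Date.now() + 2000;
while (Date.now() < end) { /* busy wait */ }

console.log('end of synchronous code');
// Output order: start, end of synchronous code, queued callback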
No, the .text() method is not asynchronous.
JavaScript executes your code line by line, in sequence. What it does not do is wait for asynchronous work started on one line to finish before executing the next line.
In your code, basically what you're doing is:
Change the #status text to doing some work.....
Run somethingThatTakes20Seconds() function.
Given how fast the first function runs, it's just not possible for the second function to run for 20 seconds and only then have the first one execute.
Consider the following sample,
function addTextOne() {
    $('#result').append('This is text 1.\n');
}

function addTextTwo() {
    $('#result').append('This is text 2.\n');
}

$(function() {
    $("#testBtn").on('click', function() {
        setTimeout(addTextOne, 1001);
        setTimeout(addTextTwo, 1000);
    });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<button id="testBtn">
Test
</button>
<br/>
<textarea id="result" cols='70' rows='30'></textarea>
Note: the sample schedules addTextOne before addTextTwo on button click, but addTextTwo finishes earlier than addTextOne because its timeout is shorter.
The browser will not render the DOM to show the "doing some work..." message until after the function returns. Assuming your somethingThatTakes20Seconds function does not return for 20 seconds, you will not see that message for 20 seconds. You probably want to do something like this:
$('#button').on('click', function () {
    $('#status').text('doing some work....');
    setTimeout(function () { somethingThatTakes20Seconds('#status'); }, 10);
});
This will allow the DOM to be rendered before starting the long-running process.
For example: https://jsfiddle.net/9eq85udz/2/

Critical Section in JavaScript or jQuery

I have a webpage, in which a certain Ajax event is triggered asynchronously. This Ajax section could be called once or more than once. I do not have control over the number of times this event is triggered, nor the timing.
Also, there is a certain code in that Ajax section that should run as a critical section, meaning, when it is running, no other copy of that code should be running.
Here is a pseudo code:
Run JavaScript or jQuery code
Enter a critical section that makes an Ajax call (while that call is waiting for its response callback, do not enter this section again until the call is done)
Run more JavaScript or jQuery code
My question is, how can I run step 2 the way described above? How do I create/guarantee a mutual exclusion section using JavaScript or jQuery.
I understand the theory (semaphores, locks, ...etc.), but I could not implement a solution using either JavaScript or jQuery.
EDIT
In case you are suggesting a Boolean variable to get into the critical section, this would not work, and the lines below will explain why.
The code for the critical section would be as follows (using the Boolean variable suggestion):
load_data_from_database = function () {
    // Load data from the database. Only load data if we almost reach the end of the page
    if (jQuery(window).scrollTop() >= jQuery(document).height() - jQuery(window).height() - 300) {
        // Enter critical section
        if (window.lock == false) {
            // Lock the critical section
            window.lock = true;
            // Make Ajax call
            jQuery.ajax({
                type: 'post',
                dataType: 'json',
                url: 'path/to/script.php',
                data: {
                    action: 'action_load_posts'
                },
                success: function (response) {
                    // First do some stuff when we get a response
                    // Then we unlock the critical section
                    window.lock = false;
                }
            });
            // End of critical section
        }
    }
};

// The jQuery ready function (start code here)
jQuery(document).ready(function() {
    window.lock = false; // This is a global lock variable
    jQuery(window).on('scroll', load_data_from_database);
});
Now, this is the code for the lock section as suggested using a Boolean variable. It would not work, for the reasons below:
The user scrolls down, and (because of the binding jQuery(window).on('scroll', load_data_from_database);) more than one scroll event is triggered.
Assume two scroll events are triggered right at almost the same moment
Both call the load_data_from_database function
The first event checks if window.lock is false (answer is true, so if statement is correct)
The second event checks if window.lock is false (answer is true, so if statement is correct)
The first event enters the if statement
The second event enters the if statement
The first statement sets window.lock to true
The second statement sets window.lock to true
The first statement runs the Ajax critical section
The second statement runs the Ajax critical section.
Both finish the code
As you notice, both events are triggered almost at the same time, and both enter the critical section. So a lock is not possible.
I think the most helpful information you provided above was your analysis of the locking.
The user scrolls down, and (because of the binding jQuery(window).on('scroll', load_data_from_database);) more than one scroll event is triggered.
Assume two scroll events are triggered right at almost the same moment
Both call the load_data_from_database function
The first event checks if window.lock is false (answer is true, so if statement is correct)
The second event checks if window.lock is false (answer is true, so if statement is correct)
Right away this tells me that you have come to a common (and quite intuitive) misunderstanding.
Javascript is asynchronous, but asynchronous code is not the same thing as concurrent code. As far as I understand, "asynchronous" means that a function's subroutines aren't necessarily explored in depth-first order as we would expect in synchronous code. Some function calls (the ones you are calling "ajax") will be put in a queue and executed later. This can lead to some confusing code, but nothing is as confusing as thinking that your async code is running concurrently. "Concurrency" (as you know) is when statements from different functions can interleave with one another.
Solutions like locks and semaphores are not the right way to think about async code. Promises are the right way. This is the stuff that makes programming on the web fun and cool.
I'm no promise guru, but here is a working fiddle that (I think) demonstrates a fix.
load_data_from_database = function () {
    // Load data from the database. Only load data if we almost reach the end of the page
    if (jQuery(window).scrollTop() >= jQuery(document).height() - jQuery(window).height() - 300) {
        console.log(promise.state());
        if (promise.state() !== "pending") {
            promise = jQuery.ajax({
                type: 'post',
                url: '/echo/json/',
                data: {
                    json: { name: "BOB" },
                    delay: Math.random() * 10
                },
                success: function (response) {
                    console.log("DONE");
                }
            });
        }
    }
};

var promise = new $.Deferred().resolve();

// The jQuery ready function (start code here)
jQuery(document).ready(function() {
    jQuery(window).on('scroll', load_data_from_database);
});
I'm using a global promise to ensure that the ajax part of your event handler is only called once. If you scroll up and down in the fiddle, you will see that while the ajax request is processing, new requests won't be made. Once the ajax request is finished, new requests can be made again. With any luck, this is the behaviour you were looking for.
However, there is a pretty important caveat to my answer: jQuery's implementation of promises is notoriously broken. This isn't just something that people say to sound smart, it is actually pretty important. I would suggest using a different promise library and mixing it with jQuery. This is especially important if you are just starting to learn about promises.
EDIT: On a personal note, I was recently in the same boat as you. As little as 3 months ago, I thought that some event handlers I was using were interleaving. I was stupefied and unbelieving when people started to tell me that javascript is single-threaded. What helped me is understanding what happens when an event is fired.
In synchronous coding, we are used to the idea of a "stack" of "frames", each representing the context of a function. In JavaScript, and other asynchronous programming environments, the stack is augmented by a queue. When you trigger an event in your code, or use an asynchronous request like that $.ajax call, you push an event to this queue. The event will be handled the next time the stack is clear. So for example, if you have this code:
function () {
    this.on("bob", function () { console.log("hello"); });
    this.do_some_work();
    this.trigger("bob");
    this.do_more_work();
}
The two functions do_some_work and do_more_work will fire one after the other, immediately. Then the function will end, and the event you enqueued will start a new function call (on the stack), and "hello" will appear in the console. Things get more complicated if you trigger an event in your handler, or if you trigger an event in a subroutine.
This is all well and good, but where things start to get really crappy is when you want to handle an exception. The moment you enter asynchronous land, you leave behind the beautiful oath of "a function shall return or throw". If you are in an event handler, and you throw an exception, where will it be caught? This,
function () {
    try {
        $.get("stuff", function (data) {
            // uh, now call that other API
            $.get("more-stuff", function (data) {
                // hope that worked...
            });
        });
    } catch (e) {
        console.log("pardon me?");
    }
}
won't save you now. Promises allow you to take back this ancient and powerful oath by giving you a way to chain your callbacks together and control where and when they return. So with a nice promises API (not jQuery) you chain those callbacks in a way that lets you bubble exceptions in the way you expect, and to control the order of execution. This, in my understanding, is the beauty and magic of promises.
Someone stop me if I'm totally off.
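To make that more concrete, here is a rough sketch of what such chaining could look like with a standards-style promise API; the get wrapper and URLs are made up for illustration:

// Hypothetical promise-returning wrapper around an AJAX call.
function get(url) {
    return new Promise(function (resolve, reject) {
        $.ajax({ url: url, success: resolve, error: reject });
    });
}

get('stuff')
    .then(function (data) {
        // uh, now call that other API
        return get('more-stuff');
    })
    .then(function (data) {
        console.log('both requests finished');
    })
    .catch(function (err) {
        // errors from either request (or either callback) end up here
        console.log('pardon me?', err);
    });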
I would recommend a queue which only allows one item to be running at a time. This will require some modification (though not much) to your critical function:
function critical(arg1, arg2, completedCallback) {
    $.ajax({
        ....
        success: function() {
            // normal stuff here.
            ....
            // at the end, call the completed callback
            completedCallback();
        }
    });
}
var queue = [];

function queueCriticalCalls(arg1, arg2) {
    // this could be done abstractly to create a decorator pattern
    queue.push([arg1, arg2, queueCompleteCallback]);

    // if there's only one in the queue, we need to start it
    if (queue.length === 1) {
        critical.apply(null, queue[0]);
    }

    // this is only called by the critical function when one completes
    function queueCompleteCallback() {
        // clean up the call that just completed
        queue.splice(0, 1);

        // if there are any calls waiting, start the next one
        if (queue.length !== 0) {
            critical.apply(null, queue[0]);
        }
    }
}
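Hooked up to the scroll handler from the question, usage could look roughly like this (the arguments passed are placeholders):

jQuery(window).on('scroll', function () {
    if (jQuery(window).scrollTop() >= jQuery(document).height() - jQuery(window).height() - 300) {
        // Each near-bottom scroll enqueues a call; only one runs at a time.
        queueCriticalCalls('action_load_posts', null);
    }
});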
UPDATE: Alternative solution using jQuery's Promise (requires jQuery 1.8+)
function critical(arg1, arg2) {
    return $.ajax({
        ....
    });
}

// initialize the queue with an already completed promise so the
// first call will proceed immediately
var queuedUpdates = $.when(true);

function queueCritical(arg1, arg2) {
    // update the promise variable to the result of the new promise
    queuedUpdates = queuedUpdates.then(function() {
        // this returns the promise for the new AJAX call
        return critical(arg1, arg2);
    });
}
Yup, the Promise of cleaner code was realized. :)
You can wrap the critical section in a function and then swap the function so it does nothing after first run:
// this function does nothing
function noop() {}

function critical() {
    critical = noop; // swap the functions
    // do your thing
}
Inspired by user I Hate Lazy's answer to Function in javascript that can be called only once.

Wait for function to finish before executing the rest

When the user refreshes the page, defaultView() is called, which loads some UI elements. $.address.change() should execute when defaultView() has finished, but this doesn't happen all the time. $.address.change() cannot be in the success: callback, as it's used by the application to track URL changes.
defaultView();
function defaultView() {
    $('#tout').fadeOut('normal', function() {
        $.ajax({
            url: "functions.php",
            type: "GET",
            data: "defaultview=true",
            async: false,
            success: function (response) {
                $('#tout').html(response).fadeIn('normal');
            }
        });
    });
}

$.address.change(function(hash) {
    hash = hash.value;
    getPage(hash);
});
I'm at a loss as to how to make $.address.change() wait for defaultView() to finish. Any help would be appreciated.
Call it in the success or complete callback. Using delay for timing a callback is unreliable at best. You might even need to put the call to it in the callback to the fadeIn function inside of the success callback.
It doesn't have to be defined inside the success callback, just executed. Both contexts will still be able to use it.
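In other words, wrap the $.address.change() registration in a named function and invoke that function from the success (or fadeIn) callback. A sketch along those lines, where setupAddressTracking is a made-up name:

function setupAddressTracking() {
    $.address.change(function (hash) {
        hash = hash.value;
        getPage(hash);
    });
}

function defaultView() {
    $('#tout').fadeOut('normal', function () {
        $.ajax({
            url: "functions.php",
            type: "GET",
            data: "defaultview=true",
            success: function (response) {
                // register the address handler only after the default view is in place
                $('#tout').html(response).fadeIn('normal', setupAddressTracking);
            }
        });
    });
}

defaultView();

With the registration deferred this way, the async: false option on the AJAX call is no longer needed.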
I too was told that because of async you can't make javascript "wait" -- but behold an answer :D ...and since you're using jQuery, all the better:
use jQuery's bind and trigger. I posted an answer to a similar problem at How to get a variable returned across multiple functions - Javascript/jQuery
One option is to hide the $.address (I'm guessing this is a drop-down list) via css, and show it inside the success callback from the ajax method.

Javascript: Trigger action on function exit

Is there a way to listen for a javascript function to exit? A trigger that could be setup when a function has completed?
I am attempting to use a user interface obfuscation technique (BlockUI) while an AJAX object is retrieving data from the DB, but the function doesn't necessarily execute last, even if you put it at the end of the function call.
Example:
function doStuff() {
    blockUI();
    ajaxCall();
    unblockUI();
}
Is there a way for doStuff to listen for ajaxCall to complete, before firing the unBlockUI? As it is, it processes the function linearly, calling each object in order, then a separate thread is spawned to complete each one. So, though my AJAX call might take 10-15 seconds to complete, I am only blocking the user for just a split-second, due to the linear execution of the function.
There are less elegant ways around this...putting a loop to end only when a return value set by the AJAX function is set to true, or something of that nature. But that seems unnecessarily complicated and inefficient.
However you're accomplishing your Ajax routines, what you need is a "callback" function that will run once it's complete:
function ajaxCall(callback){
    // do ajax stuff...
    callback();
}
Then:
function doStuff(){
    blockUI();
    ajaxCall(unblockUI);
}
Your AJAX call should specify a callback function. You can call the unblockUI from within the callback.
SAJAX is a simple AJAX library that has more help on how to do AJAX calls.
There's also another post that describes what you're looking for.
You can do a synchronous XHR. This would cause the entire UI to block for the duration of the call (no matter how long it might take).
You need to redesign your program flow to be compatible with asynchronous flow, for example by specifying a callback function to be called after the response is processed. Check out how Prototype, jQuery, etc. accomplish this.
The answer is simple, you have to call unblockUI() when your ajax request returns the result, using jQuery you can do it like this:
function doStuff(){
    blockUI();
    jQuery.ajax({
        url: "example.com",
        type: "POST", // you can use GET or POST
        success: function(){
            unblockUI();
        }
    });
}
It sounds to me that you want the user to wait while info is being fetched from the db. What I do when I make an Ajax call for some info from the database is to display an animated gif that says "getting it..." - it flashes continually until the info is retrieved and displayed in the webpage. When the info is displayed, the animated gif is turned off/hidden and the focus is moved to the new info being displayed. The animated gif lets the user know that something is happening.
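A minimal sketch of that pattern; #spinner, #result and the URL are placeholders:

function doStuff() {
    $('#spinner').show(); // the animated "getting it..." gif
    $.ajax({
        url: '/fetch/data', // placeholder URL
        success: function (response) {
            $('#result').html(response); // display the new info
        },
        complete: function () {
            $('#spinner').hide(); // turn the spinner off either way
        }
    });
}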

In jQuery Form, 'success' is being called before 'beforeSubmit' is finished

I'm using the jQuery Form plugin to upload an image. I've assigned a fade animation to happen in the beforeSubmit callback, but as I'm running locally, it doesn't have time to finish before the success function is called.
I am using a callback function in my fade() call to make sure that one fade completes before the next one begins, but that does not seem to guarantee that the function calling it has finished.
Am I doing something wrong? Shouldn't beforeSubmit complete before the ajax call is submitted?
Here's are the two callbacks:
beforeSubmit:
function prepImageArea() {
    if (userImage) {
        userImage.fadeOut(1500, function() {
            ajaxSpinner.fadeIn(1500);
        });
    }
}
success:
function imageUploaded(data) {
    var data = evalJson(data);
    userImage.attr('src', data.large_thumb);
    ajaxSpinner.fadeOut(1500, function() {
        userImage.fadeIn(1500);
    });
}
I think you may be getting too fancy with those fade animations :)... In beforeSubmit the fadeOut is set up, but the function returns immediately, causing the submit to happen. I guess the upload is finishing in under 3 seconds, causing the new image to appear before your animations are complete.
So if you really, really want this effect, you will need to do the image fadeOut, then the spinner fadeIn, and once that is complete, trigger the upload. Something like this:
if (userImage) {
    userImage.fadeOut(1500, function() {
        ajaxSpinner.fadeIn(1500, function() {
            // now trigger the upload and you don't need the beforeSubmit anymore
        });
    });
}
else {
    // trigger the upload right away
}
Even though the beforeSubmit callback is called before submitting the form, the userImage.fadeOut function is asynchronous (i.e. it schedules the fade animation to run separately and continues execution), so it returns immediately. The fade animation takes 1.5 seconds to complete, and as you are running on localhost the AJAX response comes back faster than 1.5 seconds, so you won't see the animation finish. In real-world applications it is unlikely that AJAX requests would take less than 1.5 seconds, so you are good :)
