I have a search page whose results are rendered in a SlickGrid. It is an ajax search that executes on keyup, so it's possible for a search to be performed, for the Slick.Grid instance's render to be called, and for another result to come back before the first asynchronous render completes. I'd like to cancel the initial render as soon as the second ajax request comes back, so that there aren't two render calls taking place at the same time.
EDIT WITH EXAMPLE:
Here's what I'm doing, with alerts in place to track the execution order.
function setupGrid() {
    slickDataView = new Slick.Data.DataView();
    slickGrid = new Slick.Grid(slickGridDiv, slickDataView.rows, slickGridColumns, slickGridOptions);
    slickDataView.onRowsChanged.subscribe(function(rows) {
        slickGrid.removeRows(rows);
        slickGrid.updateRowCount();
        slickGrid.render();
    });
    slickDataView.onRowCountChanged.subscribe(function(args) {
        slickGrid.updateRowCount();
        slickGrid.render();
    });
}
function performSearch() {
    jQuery.get('searchPage.php', {MODEL_ID: userInputField.val()},
        function(results) {
            slickDataView.beginUpdate();
            alert(1);
            slickDataView.setItems(results);
            alert(2);
            slickDataView.endUpdate();
        }
    );
}
setupGrid();
userInputField.keyup(function() { performSearch(); });
I get the following alerts in this sequence when I type two numbers into the userInputField text field in quick succession:
1
1
2
There must be something else going on on your page.
The sequence you listed is impossible - JavaScript execution is never interrupted by an event getting fired. The event just gets queued up and picked up by the event loop after the current code is done executing. What you would see in your example is 1, 2, 1, 2.
You will want to throttle the AJAX calls and also cancel the callbacks from previous calls, since your AJAX responses may come back out of order and older search results can override newer ones.
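A minimal sketch of that idea (mine, not part of the answer), assuming jQuery 1.6 or later, where jQuery.get() returns a jqXHR whose abort() cancels the request and prevents its success callback from firing:

var pendingSearch = null;

userInputField.keyup(function() {
    // Cancel the in-flight search so its (now stale) callback never runs.
    if (pendingSearch) {
        pendingSearch.abort();
    }
    pendingSearch = jQuery.get('searchPage.php', {MODEL_ID: userInputField.val()},
        function(results) {
            slickDataView.beginUpdate();
            slickDataView.setItems(results);
            slickDataView.endUpdate();
        }
    ).always(function() {
        pendingSearch = null; // finished or aborted, either way no longer pending
    });
});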
I came across a peculiar issue when trying to make an ajax call and isolate the actions of the function to itself. Here is the code snippet:
$(document).on('click', 'input.action', function(event) {
    var self = this;
    $.ajax({
        url: 'http://date.jsontest.com/',
        method: 'GET',
        cache: false,
        dataType: 'json',
        success: self.process,
        error: function() { self.process(false); }
    });
    self.process = function(data) {
        if (data) {
            alert(data.time);
        }
        else {
            alert("Operation Failed!");
        }
    };
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>
<div class="container">
    <input type="button" value="Get Time" class="action"/>
</div>
Let me briefly explain what I am trying to do: on click of the button, I wish to receive some data from the server and display it. I am using a process function to process the received data. In case of an error, I reuse the process function in a different way to display the error message. I simply use the self variable to contain all the elements within the parent function. I fully understand the following:
What I know
I do not have to use self to contain the process function, because no other method needs access to it.
Because the process method in the snippet above is assigned after the ajax call, as far as the program is concerned the process function is undefined at the time of the call.
I clearly know how to fix it.
Experiment:
Click on the Get Time button
Wait for as long as you want, but see no result, which is expected because the process function is assigned after the ajax call
Click on the Get Time button again
It works now! Some time (which is probably not your time :P) is displayed!!
What I wish to know:
What just happened? Why does it work the second time and every time after? Remember, this only works for ajax calls. If it were the case that the assignment is retained in the function after calling it once, then this should work in every situation, but it does not. Here is an experiment to show that it does not work the same way when ajax calls are not used: Fiddle - Experiment
The Solution:
I am adding a sample solution based on @Felix Kling's answer below. In the sample solution, there are two buttons, Get Time and Get Date. I've attached the parameter to retrieve the time in the case of Get Time, and the date in the case of Get Date, to the object self. Interestingly, once I click on Get Time nothing happens, just like before, but if I then click on either Get Time or Get Date a second time, only the time is displayed.
What just happened?
In a simplified way, this:
var process;
// First click
ajaxCall(process); // process is still undefined
process = function() { ... };
// second click
ajaxCall(process); // process is defined
process = function() { ... };
The assignment to self.process "persists" between events because self refers to the same element.
Here is an experiment to show that it does not work the same way when ajax calls are not used: ...
It doesn't work in your fiddle because of one big difference: you are trying to execute process immediately. But it doesn't exist yet, so an error is thrown. At that point the script terminates and won't execute the rest of the function, where the function definition takes place.
In your example here, the execution of process is delayed. The script "doesn't know" that there is no function to call until the response is received.
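The asker says they already know the fix; for completeness, here is a sketch of it (the reordering is my assumption, not spelled out in the answer): assign self.process before starting the request, so that it already exists on the very first click.

$(document).on('click', 'input.action', function(event) {
    var self = this;
    // Define the handler first, so the first request can find it.
    self.process = function(data) {
        if (data) {
            alert(data.time);
        } else {
            alert("Operation Failed!");
        }
    };
    $.ajax({
        url: 'http://date.jsontest.com/',
        method: 'GET',
        cache: false,
        dataType: 'json',
        success: self.process,
        error: function() { self.process(false); }
    });
});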
There is nothing "special" going on here. Yes, the first time it returns undefined, as it is yet to be defined.
But you're attaching the function to the same element, input.action. So the next time you click the button, this already has the process method attached to it, and it gets called when clicked again.
Try adding one more button with the same class and clicking each once. Even though you've clicked the first button, clicking the second button will still not create an alert, as it has not yet had the process function attached to it.
In Ext.tree.TreePanel, when we load the tree, there is no event to check whether ALL the tree nodes are completely loaded.
What we do is recursively make asynchronous calls and let each node expand when its expanded property is set to true. How can we find out when all the async calls have completed and the TreePanel is completely loaded?
The idea behind this is that when the nodes of the TreePanel are completely loaded, we have to enable a button indicating that the tree is available to the end user for further operations.
Ext.Ajax.request(...) returns an object, and if that object contains an xhr property, the request is not completed.
var req = Ext.Ajax.request(...);
if (!req.xhr) {
    // request is finished
} else {
    // request is not finished
}
Though I would recommend returning a total count of your nodes with every request. So I'd return JSON like this:
{
    data: [...],   // treenodes
    totalCount: 100
}
Now you can check in every request's success function whether your tree store already contains all the nodes.
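A rough sketch of that check (the counter, handler, and button names are illustrative assumptions, not part of the answer):

var loadedCount = 0;

function onNodesLoaded(response) {
    var json = Ext.decode(response.responseText);
    loadedCount += json.data.length; // accumulate the nodes received so far
    if (loadedCount >= json.totalCount) {
        // Every node has arrived: the tree is ready for the end user.
        Ext.getCmp('treeReadyButton').enable();
    }
}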
Thanks for your input @JuHwon. However, your suggestion was not well suited to that scenario, because the Ajax calls were out of the box; as I mentioned, the tree triggers calls based on a property (maybe isSelectable = true, as far as I can recall now). So, to fulfil this use case, I used the setTimeout and clearTimeout methods to dynamically push back a delay of 2000 ms. I didn't find anything closer.
I have a search box on my web page that has check boxes in order for the user to filter their results. Only one check box can be checked at once.
When a check box is clicked my code runs off and applies the filter to the list and returns the correct results.
The problem I have is that when a check box is clicked multiple times in quick succession, it queues the requests and pulls them back one by one. This can take a while if a check box is checked and then un-checked multiple times.
Is there any way in Javascript to inform the function that it has been called again and it should stop everything other than this last request?
You want to wrap your onclick callback in a debouncing function like
http://underscorejs.org/#debounce
Say you have this
function search() {
    // ...
}
$jquery(".myFilterCheckboxes").click(search);
You should be able to just change the above to:
// Only allow one click event / search every 500ms:
$jquery(".myFilterCheckboxes").click(_.debounce(search, 500));
There are tons of debouncing functions out there, and writing your own really isn't a big deal if you can't or don't want to include underscore.js.
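For instance, a minimal hand-rolled debounce might look like this (a sketch, not underscore's actual implementation):

function debounce(fn, wait) {
    var timer = null;
    return function() {
        var context = this, args = arguments;
        // Restart the countdown on every call; fn only runs once calls
        // stop arriving for `wait` milliseconds.
        clearTimeout(timer);
        timer = setTimeout(function() {
            fn.apply(context, args);
        }, wait);
    };
}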
My first thought was towards debouncing because you mentioned multiple clicks creating multiple events in a short period. Debouncing is used really often for things like type-ahead search or autocomplete to provide a little space between key presses for thinking time.
As others have mentioned it may make more sense to simply disable the checkboxes / click event while your search is running. In that case, try something like this:
function disableClick(elem) {
    elem.unbind("click");
    elem.attr("disabled", true);
}

function enableClick(elem, onclick) {
    // Enable click events again
    elem.live("click", onclick);
    // Enable the checkboxes
    elem.removeAttr("disabled");
}
function search() {
    var boxes = $jquery(".myFilterCheckboxes");
    disableClick(boxes);
    $.get(...).always(function() {
        enableClick(boxes, search);
    });
}

$jquery(".myFilterCheckboxes").live("click", search);
Why disable the click event and add the disabled attribute to the checkboxes instead of just using a global lock variable? Well, global locks can be somewhat error prone, but more than that, we already have a global object that matters: the DOM. If we just modify the DOM state, we get the right behavior and signal to our users that they should chill out on the checkboxes until the search completes.
That said, it probably makes sense with any kind of locking / unbinding scenario to indicate to the user with a loading spinner or something that you're doing work.
You can use a lock pattern:
http://jsfiddle.net/RU6gL/
HTML
<input type="checkbox" onclick="fire()" >CB1
<br />
<input type="checkbox" onclick="fire()" >CB2
JS
var function_lock = false;

var fire = function() {
    // First, check the lock isn't already reserved. If it is, leave immediately.
    if (function_lock) return;

    // We got past the lock check, so immediately lock the function to
    // stop others entering.
    function_lock = true;

    console.log("This message will appear once, until the lock is released");

    // Do your work. I use a simple Timeout. It could be an Ajax call.
    window.setTimeout(function() {
        // When the work finishes (eg Ajax onSuccess), release the lock.
        function_lock = false;
    }, 2000);
};
In this example, the function will only run once, no matter how many times the checkboxes are clicked, until the lock is released after 2 seconds by the timeout.
This pattern is quite nice, because it gives you control over when you release the lock, rather than relying on a timed interval like 'debounce'. For example, it will work with Ajax. If your checkbox is triggering an Ajax call to do the filtering, you can:
On first click, set the lock
Call the Ajax endpoint. Subsequent clicks won't call the Ajax endpoint.
In the Ajax success function, reset the lock.
The checkboxes can now be clicked again.
HTML
<input type="checkbox" onclick="doAjax()" >CB2
JS
var ajax_lock = false;

function doAjax() {
    // Check the lock.
    if (ajax_lock) return;

    // Acquire the lock.
    ajax_lock = true;

    // Do the work.
    $.get("url to ajax endpoint", function() {
        // This is the success function: release the lock.
        ajax_lock = false;
    });
}
The issue here is that the checkbox is repeatedly clicked. You should instead disable your checkbox (which also disables the click event on the element) while you are processing, and then re-enable it when you're done processing.
Debouncing is a great idea, but you don't always know how long your processing function will take to finish.
Here's a simple example using jquery promise to re-enable the checkbox after some processing
http://jsfiddle.net/94coc8sd/
with the following code:
function processStuff() {
    var dfd = $.Deferred();
    // do some processing; when finished,
    // resolve the deferred object
    window.setTimeout(function() {
        dfd.resolve();
    }, 2000);
    return dfd.promise();
}

function startProcessing() {
    $('#processingCheckbox').attr('disabled', 'disabled');
    var promise = processStuff();
    promise.done(enableCheckbox);
}

function enableCheckbox() {
    $('#processingCheckbox').removeAttr('disabled');
}

$('#processingCheckbox').on('click', startProcessing);
I will try to explain my actual setup, the idea behind it, what breaks, what I've tried around it.
The context
I have a PHP5.3 backend feeding "events" (an event being a standard array containing some data, among which a unique sequential number) to Javascript (with jQuery 1.7.x). The events are retrieved using jsonp (on a subdomain) and long-polling on the server side. The first event has the id 1, and then it increments with each new event. The client keeps track of the "last retrieved event id", and that value starts at 0. With each long-polling request, it provides that id so the backend only returns events that occurred after that one.
Events are processed in the following manner. Upon being received (through the jsonp callback), they are stored in an eventQueue variable, and the "last retrieved event id" is updated to that of the last event received and stored in the queue. Then a function is called that processes the next queued event.
That function checks whether an event is already being processed (by means of another variable that is set whenever an event starts being processed). If one is, it does nothing, so the call stack brings us back to the jsonp callback, where a new long-polling request is emitted. (That will repeat the process of queueing new events while the others are processed.) However, if no event is currently being processed, it verifies whether there are events left in the queue, and if so it processes the first one (the one with the lowest id).
"Processing an event" can mean various tasks pertinent to my application, but not to the problem I have or to the context; for example, updating a variable, a message on the page, etc. Once an event is deemed "done being processed" (some events make an ajax call to get or send data, in which case this happens in their ajax success callback), a call to another function called eventComplete is made. That function deletes the processed event from the event queue, makes sure the variable that tracks whether an event is being processed is set to false, and then calls the function that processes the event queue. (So it processes the next, lowest-id, event.)
The problem
This works really well, on all tested major browsers too (tested on Internet Explorer 8 and 9, Chrome, Opera, and Firefox). It is also very snappy, due to the use of long polling. It's also really nice to get the whole "history" (most events generate textual data that gets appended to a sort of console on the page) of what has happened, and to be in the exact same state of the application even after reloading the page.
However, this becomes problematic when the number of events gets high. Based on estimates, I would need to be able to handle as many as 30,000 events. In my tests, even at 7,000 events things start to go awry. Internet Explorer 8 stack overflows around 400 events. Chrome doesn't load all events, but gets close (and breaks, though not always at the same point, unlike IE8). IE9 and Firefox handle everything well, and hang for 2-3 seconds while all events are processed, which is tolerable. I'm thinking, however, that it might just be a matter of a few more events before they break as well. Am I being too demanding of current web browsers, or is there something I got wrong? Is there a way around it? Is my whole model just wrong?
Possible solutions
I fiddled around with some ideas, none of which really worked. I tried forcing the backend to output no more than 200 events at a time, and adding the new poll request only after the whole current queue was done processing. I still got a stack overflow. I also tried deleting the eventQueue object after it was done processing (even though it is empty by then) and recreating it, in the hope that it might free some underlying memory. I'm short on ideas, so any idea, pointer, or general advice would be really appreciated.
Edit:
I had an enlightenment! I think I know exactly why all of this is happening (but I'm still unsure on how to approach it and fix it), I will provide some basic code excerpts too.
var eventQueue = new Object();
var processingEvent = false;
var lastRetrievedEventId = 0;
var currentEventId = 0;

function sendPoll() {
    // Standard jsonp request (to an intentionally slow backend, i.e. long-polling),
    // callback set to pollCallback(). Provide currentEventId to the server to only get
    // the events starting from that point.
}

function pollCallback( data ) {
    // Make sure the data isn't empty; this happens if the jsonp request
    // expires (30s in my case) without getting any new data.
    if( !jQuery.isEmptyObject( data ) )
    {
        // Add each new event to the event queue.
        $.each(data.events, function() {
            eventQueue[ this.id ] = this;
            // Since we just put the event in the queue, we know it is
            // officially the last one "retrieved".
            lastRetrievedEventId = this.id;
        });
        // Process the next event; we know there has to be at least one in queue!
        processNextEvent();
    }
    // Go look for new events!
    sendPoll();
}

function processNextEvent() {
    // Do not process events if one is currently being processed; that happens
    // when an event contains an asynchronous function, like an AJAX call.
    if( !processingEvent )
    {
        var nextEventId = currentEventId + 1;
        // Before accessing it directly, make sure the "next event" is in the queue.
        if( Object.prototype.hasOwnProperty.call(eventQueue, nextEventId) )
        {
            processingEvent = true;
            processEvent( eventQueue[ nextEventId ] );
        }
    }
}

function processEvent( event ) {
    // Do different actions based on the event type.
    switch( event.eventType ) {
        case SOME_TYPE:
            // Do stuff pertaining to SOME_TYPE.
            eventComplete( event );
            break;
        case SOME_OTHER_TYPE:
            // Do stuff pertaining to SOME_OTHER_TYPE.
            eventComplete( event );
            break;
        // Etc. Many more cases here. If there is an AJAX call,
        // the eventComplete( event ) is placed in the success: callback
        // of that AJAX call; I do not want events to be processed in the wrong order.
    }
}

function eventComplete( event ) {
    // The event has completed; time to process the event after it.
    currentEventId = event.id;     // Since it was fully processed, it is now the most current event.
    delete eventQueue[ event.id ]; // It was fully processed, we don't need it anymore.
    processingEvent = false;
    processNextEvent(); // Process the next event in queue. Most likely the source of all my woes.
}

function myApplicationIsReady() {
    // The DOM is fully loaded, my application has initiated all its data and variables;
    // start the long polling.
    sendPoll();
}

$(function() {
    // Initializing my application.
    myApplicationIsReady();
});
After looking at things, I understood why the callstack gets full with many events. For example (-> meaning calls):
myApplicationIsReady() -> sendPoll()
And then when getting the data:
pollCallback() -> [ processNextEvent() -> processEvent() -> eventComplete() -> processNextEvent() ]
The part in brackets is the one that loops and causes the callstack overflow. It doesn't happen with a low amount of events because then it does this:
pollCallback() -> processNextEvent() -> processEvent() -> eventComplete() -> sendPoll()
That would be with two events, the first one containing an asynchronous call. (So it gets to the second event, which doesn't get processed because the first one isn't done processing; instead it calls the polling function, which frees the whole call stack, and eventually the callback from that resumes the activity.)
Now it is not easy to fix and it was designed like that in the first place, because:
I do not want to lose events (As in, I want to make sure all events are processed).
I do not want to hang the browser (I can't use synchronous AJAX calls or an empty loop waiting for something to finish).
I absolutely want events to get processed in the right order.
I do not want for events to get stuck in the queue and the application not processing them anymore.
That is where I need help now! To do what I want it sounds like I need to use chaining, but that is exactly what is causing my callstack issues. Perhaps there is a better chaining structure that lets me do all that, without going infinitely deep in the callstack and I might have overlooked it. Thank you again in advance, I feel like I'm making progress!
How about instead of calling functions recursively, use setTimeout(func, 0)?
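Applied to the code above, the change could be as small as this (a sketch; only the last line of eventComplete changes):

function eventComplete( event ) {
    currentEventId = event.id;
    delete eventQueue[ event.id ];
    processingEvent = false;
    // Defer the next step to the event loop instead of calling it directly,
    // so the call stack unwinds between events and can never overflow.
    window.setTimeout( processNextEvent, 0 );
}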
I am using the pub / sub jQuery plugin by Peter Higgins. I have run into a problem with JavaScript validation.
This is the crux of the problem...
$.subscribe('/make', function(form_id, fields, path, type) {
    for (var i = 0; i < fields.length; i++) {
        $.publish('/validation/field', [ path + 'check_field', $('*[name="' + fields[i] + '"]'), fields ]);
    }
    if ($('#' + form_id + ' .error').length > 0) {
        alert('There are errors, please fix the errors before continuing.');
        return;
    }
The /validation/field subscriber will append errors to form fields. When you run this the first time, the errors appear, but everything runs so quickly that an ajax request is still sent to save the form. When the form is submitted a second time, the function is stopped correctly, as the error classes have been counted by then.
Is there a way around this?
If the validation is asynchronous
...the usual way you deal with this is to use a callback. In this case, you'd be looking for a callback defined by the validation that tells you when the validation has finished, at which point you could use your code for counting the resulting errors.
If the validation is synchronous
The above assumes that the validation involves an asynchronous activity of some kind. It doesn't look like that pub/sub plugin provides any asynchronicity, so this would be down to what your subscriber for /validation/field does. If it does an ajax call or a setTimeout or similar, then you'll need a callback.
If the /validation/field subscriber does the validation synchronously, then I'm surprised you're having a problem with the fields not being counted correctly afterward. But if you are, you can probably solve it by giving the browser just a moment to breathe:
$.subscribe('/make', function(form_id, fields, path, type) {
    for (var i = 0; i < fields.length; i++) {
        $.publish('/validation/field', [ path + 'check_field', $('*[name="' + fields[i] + '"]'), fields ]);
    }

    // Check results *after* giving the browser a moment to breathe:
    setTimeout(function() {
        if ($('#' + form_id + ' .error').length > 0) {
            alert('There are errors, please fix the errors before continuing.');
            return;
        }
        // (...your code left off here, but I'm guessing the rest of the
        // function would also need to be within this new anonymous function...)
    }, 0);
});
The 0 parameter to setTimeout means "call me back in zero milliseconds", but no browser will actually do that — it'll typically be 4-10 milliseconds later. The point is that it will be asynchronous, after the browser has had a chance to catch up and re-invoke the JavaScript layer.
Be careful though, make sure that the thing doing the validating is doing it synchronously. If it isn't, if it's doing anything async, then the above introduces a race condition into your code where sometimes it will seem to work, other times it won't work. Race conditions will bite you, so double-check. :-)
Does the subscribe use ajax to call the server?
If so, then that is your problem, as ajax is asynchronous.
The subscribe method then exits immediately after sending the call, and the callback function is invoked on the ajax response.
This means that any code after the subscribe call probably runs before the ajax request finishes, so there will not be any errors yet.
You then need to move any code following the subscribe call into the callback method instead, to be sure it runs after the ajax call.
To prevent any other action, you could set a global variable, e.g. validating = true, and have any other code check that.
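A minimal sketch of that guard (all names here are illustrative assumptions): set the flag before validation starts, clear it when validation finishes, and have the save code respect it.

var validating = false;

function startValidation(form_id, fields, path) {
    validating = true;
    // ...publish the /validation/field topics here; whatever code runs
    // when validation finishes should set validating = false again...
}

function trySaveForm(form_id) {
    if (validating) return; // validation still in flight, don't save yet
    if ($('#' + form_id + ' .error').length > 0) return; // errors present
    // ...send the ajax request that saves the form...
}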