Delay required in between the execution of two statements - javascript

I am working with ExtJS 4.2, and at one place I am loading the store object like this:
var userDetailStore = Ext.create('Ext.data.Store', {
    model: 'Person.DetailsModel',
    autoLoad: true,
    proxy: {
        type: 'ajax',
        method: 'POST',
        url: 'getValueAction.action',
        reader: {
            type: 'json',
            root: 'details'
        },
        writer: {
            type: 'json',
            root: 'details'
        }
    },
    fields: ['id', 'loginName', 'referenceId', 'name']
}); // Here I load the store, which will definitely contain a list of values.
In the very next line I want to get the referenceId of the first value from the store object, like this:
var empId = userDetailStore.getAt(0).get('referenceId')
I am getting an error because at that point the getCount() of the store object userDetailStore is giving me zero. But if I write an alert statement like alert('loading data'); before the line where I get the referenceId, then the code works fine and userDetailStore.getCount() gives me the exact value.
So I think some kind of delay is required between loading the store and then using the store, but I don't want to show an alert. I have even used a sleep() method in place of the alert statement, but that is not working either. (Besides, I don't want to freeze the browser by executing sleep().)
Am I doing anything wrong while loading the store? Is there a general way to execute the code that uses the store only after the store is completely loaded?
Somebody please help me out here.
Regards :
Dev

Vijay's answer is correct, but I thought I'd expand on the concept so that it's clear how this answer fits into what you're doing.
It's important to understand that when you make an AJAX request, the request is asynchronous. What this means in practical terms is that (as you found out) the remainder of your calling script does not wait for the asynchronous process to complete. Rather, the moment that you make an asynchronous request, your script is going to continue on its merry way, executing the very next line of code.
So if you think about it, it makes perfect sense why you were not seeing a "count" in your store. While your async request was in the process of going to the server, getting the result, and then returning it to your request, the rest of your code kept right on executing, oblivious to what was happening in the async request (and this is precisely why async requests are powerful and awesome).
This is also why adding the alert seemed to "fix" your problem. When you call alert(), you literally halt execution of your script at the point of the alert. However, since your request for data was asynchronous, the time it took you to click the "OK" button of the alert (and hence resume processing of your script) gave the async request enough time to complete its lifecycle and update the original calling object.
In light of this, it's understandable why it would seem that a "delay" would be a desirable way to go, since the "delay" (or really, "halting") of the alert fixed your issue (at least on the surface). However, with async requests, you can never really know how long it's going to take to complete. If you have a large response, or there is unusual network latency, or any number of other issues, the hard-coded delay might work, but it also might not. Most maddening of all is that you'd never get consistent results, and you would constantly be increasing the "delay" in order to accommodate all the things that could contribute to your async request taking longer and longer.
This is why the load() event of the store (and callbacks in general) is such a critical concept to understand and implement. By listening for the load() event, and then executing what code you need only within the context of that event firing, you can know for sure that the store's async request for data has completed.
If you've not used callbacks and event handling before, it does take a bit of getting used to in order to break out of the linear, procedural mindset. However, when dealing with AJAX requests in general, and event-driven frameworks like ExtJS 4 in particular, it's a concept you need to embrace in order to build effective and consistent applications.
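To make that concrete, here is a minimal sketch of the idea applied to the store from the question (the listeners config and the load event's (store, records, successful) signature are the standard Ext 4 ones; the proxy config is elided for brevity):

var userDetailStore = Ext.create('Ext.data.Store', {
    model: 'Person.DetailsModel',
    autoLoad: true,
    proxy: { /* same ajax proxy as in the question */ },
    fields: ['id', 'loginName', 'referenceId', 'name'],
    listeners: {
        // fires only after the AJAX response has been read into the store
        load: function (store, records, successful) {
            if (successful && store.getCount() > 0) {
                var empId = store.getAt(0).get('referenceId');
                // ...continue working with empId here
            }
        }
    }
});

Anything that depends on the loaded records belongs inside that load handler (or in a function it calls), not on the line after Ext.create().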

Use the load event to get the count after the store is fully loaded:
userDetailStore.on('load', function () {
    alert("Fully loaded");
});

Alternatively, set autoLoad to false, and on some action you can use load() to load your store:
store.load({
    callback: function (records, operation, success) {
        // do something after the load finishes
    },
    scope: this
});

Related

Delaying a setTimeout()

I'm having an issue with some asynchronous JavaScript code that fetches values from a database using ajax.
Essentially, what I'd like to do is refresh a page once a list has been populated. For that purpose, I tried inserting the following code into the function that populates the list:
var timer1;
timer1 = setTimeout([refresh function], 1000);
As you might imagine, this works fine when the list population takes less than 1 second, but causes issues when it takes longer. So I had the idea of inserting this code into the function called on the success of each ajax call:
clearTimeout(timer1);
timer1 = setTimeout([refresh function], 1000);
So in theory, every time a list element is fetched the timer should reset, meaning that the refresh function should only ever be called 1 second after the final list element is successfully retrieved. However, in execution all that happens is that timer1 is reset once, the first time the second block of code is reached.
Can anybody see what the problem might be? Or if there's a better way of doing this? Thanks.
==========
EDIT: To clear up how my ajax calls work: one of the issues with the code's structure is that the ajax calls are actually nested; the callback method of the original ajax call is itself another ajax call, whose callback method contains a database transaction (incorrect - see below). In addition, I have two such methods running simultaneously. What I need is a way to ensure that ALL calls at all levels have completed before refreshing the page. This is why I thought that giving both methods one timer, and resetting it every time one of the callback methods was called, would keep pushing its execution back until all threads were complete.
Quite honestly, the code is very involved-- around 140 lines including auxiliary methods-- and I don't think that posting it here is feasible. Sorry-- if no one can help without code, then perhaps I'll bite the bullet and try copying it here in a format that makes some kind of sense.
==========
EDIT2: Here's a general workflow of what the methods are trying to do. The function is a 'synchronisation' function, one that both sends data to and retrieves data from the server.
I. Function is called which retrieves items from the local database
   i. Every time an item is fetched, it is sent to the server (ajax)
      a. When the ajax calls back, the item is updated locally to reflect its success/failure
II. A (separate) list of items is retrieved from the local database
   i. Every time an item is fetched, an item matching that item's ID is fetched from the server (ajax)
      a. On successful fetch from server, the items are compared
      b. If the server-side item is more recent, the local item is updated
So the places I inserted the second code block above are in the 'i.' sections of each method, in other words, where the ajax should be calling back (repeatedly). I realize that I was in error in my comments above; there is actually never a nested ajax call, but rather a database transaction inside an ajax call inside a database transaction.
You're doing pretty well so far. The trick you want to use is to chain your events together, something like this:
function refresh()
{
    invokeMyAjaxCall(param1, param2, param3, onSuccessCallback, onFailureCallback);
}

function onSuccessCallback()
{
    // Update my objects here

    // Once all the objects have been updated, trigger another ajax call
    setTimeout(refresh, 1000);
}

function onFailureCallback()
{
    // Notify the user that something failed

    // Once you've dealt with the failures, trigger another call in 1 sec
    setTimeout(refresh, 1000);
}
Now, the difficulty with this is: what happens if a call fails? Ideally, it sounds like you want to ensure that you are continually updating information from the server, and even if a temporary failure occurs you want to keep going.
I've assumed here that your AJAX library permits you to do a failure callback. However, I've seen some cases when libraries hang without either failing or succeeding. If necessary, you may need to use a separate set of logic to determine if the connection with the server has been interrupted and restart your callback sequence.
EDIT: I suspect that the problem you've got is a result of queueing the next call before the first call is done. Basically, you're setting up a race condition: can the first call finish before the next call is triggered? It may work most times, or it may work once, or it may work nearly all the time; but unless the setTimeout() is the very last statement in your "response-processing" code, this kind of race condition will always be a potential problem.
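If the two sequences in your edit really do run in parallel, another option is to skip the timer entirely and count the outstanding calls, refreshing only once the count drops back to zero. This is only a rough sketch with made-up function names (beginRequest/endRequest are not part of any library):

var pending = 0;

function beginRequest() {
    pending++;
}

function endRequest() {
    pending--;
    // only refresh once the last outstanding call, at any nesting level, has finished
    if (pending === 0) {
        refresh();
    }
}

// Call beginRequest() right before each ajax call or local transaction is started,
// and endRequest() as the very last statement of its success/error callback.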

Delay script until all messages have been passed?

As usual, I have Googled this a fair bit and read up on the Message Passing API, but, again, I've had to resort to the good fellas at StackOverflow.
My question is this: When passing messages between a Google Chrome extension's background page and content script, is there any way to make it asynchronous - that is, to delay the JavaScript until the messages are detected as having been successfully passed?
I have a function immediately after the message-passing function that makes use of the localStorage data that is passed. On first run the script always results in an error, due to the data not being passed fast enough.
Currently, I'm circumventing this with setTimeout(nextFunction, 250); but that's hardly an elegant or practical solution, as the amount and size of the values passed is always going to change and I have no way of knowing how long it needs to pass the values. Plus, I would imagine that passing times depend on the browser version and the user's system.
In short, I need it to be dynamic.
I have considered something like
function passMessages(){
    chrome.extension.sendRequest({method: "methodName"}, function(response) {
        localStorage["lsName"] = response.data;
    });
    checkPassedMessages();
}

function checkPassedMessages(){
    if (!localStorage["lsName"]){
        setTimeout(checkPassedMessages, 100); // Recheck until data exists
    } else {
        continueOn();
    }
}
but I need to pass quite a lot of data (at least 20 values) and, frankly, it's not practical to check !localStorage["lsName1"] && !localStorage["lsName2"] and so on. Plus, I don't even know if that would work or not.
Does anyone have any ideas?
Thanks
Update: Still no answer, unfortunately. Can anyone offer any help at all? :/
I don't know whether I'm interpreting your question wrong. As far as I understand, you are sending a request from your extension page to a content script. The request handler in the content script does some operation on the message passed, after which you need control back in the extension page. If this is what you need, you have everything in the Google Extension documentation. The following code works:
// Passing the message
function passMessages(){
    chrome.extension.sendRequest({method: "methodName"}, function(response) {
        // callback function that will be called from the receiving end
        continueOn();
    });
}

// Receiving the message
chrome.extension.onRequest.addListener(
    function(request, sender, sendResponse) {
        // Do the required operation with the message passed and call sendResponse
        sendResponse();
    });
You can solve the general case of this problem (i.e., on any platform where you are communicating between distinct threads of execution) by collecting the information passed, while waiting for some sort of following "go" message before you begin processing the collected information. You can use the same idea to have the sender wait for the complete reply.
Of course it's possible that your particular platform provides tools for doing this; but if not, you can always build the general solution by hand.
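As a rough sketch of that idea in the extension context (the method names "setValue" and "go", and the key/value fields, are made up for illustration): the receiving side just buffers whatever it gets and only calls continueOn() once an explicit "go" message arrives.

var collected = {};

chrome.extension.onRequest.addListener(function (request, sender, sendResponse) {
    if (request.method === "setValue") {
        // just store the value; do not act on it yet
        collected[request.key] = request.value;
    } else if (request.method === "go") {
        // the sender says everything has been passed; now it is safe to process
        continueOn(collected);
    }
    sendResponse({});
});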

how to silently guarantee executing an ASP.NET MVC3 action on page unload

I need to execute an action of a controller when a user leave a page (close, refresh, go to link, etc.). The action code is like:
public ActionResult WindowUnload(int token)
{
    MyObjects[token].Dispose();
    return Content("Disposed");
}
On window unload I do an Ajax request to the action:
$(window).unload(function ()
{
    $.ajax({
        type: "POST",
        url: "#Url.Action("WindowUnload")",
        data: { token: "#ViewData["Token"]" },
        cache: false,
        async: true
    });
    //alert("Disposing.");
})
The above ajax request does not come to my controller, i.e., the action is not executed.
To make the code above work I have to uncomment the alert line, but I don't want to fire an alert on the user.
If I change the async option to false (with the alert commented out), then it sometimes works, but not always. For example, if I refresh the page several times too fast, the action will not be executed for every unload.
Any suggestions how to execute the action on every unload without alert?
Note, I don't need to return anything from action to the page.
Updated: answers summary
It is not possible to reliably make a request on unload, since that is not proper or expected behavior on unload. So it is better to redesign the application and avoid doing an HTTP request on window unload.
If it is not avoidable, then there are two common solutions (described in the question):
1. Call ajax synchronously, i.e., async: false.
Pros: works in most cases; silent.
Cons: does not work in some cases, e.g., when a user refreshes the window several times too fast (observed in Firefox).
2. Use alert on success or after the ajax call.
Pros: seems to work in all cases.
Cons: is not silent and fires a pop-up alert.
According to the unload documentation, with async: false it should work as expected. However, this will always be a bit shaky - for example, the user can leave your page by killing or crashing the browser and you will not receive any callback. Also, browser implementations vary. I fear you won't find anything failproof.
HTTP is stateless and you can never get a reliable way to detect that the user has left your page.
Suggested events:
Session timeout (if you are using sessions)
The application is going down
A timer (need to be combined with the previous suggestion)
Remove the previous token when a new page is visited.
Why does this need to happen at all?
From the code snippet you posted, you are attempting to use this to dispose of objects server-side? You are supposed to call Dispose to free up any unmanaged resources your objects are using (such as database connections).
This should be done during the processing of each request. There shouldn't be any unmanaged resources awaiting a dispose when the client closes the browser window.
If this is the way you are attempting this in the manner noted above the code needs to be reworked.
Have you tried onbeforeunload()?
$(window).bind('beforeunload', function()
{
    alert('unloading!');
});
or
window.onbeforeunload = function() {
    alert('unloading!');
};
From the comment you made to #Frazzell's answer it sounds like you are trying to manage concurrency. So on the chance that this is the case here are two common method for managing it.
Optimistic concurrency
Optimistic concurrency adds a timestamp to the table. When the client edits the record the timestamp is included in the form. When they post their update the timestamp is also sent and the value is checked to make sure it is the most recent in the table. If it is, the update succeeds. If it is not then someone else got in sooner with an update so it is discarded. How you handle this is then up to you.
Pessimistic concurrency
If you often experience concurrency clashes then pessimistic concurrency may be better. Here, when the client edits the record, a flag is set on that row to lock it. This remains until the client completes the edit, and no other user can edit that row in the meantime. This method avoids users losing changes but adds administration overhead to the application. Now you need a way to release unwanted locks. You also have to inform the user through the UI that a row is locked for edit.
In my experience it is best to start with optimistic concurrency. If I have lots of people reporting problems, I try to find out why people are having these conflicts. It may be that I have to break down some entities into smaller types because they have become responsible for doing too many jobs.
This won't work, and even if you were somehow able to make it work, it would give you lots of headaches later on, because this is not how the browser/HTTP is supposed to be used. When the page is unloading, the browser will fire the unload event and then unload the page (you cannot make it wait, not even by making sync ajax calls). If a request is still in flight when the browser unloads the page, that request gets cancelled, which is why you see the call reach the server sometimes and sometimes not. If you could tell us why you want to do this, we could suggest a better approach.
You can't. The only thing you can do is prompt the user to stay and hope for the best. There are a whole host of security concerns here.

JQUERY or JS is there a way to detect anytime the Window is loading? Basically any network activity?

Is there a way with JQUERY or Javascript, to detect anytime the browser window is loading something, making an ajax call, loading an image, etc... Basically any network activity?
Something along the lines of this jQuery (put in the HEAD) should work.
var loading = true;
$(window).load(function () {
    loading = false;
}).ajaxStart(function () {
    loading = true;
}).ajaxComplete(function () {
    loading = false;
});
Read up on ajaxStart for more details.
My answer below and the previous users' answers are only attempting to check if the browser is currently making an XmlHttpRequest (i.e. ajaxy) request.
You asked if you could tell if the browser was making any network request (i.e. downloading images/css or perhaps a long running 'comet' request). I don't know of any javascript API that would tell you this - they may exist but I suspect that if they do then they would be browser specific [if anyone out there knows the answer to this please chip in]. Obviously tools like Firebug and dynaTrace can detect network activity but I think these tool "hook in" to the browser a lot deeper down than javascript code.
Having said all that, if you want to count XHRs spawned by jQuery then dave1010's answer seems like a good one.
However, I think it is prone to some race condition problems:
According to the docs (http://api.jquery.com/ajaxStart/)
Whenever an Ajax request is about to be sent, jQuery checks whether there are any other outstanding Ajax requests. If none are in progress, jQuery triggers the ajaxStart event. Any and all handlers that have been registered with the .ajaxStart() method are executed at this time.
So if a long running XHR was started and another one was started before the first had completed, the ajaxStart handler would only be called once. The 2nd XHR could complete before the first one and set loading = false, even though the first request is still in progress.
Looking at the jQuery 1.4 source seems to confirm this. Interestingly, jQuery 1.4 has a count of active XHRs (jQuery.active) but this is not mentioned in the docs and so it is probably best not to use it (which is a pity because it would make life a bit easier).
See http://gist.github.com/277432#LID5052 for the code that checks to see that $.active is 0 before invoking ajaxStart handlers.
[I think] The 'ajaxSend' global event handlers are called before each and every XHR. Using this in preference to 'ajaxStart' should help but we will still need to keep a count of active requests as opposed to a simple "loading" flag.
Perhaps something like the following will work?
var activeXhrCount = 0;
$(document).ajaxSend(function() {activeXhrCount++;}).ajaxComplete(function(){activeXhrCount--;});
Be aware that this will give incorrect answers if any code calls $.ajax with the global option set to false, so it is hardly bulletproof.
Also keep in mind that activeXhrCount only counts XHRs spawned by jQuery - those from other libraries that you may utilize will not be counted.
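With that counter in place, a rough (and, per the caveats above, not bulletproof) way to ask "is anything in flight?" is just to test it; for example:

// returns true while at least one jQuery-spawned XHR is outstanding
function isAjaxActive() {
    return activeXhrCount > 0;
}

setInterval(function () {
    // poll the counter, e.g. to show or hide a global loading indicator
    console.log(isAjaxActive() ? "requests in flight" : "idle");
}, 200);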
I think you have to build something yourself.
I did something like this before (like when Gmail shows the "loading" indicator at the top).
The idea is to keep an array, add/remove entries from it as calls start and finish, and keep checking it to know whether there is an open connection.
The code should look like this:
var liveAjax = [];

// making an ajax call: remember its slot, then clear it when the call finishes
var index = liveAjax.length;
liveAjax[index] = true;
$.ajax({
    url: 'ajax/test.html',
    success: function(data) {
        liveAjax[index] = false; // or: delete liveAjax[index];
    }
});
Then check for live calls with:
setInterval(function(){
    // loop over liveAjax; if any entry is still true, an ajax call is open
    var busy = liveAjax.some(function (active) { return active; });
    // ...show or hide a loading indicator based on `busy`
}, 200);

Any issue with setTimeout calling itself?

I have read a handful of setTimeout questions and none appear to relate to the question I have in mind. I know I could use setInterval(), but it is not my preferred option.
I have a web app that could in theory be running for a day or more without a page refresh, making more than one check a minute. Am I likely to get tripped up if my function calls itself several hundred (or more) times using setTimeout? Will I reach "stack overflow", for example?
If I use setInterval there is a remote possibility of two requests being run at the same time especially on slow connections (where a second one is raised before the first is finished). I know I could create flags to test if a request is already active, but I'm afraid of false positives.
My solution is to have a function call my jQuery ajax code, do its thing, and then, as part of ajaxComplete, do a setTimeout to call itself again in X seconds. This method also allows me to alter the duration between calls, so if my server is busy (slow), one reply can set a flag to increase the time between ajax calls.
Sample code of my idea...
function ServiceOrderSync()
{
    // 1. Sync stage changes from the client to the server
    // 2. Sync new orders from the server to this client
    $.ajax({
        "data": dataurl,
        "success": function (data) {
            // process my data
        },
        "complete": function (data) {
            // Queue the next sync
            setTimeout(ServiceOrderSync, 15000);
        }
    });
}
You won't get a stack overflow, since the call isn't truly recursive (I call it "pseudo-recursive")
JavaScript is an event driven language, and when you call setTimeout it just queues an event in the list of pending events, and code execution then just continues from where you are and the call stack gets completely unwound before the next event is pulled from that list.
p.s. I'd strongly recommend using Promises if you're using jQuery to handle async code.
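For example, here is a minimal sketch of the same polling loop written against the jqXHR promise methods instead of the complete callback (assuming a jQuery version new enough to expose .done()/.always(); the ajax options simply mirror the ones in the question):

function ServiceOrderSync()
{
    $.ajax({ "data": dataurl })
        .done(function (data) {
            // process my data
        })
        .always(function () {
            // queue the next sync only after this one has fully finished
            setTimeout(ServiceOrderSync, 15000);
        });
}

The behaviour is the same as the callback version; the promise form just makes it easier to compose with other async work later.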
