Firefox not starting onreadystatechange function - javascript

I made some JavaScript code for my website. It works without problems in Opera and Chrome, but not in Firefox.
Here is the script:
function checkstate(who,row,cell) {
    var zwrot="";
    var mouseEvent='onmouseover="javascript:bubelon(this.id);" onmouseout="bubeloff();"';
    var cellid="";
    ajax=new XMLHttpRequest();
    ajax.onreadystatechange=function(aEvt) {
        if (ajax.readyState===4 && ajax.status===200) {
            alert("im here!");
        }
    };
    ajax.open('GET',"oth/work_prac_stan.php?usr="+who,false);
    ajax.send();
}
function sprawdzstan() {
    var lol="";
    var table = document.getElementById("usery");
    var re = /^<a\shref\=/g;
    for (var i = 1, row; row = table.rows[i]; i++) {
        if (row.cells[0].innerHTML.match(re)) {
            checkstate(row.cells[1].innerHTML,row,2);
        } else {
            checkstate(row.cells[0].innerHTML,row,1);
        }
    }
}
The problem is that Firefox is not running the function assigned to onreadystatechange. I checked in Firebug that the response from the PHP file is correct.
Where is the problem? It works in Chrome and Opera; Firefox just doesn't, with no error in the console, nothing.

Updated answer
According to Mozilla's docs, you don't use onreadystatechange with synchronous requests. Which kind of makes sense, since the request doesn't return until the ready state is 4 (completed), though I probably wouldn't have designed it that way.
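In other words, making the request asynchronous (third argument of open() set to true) is the fix. A minimal sketch of checkstate reworked that way (the URL is the asker's own; the done callback parameter is an addition for illustration):

```javascript
function checkstate(who, done) {
    var ajax = new XMLHttpRequest(); // `var` keeps it out of the global scope
    ajax.onreadystatechange = function() {
        // With an async request, this handler fires in Firefox as well
        if (ajax.readyState === 4 && ajax.status === 200) {
            done(ajax.responseText);
        }
    };
    // `true` = asynchronous; onreadystatechange is not used with sync requests
    ajax.open('GET', 'oth/work_prac_stan.php?usr=' + encodeURIComponent(who), true);
    ajax.send();
}
```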
Original answer
Not immediately seeing a smoking gun, but: Your ajax variable is not defined within the function, and so you're almost certainly overwriting it on every iteration of the loop in sprawdzstan. Whether that's a problem remains to be seen, since you're using a synchronous ajax call. In any case, add a var ajax; to checkstate to ensure that you're not falling prey to the Horror of Implicit Globals.
Off-topic: If you can possibly find a way to refactor your design to not use a synchronous ajax request, strongly recommend doing that. Synchronous requests lock up the UI of the browser (to a greater or lesser degree depending on the browser, but many — most? — completely lock up, including other unrelated tabs). It's almost always possible to refactor and use an asynchronous request instead.
Off-topic 2: You aren't using mouseEvent in your code, but if you were, you would want to get rid of those javascript: prefixes on the onmouseover and onmouseout attributes. Those attributes are not URLs; the prefix is not (there) a protocol specifier (it's a label, which you're not using).
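For completeness, the same attribute string without the prefixes, plus a sketch of attaching the handlers directly instead (bubelon and bubeloff are the asker's own functions, assumed to exist):

```javascript
// Inline-attribute version, minus the needless "javascript:" prefixes
var mouseEvent = 'onmouseover="bubelon(this.id);" onmouseout="bubeloff();"';

// Or skip inline attributes entirely and attach listeners from script
function wireBubble(cell) {
    cell.onmouseover = function() { bubelon(cell.id); };
    cell.onmouseout = function() { bubeloff(); };
}
```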

For those who still encounter this problem...
You can use the code below. What I did was remove the function
ajax.onreadystatechange=function(aEvt) {
and move the alert("im here!"); to after the ajax.send():
ajax=new XMLHttpRequest();
ajax.open('GET',"oth/work_prac_stan.php?usr="+who,false);
ajax.send();
alert("im here!");

Related

How could a synchronous AJAX call cause a memory leak?

I understand the general advice given against the use of synchronous AJAX calls, because synchronous calls block UI rendering.
The other reason generally given is memory leak issues with synchronous AJAX.
From the MDN docs -
Note: You shouldn't use synchronous XMLHttpRequests because, due to
the inherently asynchronous nature of networking, there are various
ways memory and events can leak when using synchronous requests. The
only exception is that synchronous requests work well inside Workers.
How could synchronous calls cause memory leaks?
I am looking for a practical example.
Any pointers to literature on this topic would be great.
If XHR is implemented correctly per spec, then it will not leak:
An XMLHttpRequest object must not be garbage collected if its state is
OPENED and the send() flag is set, its state is HEADERS_RECEIVED, or
its state is LOADING, and one of the following is true:
It has one or more event listeners registered whose type is
readystatechange, progress, abort, error, load, timeout, or loadend.
The upload complete flag is unset and the associated
XMLHttpRequestUpload object has one or more event listeners registered
whose type is progress, abort, error, load, timeout, or loadend.
If an XMLHttpRequest object is garbage collected while its connection
is still open, the user agent must cancel any instance of the fetch
algorithm opened by this object, discarding any tasks queued for them,
and discarding any further data received from the network for them.
So after you hit .send(), the XHR object (and anything it references) becomes immune to GC. However, any error or success puts the XHR into the DONE state and it becomes subject to GC again. It doesn't matter at all whether the XHR object is sync or async. In the case of a long sync request it again doesn't matter, because you are simply stuck on the send statement until the server responds.
However, according to this slide it was not implemented correctly at least in Chrome/Chromium in 2012. Per spec, there would be no need to call .abort() since the DONE state means that the XHR object should already be normally GCd.
I cannot find even the slightest evidence to back up the MDN statement, and I have contacted the author through Twitter.
I think that memory leaks happen mainly because the garbage collector can't do its job; i.e., you have a reference to something and the GC cannot delete it. I wrote a simple example:
var getDataSync = function(url) {
    console.log("getDataSync");
    var request = new XMLHttpRequest();
    request.open('GET', url, false); // `false` makes the request synchronous
    try {
        request.send(null);
        if (request.status === 200) {
            return request.responseText;
        } else {
            return "";
        }
    } catch(e) {
        console.log("!ERROR");
        return ""; // return something on network errors too
    }
}
var getDataAsync = function(url, callback) {
    console.log("getDataAsync");
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.onload = function (e) {
        if (xhr.readyState === 4) {
            if (xhr.status === 200) {
                callback(xhr.responseText);
            } else {
                callback("");
            }
        }
    };
    xhr.onerror = function (e) {
        callback("");
    };
    xhr.send(null);
}
var requestsMade = 0;
var requests = 1;
var url = "http://missing-url";
for (var i = 0; i < requests; i++, requestsMade++) {
    getDataSync(url);
    // getDataAsync(url);
}
Apart from the fact that the synchronous function blocks a lot of stuff, there is another big difference: error handling. If you use getDataSync, remove the try-catch block, and refresh the page, you will see that an error is thrown. That's because the URL doesn't exist. The question now is how the garbage collector behaves when an error is thrown: does it clear all the objects connected with the error, does it keep the error object, or something else? I'd be glad if someone who knows more about this would write it up here.
If the synchronous call is interrupted (i.e. by a user event re-using the XMLHttpRequest object) before it completes, then the outstanding network query can be left hanging, unable to be garbage collected.
This is because, if the object that initiated the request does not exist when the request returns, the return cannot complete, but (if the browser is imperfect) remains in memory. You can easily cause this using setTimeout to delete the request object after the request has been made but before it returns.
I remember I had a big problem with this in IE, back around 2009, but I would hope that modern browsers are not susceptible to it. Certainly, modern libraries (i.e. JQuery) prevent the situations in which it might occur, allowing requests to be made without having to think about it.
A sync XHR blocks thread execution, and all objects in the function execution stack of that thread are blocked from GC.
E.g.:
function (b) {
    var a = /* big data */;
    // ...work with a and b...
    // ...synchronous XHR here...
}
Variables a and b are blocked here (and the whole stack, too).
So, if GC starts working while the sync XHR has the stack blocked, all stack variables will be marked as having survived GC and will be moved from the young heap to a more persistent one. And a ton of objects that should not survive even a single GC will live through many garbage collections, and even references from those objects will survive GC.
About the claim that the stack blocks GC, and that objects get marked as long-lived: see the section "Conservative Garbage Collection" in
Clawing Our Way Back To Precision.
Also, "marked" objects are collected after the usual heap is collected, and usually only if there is still a need to free more memory (as collecting marked-and-swept objects takes more time).
UPDATE:
Is it really a leak, and not just an inefficient use of the young heap?
There are several things to consider.
How long will these objects stay locked after the request finishes? A sync XHR can block the stack for an unlimited amount of time, XHR has no timeout property (in all non-IE browsers), and network problems are not rare.
How much memory is locked? If it blocks 20 MB of memory for even 1 second at a time, that adds up quickly. Consider many background tabs.
Consider the case where a single sync XHR blocks a ton of resources and the browser goes into the swap file.
When another event tries to alter the DOM, it may be blocked by the sync XHR, so another thread is blocked (and its whole stack, too).
If the user repeats the actions that lead to the sync XHR, the whole browser window will be locked. Browsers use at most 2 threads to handle window events.
Even without blocking, this consumes lots of OS and browser internal resources: threads, critical sections, UI resources, the DOM... Imagine that (due to memory problems) you can open only 10 tabs with sites that use sync XHR, but 100 tabs with sites that use async XHR. Is that not a memory leak?
Memory leaks using synchronous AJAX requests are often caused by:
using setInterval/setTimeout, causing circular calls;
XmlHttpRequest, when the reference is removed, so the xhr object becomes inaccessible.
A memory leak happens when the browser for some reason doesn't release memory from objects which are not needed any more.
This may happen because of browser bugs, browser-extension problems and, much more rarely, our own mistakes in the code architecture.
Here's an example of a memory leak being caused by running setInterval in a new context:
var
    Context = process.binding('evals').Context,
    Script = process.binding('evals').Script,
    total = 5000,
    result = null;

process.nextTick(function memory() {
    var mem = process.memoryUsage();
    console.log('rss:', Math.round(((mem.rss/1024)/1024)) + "MB");
    setTimeout(memory, 100);
});

console.log("STARTING");
process.nextTick(function run() {
    var context = new Context();
    context.setInterval = setInterval;
    Script.runInContext('setInterval(function() {}, 0);',
        context, 'test.js');
    total--;
    if (total) {
        process.nextTick(run);
    } else {
        console.log("COMPLETE");
    }
});

IE, XDomainRequest does not always work

I am trying to do cross-domain on IE.
I used XDomainRequest and implemented logging for all events (onerror, onload, onprogress and ontimeout) to monitor progress.
It works sometimes, but not always (on one computer with IE9, same site, same request, 1 out of 3 or 4 attempts works; on another computer with IE8, maybe 1 out of 2 works). I didn't get any useful information from the logging, because nothing was triggered.
I am very confused. Is there any debugging tool for IE? Why does XDomainRequest sometimes just not work?
Thanks a lot
coronin
There are at least two significant bugs in the XDomainRequest object, one that affects IE8 and another that affects IE9.
Issue 1 - Garbage Collection
In Internet Explorer 8, the XDomainRequest object is incorrectly subject to garbage collection after send() has been called but not yet completed. The symptoms of this bug are the Developer Tools' network trace showing "Aborted" for the requests and none of the error, timeout, or success event handlers being called.
Typical AJAX code looks a bit like this:
function sendCrossDomainAjax(url, successCallback, errorCallback) {
    var xdr = new XDomainRequest();
    xdr.open("get", url);
    xdr.onload = function() { successCallback(); }
    xdr.onerror = function() { errorCallback(); }
    xdr.send();
}
In this example, the variable containing the XDomainRequest goes out of scope. If the user is unlucky, IE's JavaScript garbage collector will run before send() asynchronously completes, and the request will be aborted. Even though the XDomainRequest object is captured by the OnLoad and OnError event handlers, IE will see that the entire object graph has no references to it and will garbage collect it. IE should be "pinning" the object until complete.
You'll notice quite a few other discussions on the internet mentioning that placing a setTimeout around the xdr.send(); call will somehow "solve" mysterious XDomainRequest failures. This is a kludge, and completely incorrect. All that's happening is that the XDomainRequest object is being "pinned" into the setTimeout closure and not subject to garbage collection as quickly. It doesn't solve the problem.
To correctly work around this issue, ensure the XDomainRequest is stored in a global variable until the request completes. For example:
var pendingXDR = [];

function removeXDR(xdr) {
    // indexOf isn't always supported; you can also use jQuery.inArray()
    var index = pendingXDR.indexOf(xdr);
    if (index >= 0) {
        pendingXDR.splice(index, 1);
    }
}

function sendCrossDomainAjax(url, successCallback, errorCallback) {
    var xdr = new XDomainRequest();
    xdr.open("get", url);
    xdr.onload = function() {
        removeXDR(xdr);
        successCallback();
    }
    xdr.onerror = function() {
        removeXDR(xdr);
        errorCallback();
    }
    xdr.send();
    pendingXDR.push(xdr);
}
Issue 2 - Missing OnProgress EventHandler
This second issue is already known. Internet Explorer 9 introduced a regression in the XDomainRequest object where a missing (null) OnProgress event handler would cause the request to abort when it tries to report progress information.
For fast requests, IE9 never attempts to call the OnProgress event handler and the request succeeds. Certain conditions, such as when IE delays the request due to too many open connections, network latency, slow server responses, or large request or response payloads will cause IE9 to start to report progress information.
IE9 tries to call the event handler without first checking it exists, and the XDomainRequest object crashes and destroys itself internally.
To solve this issue, always ensure an event handler is attached to OnProgress. Given the bug, it's not a bad idea to defensively add event handlers to all of the object's events.
var xdr = new XDomainRequest();
xdr.open("get", url);
xdr.onprogress = function() { };
// register other event handlers
Other Issues
I've seen reports that XDomainRequest can fail if the event handlers are registered before .open() is called. Again, defensively, it's not a bad idea to register them between the .open() and .send() calls. I haven't personally verified whether it's an actual bug.
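Putting the defensive advice together (register every handler, and do so between open() and send()), a wrapper might look like this; a sketch only, since the registration-order issue above is unverified:

```javascript
function sendCrossDomainAjax(url, successCallback, errorCallback) {
    var xdr = new XDomainRequest();
    xdr.open("get", url);
    // Register ALL handlers between open() and send(); a missing
    // onprogress handler can abort the request on IE9 (Issue 2).
    xdr.onprogress = function() { };
    xdr.ontimeout = function() { errorCallback(); };
    xdr.onerror = function() { errorCallback(); };
    xdr.onload = function() { successCallback(xdr.responseText); };
    xdr.send();
    return xdr; // caller should keep a reference until done (Issue 1)
}
```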
If you run into an "Access Denied" error, it's because XDomainRequest doesn't allow mismatched URI schemes between the target and host page. In other words, don't try to call an HTTP resource from an HTTPS page.
Beware of most of the XDomainRequest libraries on the internet. I looked at most of the popular ones, such as the various jQuery AJAX transport plugins (including the ones linked in another answer here).
And, of course, XDomainRequest is subject to all of its normal limitations and constraints. These aren't bugs per se, and compared with the alternatives (iframe kludges, Flash crossdomain.xml transports) they're not that bad.
I've posted a new jQuery AJAX XDomainRequest transport under a public domain license here: https://github.com/ebickle/snippets/tree/master/javascript/xdomainrequest
Had the exact same question. Short solution:
Use this code: https://github.com/jaubourg/ajaxHooks/blob/master/src/ajax/xdr.js
UPDATE: link is broken, find jaubourgs fix here: https://github.com/jaubourg/ajaxHooks/blob/master/src/xdr.js
Add xdr.onprogress = function() {}; in that xdr.js file
Details can be found on the jQuery topic discussion here
http://bugs.jquery.com/ticket/8283
in which the last reply included that xdr.onprogress fix, which originated from this bug discussion, aptly titled
"IE9 RTM - XDomainRequest issued requests may abort if all event handlers not specified"
http://social.msdn.microsoft.com/Forums/en-US/iewebdevelopment/thread/30ef3add-767c-4436-b8a9-f1ca19b4812e

Why does recursion through XHR's asynchronous onreadystatechange event consume the stack?

I'm getting a stack overflow error on some, but not all, IE7 machines.
This function downloads a bunch of URL-based resources and does nothing with them. It runs on my login page, and its purpose is to fetch static content while you're typing in your credentials, so that when you really need it, the browser can get it from its local cache.
// Takes an array of resource URLs and preloads them sequentially,
// but asynchronously, using an XHR object.
function preloadResources(resources) {
    // Kick it all off.
    if (resources.length > 0) {
        var xhr = getXHRObject(); // Prepare the XHR object which will be reused for each request.
        xhr.open('GET', resources.shift(), true);
        xhr.onreadystatechange = handleReadyStateChange;
        xhr.send(null);
    }

    // Handler for the XHR's onreadystatechange event. Loads the next resource, if any.
    function handleReadyStateChange() {
        if (xhr.readyState == 4) {
            if (resources.length > 0) {
                xhr.open('GET', resources.shift(), true);
                xhr.onreadystatechange = arguments.callee;
                xhr.send(null);
            }
        }
    }

    // A safe cross-browser way to get an XHR object.
    function getXHRObject() {
        // Clipped for clarity.
    }
} // End preloadResources().
It's called like this:
preloadResources([
    'http://example.com/big-image.png',
    'http://example.com/big-stylesheet.css',
    'http://example.com/big-script.js']);
It recursively processes an array of URLs. I thought it wasn't susceptible to stack overflow errors because each recursion is called from an asynchronous event -- the XHR's onreadystatechange event (notice that I'm calling xhr.open() asynchronously). I felt doing so would prevent it from growing the stack.
I don't see how the stack is growing out of control? Where did I go wrong?
Doing the recursion with a timer prevented the stack overflow problem from appearing.
// Handler for the XHR's onreadystatechange event. Loads the next resource, if any.
function handleReadyStateChange() {
    if (xhr.readyState == 4 && resources.length > 0) {
        setTimeout(function() {
            xhr.open('GET', resources.shift(), true);
            xhr.onreadystatechange = handleReadyStateChange;
            xhr.send(null);
        }, 10);
    }
}
I guess chaining XHR requests to one another consumes the stack. Chaining them together with a timer prevents that -- at least in IE7. I haven't seen the problem on other browsers, so I can't tell.
Can you perhaps console.log or alert the value of arguments.callee? I'm curious what would happen if it resolves to the arguments variable from the preloadResources() function instead of handleReadyStateChange()'s. It doesn't seem likely to me, but it jumps out at me just from eyeballing your code.
In answer to your question, however - I think one bad practice in the code above is reusing the XmlHttpRequest object, especially without ever letting the lifecycle complete or calling xhr.abort(). That's a no-no I tucked away a while back. It's discussed here and various places around the web. Note that IE particularly doesn't work well with reusing the xhr. See http://ajaxian.com/archives/the-xmlhttprequest-reuse-dilemma .
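A sketch of the preloader reworked along those lines, creating a fresh XMLHttpRequest per resource instead of reusing one (this keeps the original chaining behavior; the plain constructor stands in for the question's getXHRObject helper):

```javascript
function preloadResources(resources) {
    function loadNext() {
        if (resources.length === 0) {
            return; // nothing left to preload
        }
        var xhr = new XMLHttpRequest(); // a fresh object per request, never reused
        xhr.open('GET', resources.shift(), true);
        xhr.onreadystatechange = function() {
            if (xhr.readyState === 4) {
                loadNext(); // chain the next download off this one's completion
            }
        };
        xhr.send(null);
    }
    loadNext();
}
```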
Hope this helps,
Scott

Get if browser is busy

I'm trying to find a way to tell from JavaScript whether the browser is currently busy. I'm looking at making a Firefox extension that injects a Boolean value or something if the current page is loading something (either through ajax or just normal page loads), or the same with a Greasemonkey script, or through some JavaScript API (this would be the best solution, but from what I can see, nothing of the sort exists).
I was wondering what the best way to do this would be. I've been looking for Firefox Addon / Greasemonkey tutorials for making something like this and can't find anything. Does anyone have any tips or resources they could point me towards or better solutions for solving this?
Thanks
Edit: and by busy, I mostly just need to know if the browser is sending or receiving data from a server.
jQuery, a great javascript framework for DOM manipulation and performing ajax calls, provides two great hooks for determining when ajax calls are in progress:
$.ajaxStart() and $.ajaxStop()
Both of these hooks take a handler function that will be called when an ajax call is about to start, and when all ajax calls have ceased, respectively. These functions can be bound to any element on the page. You could set a global boolean value in your $.ajaxStart() handler to true and set it back to false in your $.ajaxStop() handler.
You could then check that boolean flag and determine whether ajax calls are in progress.
Something along these lines:
$(document).ajaxStart(function() {
    window.ajaxBusy = true;
});
$(document).ajaxStop(function() {
    window.ajaxBusy = false;
});
As far as determining when the browser is loading the current page, you could check document.readyState. It returns "loading" while the document is loading and "complete" once it has loaded. You can bind a handler to document.onreadystatechange and set a global boolean indicating whether the document is still loading.
Something like this:
document.onreadystatechange = function() {
    switch (document.readyState) {
        case "loading":
            window.documentLoading = true;
            break;
        case "complete":
            window.documentLoading = false;
            break;
        default:
            window.documentLoading = false;
    }
}
EDIT:
EDIT:
It appears that $.ajaxStart() and $.ajaxStop() do NOT work for ajax calls invoked without jQuery. All XMLHttpRequest objects have an event called readystatechange that you can attach a handler to. You could use this to determine whether each individual call is done. You could push references to all outstanding calls onto an array, and in a setInterval() check that array's length. If it is greater than 0, there are outstanding ajax calls. It's a rough approach, and only one way of going about it; there are probably other ways. But here's the general idea:
// declare an array to hold references to outstanding requests
window.orequests = [];

var req = new XMLHttpRequest();
// open the request here, then attach a handler to `onreadystatechange`
// and push the reference before calling send()
orequests.push(req);
req.onreadystatechange = function() {
    if (req.readyState === 4) {
        // req is done; remove it from the outstanding list
        var pos = orequests.indexOf(req);
        if (pos >= 0) {
            orequests.splice(pos, 1);
        }
    }
};
Do the above for each XMLHttpRequest you send. Then run a setInterval() every x milliseconds that checks the length property of orequests. If it is equal to zero, no requests are in flight; if it is greater than zero, requests are still happening. Once no requests are happening, you can either clear the interval through clearInterval() or keep it running.
Your setInterval might look something like this:
var ajaxInterval = setInterval(function() {
    if (orequests.length > 0) {
        // ajax calls are in progress
        window.xttpBusy = true;
    } else {
        // ajax calls have ceased
        window.xttpBusy = false;
        // you could call clearInterval(ajaxInterval) here,
        // but I don't know if that's your intention
    }
}, 3000); // run every 3 seconds (you can decide how often you want to run it)
Here's what I think I'll end up doing. This solution is like the one Alex suggested with the jQuery events, except that it works with anything that uses XMLHttpRequest (including jQuery):
var runningAjaxCount = 0;
var oldSend = XMLHttpRequest.prototype.send;
XMLHttpRequest.prototype.send = function() {
    var oldOnReady = this.onreadystatechange; // `var`: one per request, not a shared global
    this.onreadystatechange = function() {
        if (oldOnReady) { // the caller may not have set a handler at all
            oldOnReady.call(this);
        }
        if (this.readyState == XMLHttpRequest.DONE) {
            ajaxStopped();
        }
    };
    ajaxStarted();
    oldSend.apply(this, arguments);
};
function ajaxStarted() {
    runningAjaxCount++;
}
function ajaxStopped() {
    runningAjaxCount--;
}
function isCallingAjax() {
    return runningAjaxCount > 0;
}
function isBrowserBusy() {
    return document.readyState != "complete" && isCallingAjax();
}
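With the counter in place, a caller could poll for idleness, e.g. (a sketch; the predicate is passed in as a parameter so the polling logic stands alone, and the 250 ms interval is arbitrary):

```javascript
function watchForIdle(isBrowserBusy, onIdle) {
    var poll = setInterval(function() {
        if (!isBrowserBusy()) {
            clearInterval(poll); // stop polling once the browser goes idle
            onIdle();
        }
    }, 250);
}
```

Called as watchForIdle(isBrowserBusy, function() { /* ... */ }) against the functions above.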
The browser technically isn't ever "busy". Busyness is a very subjective term. Let's assume that the main thread is performing a simple while loop which blocks execution. This could be considered busy, but what if you have something like this:
function busy() { setTimeout(busy, 0); do_something(); }
busy();
The browser isn't being blocked (per se), so whether or not the page is "busy" is very unclear. Also, that doesn't even begin to touch on web workers and code in the browser chrome.
You're going to be hard-pressed to do this, and even if you do, it likely won't work how you expect it to. Good luck, nonetheless.

How to reliably send a request cross domain and cross browser on page unload

I have JavaScript code that's loaded by third parties. The JavaScript keeps track of a number of metrics, and when a user exits the page I'd like to send the metrics back to my server.
Due to cross-domain checks in some browsers, like IE, I cannot do a simple jquery.ajax() call. Instead, I'm appending an image src to the page with jQuery. Here's the code, cased by browser:
function record_metrics() {
    // Arbitrary code execution here to set test_url
    $esajquery('#MainDiv').append("<img src='" + test_url + "' />");
}

if ($esajquery.browser.msie) {
    window.onbeforeunload = function() { record_metrics(); }
} else {
    $esajquery(window).unload(
        function() { record_metrics(); }
    );
}
FF aborts the request to "test_url" if I use window.onbeforeunload, and IE8 doesn't work with jQuery's unload(). IE8 also fails if the arbitrary test_url-setting code takes too long, although IE8 seems to work fine if the image is immediately appended to the DOM.
Is there a better way to solve this issue? Unfortunately this really needs to execute when a user leaves the page.
For posterity's sake, here's what I ended up doing:
if ($.browser.msie) {
    window.onbeforeunload = function() {
        $('#div1').append("<img src='record_call' />");
    };
} else {
    $(window).unload(function() {
        if ($.browser.webkit) {
            $.ajax({url: record_call, async: false});
        } else {
            $('#div1').append("<script src='record_call' />");
        }
    });
}
I found that IE works when appending an img, but not a script, possibly because the script is more resource-intensive and IE cuts out before trying to load it. For WebKit, appending a script sometimes works, but appending an image never seemed to. Lastly, I default to the script (mainly for FF) because older browser versions all seem to play well with it. IE blocks the synchronous AJAX call used for WebKit because of its cross-domain restrictions.
In addition, IE never works with jquery's unload function, and the other browsers don't work with onbeforeunload, so those have to be cased. This certainly isn't a pretty solution, but it works most of the time.
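For what it's worth, browsers later gained an API aimed at exactly this problem: navigator.sendBeacon() queues a small POST that survives page unload. It is a different technique from the browser-casing above and only helps where the API exists, so here is a sketch with the image trick as a fallback (the URL and metrics shape are placeholders):

```javascript
function recordMetrics(url, metrics) {
    var payload = JSON.stringify(metrics);
    if (navigator.sendBeacon) {
        // Queued by the browser and sent after unload, without blocking it
        navigator.sendBeacon(url, payload);
    } else {
        // Fallback: the image-append trick from the answer above
        var img = new Image();
        img.src = url + '?d=' + encodeURIComponent(payload);
    }
}
```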
A heads up for those trying to use Agmin's method above: when you use jQuery's append() method to append a script tag, the internal jQuery wiring uses jQuery.ajax() to execute the script, so if the script tag you're appending points at a different domain, the Same Origin Policy will prevent the script from being executed.
From what I remember, some browsers only accept a text string as the onbeforeunload result. Try something more like this:
jQuery(window).bind('beforeunload', function() { record_metrics(); return ''; });
Not sure, but that might even work for all browsers; I'm only guessing at this point.
p.s. also see How can I override the OnBeforeUnload dialog and replace it with my own?
You should take a look at the easyXDM library:
http://easyxdm.net/
I think it will make your work easier regarding the limitations set in place by the Same Origin Policy.
