How could a synchronous AJAX call cause a memory leak? - javascript

I understand the general advice given against the use of synchronous AJAX calls, because synchronous calls block UI rendering.
The other reason generally given is memory leak issues with synchronous AJAX.
From the MDN docs -
Note: You shouldn't use synchronous XMLHttpRequests because, due to
the inherently asynchronous nature of networking, there are various
ways memory and events can leak when using synchronous requests. The
only exception is that synchronous requests work well inside Workers.
How could synchronous calls cause memory leaks?
I am looking for a practical example.
Any pointers to any literature on this topic would be great.

If XHR is implemented correctly per spec, then it will not leak:
An XMLHttpRequest object must not be garbage collected if its state is
OPENED and the send() flag is set, its state is HEADERS_RECEIVED, or
its state is LOADING, and one of the following is true:
It has one or more event listeners registered whose type is
readystatechange, progress, abort, error, load, timeout, or loadend.
The upload complete flag is unset and the associated
XMLHttpRequestUpload object has one or more event listeners registered
whose type is progress, abort, error, load, timeout, or loadend.
If an XMLHttpRequest object is garbage collected while its connection
is still open, the user agent must cancel any instance of the fetch
algorithm opened by this object, discarding any tasks queued for them,
and discarding any further data received from the network for them.
So after you hit .send(), the XHR object (and anything it references) becomes immune to GC. However, any error or success will put the XHR into the DONE state and it becomes subject to GC again. It wouldn't matter at all whether the XHR object is sync or async. In the case of a long sync request, again it doesn't matter, because you would just be stuck on the send statement until the server responds.
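To make that concrete, here is a minimal sketch (the endpoint name is hypothetical) of the "fire-and-forget" pattern this rule enables: no reference to the XHR is kept anywhere, yet a spec-conforming browser must not collect it mid-flight because a load listener is registered:
var sendBeacon = function() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/log?event=view', true); // hypothetical logging endpoint
    xhr.onload = function() {
        // DONE state reached: from this point the object is subject to GC again.
        console.log('delivered, status ' + xhr.status);
    };
    xhr.send(null); // from here until DONE, the object must not be collected
};
sendBeacon(); // xhr is unreachable from our code, but the request still completes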
However, according to this slide, it was not implemented correctly in at least Chrome/Chromium in 2012. Per spec, there would be no need to call .abort(), since the DONE state means that the XHR object should already be GCed normally.
I cannot find even the slightest evidence to back up the MDN statement, and I have contacted the author through Twitter.

I think that memory leaks happen mainly because the garbage collector can't do its job, i.e. you have a reference to something and the GC cannot delete it. I wrote a simple example:
var getDataSync = function(url) {
    console.log("getDataSync");
    var request = new XMLHttpRequest();
    request.open('GET', url, false);  // `false` makes the request synchronous
    try {
        request.send(null);
        if(request.status === 200) {
            return request.responseText;
        } else {
            return "";
        }
    } catch(e) {
        console.log("!ERROR");
    }
}
var getDataAsync = function(url, callback) {
    console.log("getDataAsync");
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.onload = function (e) {
        if (xhr.readyState === 4) {
            if (xhr.status === 200) {
                callback(xhr.responseText);
            } else {
                callback("");
            }
        }
    };
    xhr.onerror = function (e) {
        callback("");
    };
    xhr.send(null);
}
var requestsMade = 0;
var requests = 1;
var url = "http://missing-url";
for(var i = 0; i < requests; i++, requestsMade++) {
    getDataSync(url);
    // getDataAsync(url);
}
Apart from the fact that the synchronous function blocks a lot of stuff, there is another big difference: the error handling. If you use getDataSync, remove the try-catch block, and refresh the page, you will see that an error is thrown. That's because the URL doesn't exist, but the question now is how the garbage collector works when an error is thrown. Does it clear all the objects connected with the error, does it keep the error object, or something like that? I'd be glad if someone who knows more about that would write here.

If the synchronous call is interrupted (e.g. by a user event re-using the XMLHttpRequest object) before it completes, then the outstanding network query can be left hanging, unable to be garbage collected.
This is because, if the object that initiated the request does not exist when the request returns, the return cannot complete, but (if the browser is imperfect) the request remains in memory. You can easily cause this using setTimeout to delete the request object after the request has been made but before it returns, as in the sketch below.
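As a hedged sketch of that scenario (the URL is hypothetical), something like the following drops the only reference to the request object before the response arrives, which is exactly the situation an imperfect browser could fail to clean up:
var xhr = new XMLHttpRequest();
xhr.open('GET', '/slow-endpoint', true); // hypothetical slow resource
xhr.onreadystatechange = function() {
    // never observed if the browser reaps the abandoned request
};
xhr.send(null);
setTimeout(function() {
    xhr = null; // only reference dropped while the request is still in flight
}, 10);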
I remember I had a big problem with this in IE, back around 2009, but I would hope that modern browsers are not susceptible to it. Certainly, modern libraries (e.g. jQuery) prevent the situations in which it might occur, allowing requests to be made without having to think about it.

Sync XHR blocks thread execution, and it also blocks all objects in the function execution stack of this thread from GC.
E.g.:
function process(b) {
    var a = new Array(1e6).fill(0);           // <big data>
    // ... work with a and b ...
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/slow-endpoint', false); // sync XHR (hypothetical URL)
    xhr.send(null);
}
Variables a and b are blocked here (and the whole stack too).
So, if the GC starts working while the sync XHR has the stack blocked, all stack variables will be marked as having survived the GC and will be moved from the early heap to the more persistent one. A ton of objects that should not survive even a single GC will live through many garbage collections, and even references from these objects will survive.
For the claims that the stack blocks GC and that objects get marked as long-lived, see the section Conservative Garbage Collection in
Clawing Our Way Back To Precision.
Also, "marked" objects are GCed after the usual heap is GCed, and usually only if there is still a need to free more memory (as collecting marked-and-swept objects takes more time).
UPDATE:
Is it really a leak, and not just an ineffective use of the early heap?
There are several things to consider.
How long will these objects be locked after the request has finished?
A sync XHR can block the stack for an unlimited amount of time: XHR has no timeout property (in all non-IE browsers), and network problems are not rare.
How much memory is locked? Blocking 20 MB for just 1 second is equivalent to a 200 KB leak over 2 minutes. Consider many background tabs.
Consider the case when a single sync request blocks a ton of resources and the browser goes to the swap file.
When another event tries to alter the DOM, it may be blocked by the sync XHR, so another thread is blocked (and its whole stack too).
If the user repeats the actions that lead to the sync XHR, the whole browser window will be locked. Browsers use at most 2 threads to handle window events.
Even without blocking, this consumes lots of OS and browser-internal resources: threads, critical sections, UI resources, DOM, and so on. Imagine that (due to memory problems) you can open only 10 tabs with sites that use sync XHR, but 100 tabs with sites that use async XHR. Is that not a memory leak?
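For contrast, here is a hedged async rewrite of the earlier sketch (endpoint still hypothetical): the function returns immediately, so a and b become collectable as soon as the stack unwinds, instead of being pinned for the lifetime of the request:
function process(b, done) {
    var a = new Array(1e6).fill(0);           // <big data>
    // ... work with a and b ...
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/slow-endpoint', true);  // async this time
    xhr.onload = function() { done(xhr.responseText); };
    xhr.send(null);
} // the frame (and `a`) is released here, long before the response arrives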

Memory leaks using synchronous AJAX requests are often caused by:
using setInterval/setTimeout causing circular calls (see the sketch below);
an XMLHttpRequest whose reference is removed, so the xhr becomes inaccessible.
A memory leak happens when the browser for some reason doesn't release memory from objects which are not needed any more.
This may happen because of browser bugs, browser-extension problems and, much more rarely, our own mistakes in code architecture.
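As an illustration of the first cause, here is a hedged browser-side sketch (the URL is hypothetical) of a circular setTimeout chain whose closure keeps an ever-growing structure reachable, so the GC can never free it:
function startPolling() {
    var cache = [];                             // grows forever, never released
    (function poll() {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/session-ping', true); // hypothetical endpoint
        xhr.onload = function() { cache.push(xhr.responseText); };
        xhr.send(null);
        setTimeout(poll, 1000);                 // circular call keeps `poll` (and `cache`) alive
    })();
}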
Here's an example of a memory leak being caused when running setInterval in a new context (the code below is Node.js):
var Context = process.binding('evals').Context,
    Script = process.binding('evals').Script,
    total = 5000,
    result = null;

process.nextTick(function memory() {
    var mem = process.memoryUsage();
    console.log('rss:', Math.round(((mem.rss / 1024) / 1024)) + "MB");
    setTimeout(memory, 100);
});

console.log("STARTING");
process.nextTick(function run() {
    var context = new Context();
    context.setInterval = setInterval;
    Script.runInContext('setInterval(function() {}, 0);',
        context, 'test.js');
    total--;
    if (total) {
        process.nextTick(run);
    } else {
        console.log("COMPLETE");
    }
});

Related

XMLHttpRequest returning with status 200, but 'onreadystatechange' event not fired

We have been encountering an intermittent bug with the XMLHttpRequest object when using IE11. Our codebase uses legacy architecture, so this browser is required.
After clicking a button, the browser launches an out-of-band process by creating a new ActiveX control which integrates with a camera to capture an image. This control appears to be working fine... it allows the operator to capture the image, and the Base64 content of the image is returned out of the control back to the browser interface, so I think we can rule out a problem with this object.
Once the image is returned to the browser, the browser performs an asynchronous 'ping' to the web server to check if the IIS session is still alive or it has expired (because the out-of-band image capture process forbids control of the browser while it is open).
The ping to the server returns successfully (and running Fiddler I can see that the response has status 200), with the expected response data:
<sessionstate>ok</sessionstate>
There is a defined 'onreadystatechange' function which should be fired on this response, and the majority of the time this seems to fire correctly. However, on the rare occasion the problem does appear, it then continues to happen every time.
Here is a snippet of the code... we expect the 'callback()' function to be called on a successful response to Timeout.asp:
XMLPoster.prototype.checkSessionAliveAsync = function(callback) {
    var checkSessionAlive = new XMLHttpRequest();
    checkSessionAlive.open("POST", "Timeout.asp?Action=ping", true);
    checkSessionAlive.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    checkSessionAlive.onreadystatechange = function() {
        if (checkSessionAlive.readyState == 4) {
            if (checkSessionAlive.responseText.indexOf("expired") != -1 || checkSessionAlive.status !== 200) {
                eTop.window.main.location = "timeout.asp";
                return;
            }
            callback(checkSessionAlive.responseText);
        }
    }
    checkSessionAlive.send();
}
Has anyone seen anything like this before? I appreciate that using legacy software is not ideal, but we are currently limited to using it.

javascript websockets - control initial connection/when does onOpen get bound

Two related questions that may be more rooted in my lack of knowledge of how/if browsers pre-parse javascript:
var ws = new WebSocket("ws://ws.my.url.com");
ws.onOpen = function() { ... };
There appears to be no way to directly control the initialisation of a WebSocket, beyond wrapping it in a callback, so I assume the connection is created as soon as the javascript code is loaded and execution gets to the constructor?
When does the onOpen property get attached to ws? Is there any possibility of a race condition (if for some reason you had some code in between the definition of the socket and the definition of onOpen), so that onOpen is unpredictably bound before/after the connection is established (I know you could optionally check ws.readyState)? Supplementary to this, is the WebSocket handshake blocking?
I realise it's all a draft at the moment and possibly implementation-dependent, and I may have missed something blindingly obvious, but I couldn't see anything particularly pertinent in my internet searches/skim through the draft W3C spec, so any help in my understanding of websockets/javascript's inner workings is very much appreciated!
JavaScript is single-threaded, which means the network connection can't be established until the current scope of execution completes and the network execution gets a chance to run. The scope of execution could be the current function (the connect function in the example below). So, you could miss the onopen event if you bind to it very late using a setTimeout, e.g. in this example you may miss the event:
View: http://jsbin.com/ulihup/edit#javascript,html,live
Code:
var ws = null;

function connect() {
    ws = new WebSocket('ws://ws.pusherapp.com:80/app/a42751cdeb5eb77a6889?client=js&version=1.10');
    setTimeout(bindEvents, 1000);
    setReadyState();
}

function bindEvents() {
    ws.onopen = function() {
        log('onopen called');
        setReadyState();
    };
}

function setReadyState() {
    log('ws.readyState: ' + ws.readyState);
}

function log(msg) {
    if(document.body) {
        var text = document.createTextNode(msg);
        document.body.appendChild(text);
    }
}

connect();
If you run the example you may well see that the 'onopen called' log line is never output. This is because we missed the event.
However, if you keep the new WebSocket(...) and the binding to the onopen event in the same scope of execution then there's no chance you'll miss the event.
For more information on scope of execution and how these are queued, scheduled and processed take a look at John Resig's post on Timers in JavaScript.
TL;DR - The standard states that the connection can be opened "while the [JS] event loop is running" (e.g. by the browser's C++ code), but that firing the open event must be queued to the JS event loop, meaning any onOpen callback registered in the same execution block as new WebSocket(...) is guaranteed to be executed, even if the connection gets opened while the current execution block is still executing.
According to The WebSocket Interface specification in the HTML Standard (emphasis mine):
The WebSocket(url, protocols) constructor, when invoked, must run these steps:
Let urlRecord be the result of applying the URL parser to url.
If urlRecord is failure, then throw a "SyntaxError" DOMException.
If urlRecord's scheme is not "ws" or "wss", then throw a "SyntaxError" DOMException.
If urlRecord's fragment is non-null, then throw a "SyntaxError" DOMException.
If protocols is a string, set protocols to a sequence consisting of just that string.
If any of the values in protocols occur more than once or otherwise fail to match the requirements for elements that comprise the value of Sec-WebSocket-Protocol fields as defined by The WebSocket protocol, then throw a "SyntaxError" DOMException.
Run this step in parallel:
Establish a WebSocket connection given urlRecord, protocols, and the entry settings object. [FETCH]
NOTE If the establish a WebSocket connection algorithm fails, it triggers the fail the WebSocket connection algorithm, which then invokes the close the WebSocket connection algorithm, which then establishes that the WebSocket connection is closed, which fires the close event as described below.
Return a new WebSocket object whose url is urlRecord.
Note the establishment of the connection is run 'in parallel', and the specification further states that "...in parallel means those steps are to be run, one after another, at the same time as other logic in the standard (e.g., at the same time as the event loop). This standard does not define the precise mechanism by which this is achieved, be it time-sharing cooperative multitasking, fibers, threads, processes, using different hyperthreads, cores, CPUs, machines, etc."
Meaning that the connection can theoretically be opened before onOpen registration, even if onOpen(...) is the next statement after the constructor call.
However... the standard goes on to state under Feedback from the protocol:
When the WebSocket connection is established, the user agent must queue a task to run these steps:
Change the readyState attribute's value to OPEN (1).
Change the extensions attribute's value to the extensions in use, if it is not the null value. [WSP]
Change the protocol attribute's value to the subprotocol in use, if it is not the null value. [WSP]
Fire an event named open at the WebSocket object.
NOTE Since the algorithm above is queued as a task, there is no race condition between the WebSocket connection being established and the script setting up an event listener for the open event.
So in a browser or library that adheres to the HTML Standard, a callback registered via WebSocket.onOpen(...) is guaranteed to execute if it is registered before the end of the execution block in which the constructor is called, and before any subsequent statement in the same block that releases the event loop (e.g. await).
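A hedged illustration of that guarantee (the URL is hypothetical):
var ws = new WebSocket('wss://example.com/feed'); // hypothetical server
ws.onopen = function() {
    console.log('open always observed'); // same execution block: cannot be missed
};
// Risky variant: anything that releases the event loop before the handler is
// assigned (a timer, or an await in an async function) reintroduces the race:
//     var ws2 = new WebSocket('wss://example.com/feed');
//     await loadConfig();                // the queued open task may run here
//     ws2.onopen = function() { ... };   // possibly too late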
@leggetter is right, the following code does execute sequentially:
(function(){
    ws = new WebSocket("ws://echo.websocket.org");
    ws.addEventListener('open', function(e){
        console.log('open', e);
        ws.send('test');
    });
    ws.addEventListener('message', function(e){ console.log('msg', e) });
})();
But in the W3C spec there is a curious line:
Return a new WebSocket object, and continue these steps in the background (without blocking scripts).
It was confusing for me when I was learning the browser API. I assume that user agents ignore it, or that I am misinterpreting it.
Pay attention to the fact that I/O may occur within the scope of execution.
For example, in the following code
var ws = new WebSocket("ws://localhost:8080/WebSockets/example");
alert("Hi");
ws.onopen = function(){
    writeToScreen("Web Socket is connected!!" + "<br>");
};

function writeToScreen(message) {
    var div = document.getElementById('test');
    div.insertAdjacentHTML('beforeend', message);
}
Here, the message "Web Socket is connected" will appear or not, depending on how much time it takes you to close the "Hi" alert.
No actual I/O will happen until after your script finishes executing, so there should not be a race condition.

IE, XDomainRequest does not always work

I am trying to do cross-domain requests on IE.
I used XDomainRequest and implemented logging for all events (onerror, onload, onprogress and ontimeout) to monitor progress.
It works sometimes, but not always (on one computer, IE9, same site, same request, 1 out of 3 or 4 works; on another computer, IE8, maybe 1 out of 2 works). I didn't get any useful information from the logging, because nothing was triggered.
I am very confused. Is there any debugging tool for IE? Why does XDomainRequest sometimes just not work?
Thanks a lot
coronin
There are at least two significant bugs in the XDomainRequest object, one that affects IE8 and another that affects IE9.
Issue 1 - Garbage Collection
In Internet Explorer 8, the XDomainRequest object is incorrectly subject to garbage collection after send() has been called but not yet completed. The symptoms of this bug are the Developer Tools' network trace showing "Aborted" for the requests and none of the error, timeout, or success event handlers being called.
Typical AJAX code looks a bit like this:
function sendCrossDomainAjax(url, successCallback, errorCallback) {
    var xdr = new XDomainRequest();
    xdr.open("get", url);
    xdr.onload = function() { successCallback(); }
    xdr.onerror = function() { errorCallback(); }
    xdr.send();
}
In this example, the variable containing the XDomainRequest goes out of scope. If the user is unlucky, IE's JavaScript garbage collector will run before send() asynchronously completes, and the request will be aborted. Even though the XDomainRequest object is captured in the OnLoad and OnError event handlers, IE will see that the entire object graph has no references to it and will garbage collect it. IE should be "pinning" the object until complete.
You'll notice quite a few other discussions on the internet mentioning that placing a setTimeout around the xdr.send() call will somehow "solve" mysterious XDomainRequest failures. This is a kludge, and completely incorrect. All that's happening is that the XDomainRequest object is being "pinned" into the setTimeout closure and is not subject to garbage collection as quickly. It doesn't solve the problem.
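For reference, the kludge in question looks roughly like this (a sketch, not a recommendation):
function sendCrossDomainAjax(url, successCallback, errorCallback) {
    var xdr = new XDomainRequest();
    xdr.open("get", url);
    xdr.onload = function() { successCallback(); }
    xdr.onerror = function() { errorCallback(); }
    setTimeout(function() {
        xdr.send(); // xdr is pinned in this closure, so GC is merely delayed
    }, 0);
}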
To correctly work around this issue, ensure the XDomainRequest is stored in a global variable until the request completes. For example:
var pendingXDR = [];

function removeXDR(xdr) {
    // indexOf isn't always supported, you can also use jQuery.inArray()
    var index = pendingXDR.indexOf(xdr);
    if (index >= 0) {
        pendingXDR.splice(index, 1);
    }
}

function sendCrossDomainAjax(url, successCallback, errorCallback) {
    var xdr = new XDomainRequest();
    xdr.open("get", url);

    xdr.onload = function() {
        removeXDR(xdr);
        successCallback();
    }

    xdr.onerror = function() {
        removeXDR(xdr);
        errorCallback();
    }

    xdr.send();
    pendingXDR.push(xdr);
}
Issue 2 - Missing OnProgress EventHandler
This second issue is already known. Internet Explorer 9 introduced a regression in the XDomainRequest object where a missing (null) OnProgress event handler would cause the request to abort when it tries to report progress information.
For fast requests, IE9 never attempts to call the OnProgress event handler and the request succeeds. Certain conditions, such as when IE delays the request due to too many open connections, network latency, slow server responses, or large request or response payloads will cause IE9 to start to report progress information.
IE9 tries to call the event handler without first checking it exists, and the XDomainRequest object crashes and destroys itself internally.
To solve this issue, always ensure an event handler is attached to OnProgress. Given the bug, it's not a bad idea to defensively add event handlers to all of the object's events.
var xdr = new XDomainRequest();
xdr.open("get", url);
xdr.onprogress = function() { };
// register the other event handlers defensively as well
xdr.ontimeout = function() { };
xdr.onerror = function() { };
xdr.onload = function() { /* handle success */ };
xdr.send();
Other Issues
I've seen reports that XDomainRequest can fail if the event handlers are registered before .open() is called. Again, defensively, it's not a bad idea to register them between the .open() and .send() calls. I haven't personally verified whether it's an actual bug.
If you run into an "Access Denied" error, it's because XDomainRequest doesn't allow mismatched URI schemes between the target and the host page. In other words, don't try to call an HTTP resource from an HTTPS page.
Beware of most of the XDomainRequest libraries on the internet. I looked at most of the popular ones, such as the various jQuery AJAX transport plugins (including the ones linked in another answer here).
And, of course, XDomainRequest is subject to all of its normal limitations and constraints. These aren't bugs per se, and compared with the alternatives (iframe kludges, Flash crossdomain.xml transports) they're not that bad.
I've posted a new jQuery AJAX XDomainRequest transport under a public domain license here: https://github.com/ebickle/snippets/tree/master/javascript/xdomainrequest
Had the exact same question. Short solution:
Use this code: https://github.com/jaubourg/ajaxHooks/blob/master/src/ajax/xdr.js
UPDATE: the link is broken; find jaubourg's fix here: https://github.com/jaubourg/ajaxHooks/blob/master/src/xdr.js
Add xdr.onprogress = function() {}; in that xdr.js file
Details can be found in the jQuery ticket discussion here:
http://bugs.jquery.com/ticket/8283
The last reply there included that xdr.onprogress fix, which originated from this bug discussion, aptly titled
"IE9 RTM - XDomainRequest issued requests may abort if all event handlers not specified":
http://social.msdn.microsoft.com/Forums/en-US/iewebdevelopment/thread/30ef3add-767c-4436-b8a9-f1ca19b4812e

Why does recursion through XHR's asynchronous onreadystatechange event consume the stack?

I'm getting a stack overflow error on some, but not all, IE7 machines.
This function downloads a bunch of URL-based resources and does nothing with them. It runs on my login page, and its purpose is to fetch static content while you're typing in your credentials, so that when you really need it, the browser can get it from its local cache.
// Takes an array of resource URLs and preloads them sequentially,
// but asynchronously, using an XHR object.
function preloadResources(resources) {
    // Kick it all off.
    if (resources.length > 0) {
        var xhr = getXHRObject(); // Prepare the XHR object which will be reused for each request.
        xhr.open('GET', resources.shift(), true);
        xhr.onreadystatechange = handleReadyStateChange;
        xhr.send(null);
    }

    // Handler for the XHR's onreadystatechange event. Loads the next resource, if any.
    function handleReadyStateChange() {
        if (xhr.readyState == 4) {
            if (resources.length > 0) {
                xhr.open('GET', resources.shift(), true);
                xhr.onreadystatechange = arguments.callee;
                xhr.send(null);
            }
        }
    }

    // A safe cross-browser way to get an XHR object.
    function getXHRObject() {
        // Clipped for clarity.
    }
} // End preloadResources().
It's called like this:
preloadResources([
    'http://example.com/big-image.png',
    'http://example.com/big-stylesheet.css',
    'http://example.com/big-script.js']);
It recursively processes an array of URLs. I thought it wasn't susceptible to stack overflow errors because each recursion is called from an asynchronous event -- the XHR's onreadystatechange event (notice that I'm calling xhr.open() asynchronously). I felt doing so would prevent it from growing the stack.
I don't see how the stack is growing out of control. Where did I go wrong?
Doing the recursion with a timer prevented the stack overflow problem from appearing.
// Handler for the XHR's onreadystatechange event. Loads the next resource, if any.
function handleReadyStateChange() {
    if (xhr.readyState == 4 && resources.length > 0) {
        setTimeout(function() {
            xhr.open('GET', resources.shift(), true);
            xhr.onreadystatechange = handleReadyStateChange;
            xhr.send(null);
        }, 10);
    }
}
I guess chaining XHR requests to one another consumes the stack. Chaining them together with a timer prevents that -- at least in IE7. I haven't seen the problem on other browsers, so I can't tell.
Can you perhaps console.log or alert the value of arguments.callee? I'm curious what would happen if it resolves to the arguments variable from the preloadResources() function instead of handleReadyStateChange(). That doesn't seem likely to me, but it jumps out at me just from eyeballing your code.
In answer to your question, however - I think one bad practice in the code above is reusing the XmlHttpRequest object, especially without ever letting the lifecycle complete or calling xhr.abort(). That's a no-no I tucked away a while back. It's discussed here and in various places around the web. Note that IE in particular doesn't work well with reusing the xhr. See http://ajaxian.com/archives/the-xmlhttprequest-reuse-dilemma .
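Along those lines, a hedged sketch of the preloader using a fresh XMLHttpRequest per resource, so each object runs its lifecycle to completion (getXHRObject is the question's own helper):
function preloadResources(resources) {
    if (resources.length === 0) return;
    var xhr = getXHRObject();             // fresh object for every request
    xhr.open('GET', resources.shift(), true);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            preloadResources(resources);  // next resource gets its own XHR
        }
    };
    xhr.send(null);
}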
Hope this helps,
Scott

Firefox not starting onreadystatechange function

I made some JavaScript code for my website; it works without problems on Opera and Chrome, but not on Firefox.
Here is the script:
function checkstate(who, row, cell) {
    var zwrot = "";
    var mouseEvent = 'onmouseover="javascript:bubelon(this.id);" onmouseout="bubeloff();"';
    var cellid = "";
    ajax = new XMLHttpRequest();
    ajax.onreadystatechange = function(aEvt) {
        if (ajax.readyState === 4 && ajax.status === 200) {
            alert("im here!");
        }
    };
    ajax.open('GET', "oth/work_prac_stan.php?usr=" + who, false);
    ajax.send();
}
function sprawdzstan() {
    var lol = "";
    var table = document.getElementById("usery");
    var re = /^<a\shref\=/g;
    for (var i = 1, row; row = table.rows[i]; i++) {
        if (row.cells[0].innerHTML.match(re)) {
            checkstate(row.cells[1].innerHTML, row, 2);
        } else {
            checkstate(row.cells[0].innerHTML, row, 1);
        }
    }
}
The problem is that Firefox is not running the function assigned to onreadystatechange. I checked in Firebug that the response from the PHP file is correct.
Where is the problem? It works on Chrome and Opera; Firefox just doesn't. No error in the console, nothing.
Updated answer
According to Mozilla's docs, you don't use onreadystatechange with synchronous requests. Which kind of makes sense, since the request doesn't return until the ready state is 4 (completed), though I probably wouldn't have designed it that way.
Original answer
Not immediately seeing a smoking gun, but: Your ajax variable is not defined within the function, and so you're almost certainly overwriting it on every iteration of the loop in sprawdzstan. Whether that's a problem remains to be seen, since you're using a synchronous ajax call. In any case, add a var ajax; to checkstate to ensure that you're not falling prey to the Horror of Implicit Globals.
Off-topic: If you can possibly find a way to refactor your design to not use a synchronous ajax request, strongly recommend doing that. Synchronous requests lock up the UI of the browser (to a greater or lesser degree depending on the browser, but many — most? — completely lock up, including other unrelated tabs). It's almost always possible to refactor and use an asynchronous request instead.
Off-topic 2: You aren't using mouseEvent in your code, but if you were, you would want to get rid of those javascript: prefixes on the onmouseover and onmouseout attributes. Those attributes are not URLs, the prefix is not (there) a protocol specifier (it's a label, which you're not using).
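Per that recommendation, a minimal async rewrite of the question's checkstate might look like this (a sketch, reusing the question's own URL):
function checkstate(who, row, cell) {
    var ajax = new XMLHttpRequest();   // declared with var: no implicit global
    ajax.onreadystatechange = function(aEvt) {
        if (ajax.readyState === 4 && ajax.status === 200) {
            alert("im here!");
        }
    };
    ajax.open('GET', "oth/work_prac_stan.php?usr=" + who, true); // true = async
    ajax.send();
}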
For those who still encounter this problem...
You can use the code below. What I did was remove the function
ajax.onreadystatechange = function(aEvt) {
and move the alert("im here!"); to after the ajax.send();
ajax = new XMLHttpRequest();
ajax.open('GET', "oth/work_prac_stan.php?usr=" + who, false);
ajax.send();
alert("im here!");
