I've been messing with the Chrome heap profiler and some basic Backbone objects just to get an idea of how garbage collection behaves, and I ran into some curious and confusing behavior.
I'm using a base page that has only jquery-1.7.2, underscore-1.3.3, and backbone-0.9.2 loaded.
If I run the following code from a Chrome (v19 on Windows) JavaScript console, the collection is never garbage collected. In the heap profiler, the object's retaining tree lists the 50 users and the circular references between the models and the collection, but it also lists two entries that say "b in function() #234234". Expanding those out, the references appear to come from jQuery's ajax callback handlers.
(function () {
    var collection = new Backbone.Collection();
    collection.url = '/api/users/';
    collection.fetch();
    setTimeout(function wtf() { $('#content').append($('<div>' + collection.models[0].get('name') + '</div>')); }, 3000);
})();
If I run the following code instead, the collection is properly garbage collected, as expected. The only difference is the last line, where I make a throwaway ajax call that seems to shift jQuery's ajax callbacks off the collection.
(function () {
    var collection = new Backbone.Collection();
    collection.url = '/api/users/';
    collection.fetch();
    setTimeout(function wtf() { $('#content').append($('<div>' + collection.models[0].get('name') + '</div>')); }, 3000);
    $.ajax({ url: '/api/users/' });
})();
Is this expected behavior, a bug, or what? It'd suck to have random objects hanging around just because they made ajax requests at some point in the past.
Update: I tested this further, and it appears to affect only the first instance or two. If I create a setInterval that creates a new collection and calls fetch every second, the first 1-2 instances created remain in memory indefinitely (20+ minutes) while every other instance is disposed of properly.
If I call jQuery's ajax function directly and, in the success handler, hand the data to the new collection as a constructor argument, this behavior does not occur at all: every instance created is cleaned up properly.
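For reference, that workaround looks roughly like this (the URL and handler body are placeholders):

$.ajax({
    url: '/api/users/',
    dataType: 'json',
    success: function (data) {
        // Hand the parsed response straight to the constructor, so
        // fetch() never runs and Backbone never wraps success/error.
        var collection = new Backbone.Collection(data);
        // ... use the collection ...
    }
});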
It appears that Backbone.wrapError is part of the problem, along with the corresponding wrapping of the success callback.
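For context, this is roughly what Backbone 0.9.2 does inside Collection.fetch (paraphrased from the 0.9.2 source, not line-for-line):

// Paraphrased sketch of Backbone 0.9.2's Collection.fetch:
Backbone.Collection.prototype.fetch = function (options) {
    options = options ? _.clone(options) : {};
    var collection = this;
    var success = options.success;
    // Both replacement callbacks close over `collection`, so the whole
    // collection (and its models) stays reachable for as long as jQuery
    // retains the ajax settings object that holds them.
    options.success = function (resp, status, xhr) {
        collection[options.add ? 'add' : 'reset'](collection.parse(resp, xhr), options);
        if (success) success(collection, resp);
    };
    options.error = Backbone.wrapError(options.error, collection, options);
    return (this.sync || Backbone.sync).call(this, 'read', this, options);
};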
I was under the impression that all DOM manipulations were synchronous.
However, this code is not running as I expect it to.
RecordManager.prototype._instantiateNewRecord = function(node) {
    this.beginLoad();
    var new_record = new Record(node.data.fields, this);
    this.endLoad();
};

RecordManager.prototype.beginLoad = function() {
    $(this.loader).removeClass('hidden');
};

RecordManager.prototype.endLoad = function() {
    $(this.loader).addClass('hidden');
};
The Record constructor function is very large and it involves instantiating a whole bunch of Field objects, each of which instantiates some other objects of their own.
This results in a 1-2 second delay and I want to have a loading icon during this delay, so it doesn't just look like the page froze.
I expect the flow of events to be:
show loading icon
perform record instantiation operation
hide loading icon
Except the flow ends up being:
perform record instantiation operation
show loading icon
hide loading icon
So you never see the loading icon at all. I only know it loads briefly because the updates in the Chrome developer tools DOM viewer lag behind a little bit.
Should I be expecting this behavior from my code? If so, why?
Yes, this is to be expected. Although the DOM may have been updated, you won't see anything until the browser gets a chance to repaint. The repaint is queued the same way as everything else in the browser (i.e. it won't happen until the current block of JavaScript has finished executing), though pausing in a debugger will generally allow it to happen.
In your case, you can fix it using setTimeout with an immediate timeout:
RecordManager.prototype._instantiateNewRecord = function(node) {
    var self = this; // inside the timeout callback, `this` would otherwise be `window`
    this.beginLoad();
    setTimeout(function() {
        var new_record = new Record(node.data.fields, self);
        self.endLoad();
    }, 0);
};
This will allow the repaint to happen before executing the next part of your code.
JavaScript is always synchronous. Ajax calls and timers mimic multi-threaded behavior, but when a callback actually runs, it blocks as usual.
That said, you most likely have a setTimeout somewhere in that constructor (or in a method it uses), even if it's setTimeout(fn, 0).
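You can see the repaint point with a minimal, self-contained sketch (the #loader id is just an assumption for illustration):

// The class change and the busy-wait run in the same script block,
// so the browser never gets a chance to paint the un-hidden loader.
$('#loader').removeClass('hidden');
var start = Date.now();
while (Date.now() - start < 2000) { /* block the main thread for 2s */ }
$('#loader').addClass('hidden');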
In Ext.tree.TreePanel, when we load the tree, there is no event to check whether ALL the tree nodes are completely loaded.
What we do is recursively make asynchronous calls and let each node expand when its expanded property is true. How can we find out when all the asynchronous calls have completed and the TreePanel is completely loaded?
The idea behind this is that once the tree panel's nodes are completely loaded, we want to enable a button indicating that the tree is available to the end user for further operations.
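One possible approach, sketched under the assumption of the Ext JS 3 Ext.tree.TreeLoader API (button is a hypothetical Ext.Button), is to count outstanding loads via the loader's beforeload/load events and enable the button once the counter drops back to zero:

var pending = 0;
var loader = tree.getLoader(); // `tree` is your Ext.tree.TreePanel

loader.on('beforeload', function () {
    pending++;
    button.disable();
});
loader.on('load', function () { // you may also want to handle 'loadexception'
    pending--;
    if (pending === 0) {
        // Nothing left in flight: every expanded node has finished loading.
        button.enable();
    }
});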
Ext.Ajax.request(...) returns an object, and if that object contains an xhr property the request has not completed.
var req = Ext.Ajax.request(..);
if (!req.xhr) {
    // request is finished
} else {
    // request is not finished
}
That said, I would recommend returning a total count of your nodes with every request. So I'd return JSON like this:
{
    data: [...], // tree nodes
    totalCount: 100
}
Now you can check, in every request's success function, whether your tree store already contains all the nodes.
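A rough sketch of that check (the running counter, component id, and response shape are assumptions, not part of the answer):

var loadedCount = 0;

function onLoadSuccess(response) {
    var result = Ext.decode(response.responseText);
    loadedCount += result.data.length;
    if (loadedCount >= result.totalCount) {
        // The tree store now holds every node the server reports.
        Ext.getCmp('myButton').enable();
    }
}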
Thanks for your input, @JuHwon. However, your suggestion wasn't well suited to this scenario, because the Ajax calls were out of my hands: as I mentioned, the tree triggers the calls based on a property (maybe isSelectable = true, as far as I can recall now). So, to handle this use case, I used setTimeout and clearTimeout to dynamically push back a 2000 ms delay. I couldn't find anything closer.
I'm trying to have my Backbone application check the server as often as possible for updates to a model, similar to how Twitter's site automatically adds new tweets.
My current setup checks an external application through their API, so I have no access to their server. That leaves the client side to do the checking without being too memory-hungry. How can I achieve this?
In JavaScript the only way you can really control timing is through setTimeout/setInterval; there is no "more sophisticated" mechanism, unless you count helper functions (e.g. 'delay') which just wrap setTimeout/setInterval.
So, dmi3y's answer was correct. However, since you mentioned Backbone in both the tags and in the description, here's a more Backbone-ish version...
var YourModelClass = Backbone.Model.extend({url: remoteUrl});
var instance = new YourModelClass();
var seconds = 5;
window.setInterval(_.bind(instance.fetch, instance), 1000 * seconds);
or, if you wanted to build it into your class...
var YourModelClass = Backbone.Model.extend({
    url: remoteUrl,
    initialize: function() {
        var seconds = 5;
        window.setInterval(_.bind(this.fetch, this), 1000 * seconds);
    }
});
var instance = new YourModelClass();
It's also worth mentioning that setInterval returns an ID which you can pass to clearInterval if you want to stop "polling".
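Continuing the example above:

var seconds = 5;
var intervalId = window.setInterval(_.bind(instance.fetch, instance), 1000 * seconds);
// ... later, when updates are no longer needed (e.g. the view goes away):
window.clearInterval(intervalId);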
P.S. Just in case you're not familiar with _.bind: it comes from the Underscore library, which Backbone depends on, so you already have it. All it does is fix this in place, so that when your timeout/interval function fires, the this inside it will be the second argument to _.bind (and not window, which is what it would otherwise be).
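A quick illustration of the difference:

var fetch = instance.fetch;
// setTimeout(fetch, 1000);                  // `this` would be `window`; fetch breaks
setTimeout(_.bind(fetch, instance), 1000);   // `this` is `instance`, as intended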
A possible solution:
(function IcallTheShoots() {
    console.log('am I?'); // ...or any other way you're able to communicate with the server
    window.setTimeout(IcallTheShoots, 1500);
})();
Why setTimeout instead of setInterval? Because it makes sure the next cycle is started only when the current one has finished.
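Combining this with the Backbone version above, a hedged sketch might look like this (complete is jQuery's ajax option, which fetch passes through):

(function poll() {
    instance.fetch({
        // `complete` fires on success or error, so polling never stalls
        complete: function () {
            window.setTimeout(poll, 1500);
        }
    });
})();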
I am coding a Chrome extension.
I have a $.post() command that runs on a timer in the background (setInterval).
The callback invokes a parse function:
function parseData(new_data) {
    new_data = $.parseJSON(new_data);
    for (var x = 0; x < new_data.length; x++) {
        var obj = new CustomObj(new_data[x]);
        // I commented out code here in order to help isolate the problem.
    }
}
CustomObj is prototyped in a typical JS manner...
function CustomObj(data) {
    this.data = data;
}
CustomObj.prototype.getName = function() {
    return this.data.name;
};
// Of course, there are a few more methods here...
The problem:
The extension causes major lag. If I set it to do the $.post() every 10s (simply to make the problem show up faster), then within 5 minutes refreshing any tab in Chrome shows "Waiting on [My Extension]" for about 30s. Eventually the browser more or less locks up.
What I've discovered: if I comment out the innards of the for() loop, everything is just peachy; no lag, ever. If I simply put the above line back in the loop (creating the CustomObj), the problem returns.
It seems like a garbage collection problem, as far as I can tell. I've tried implicitly defining the obj variable as well as explicitly deleting it (though "Deleting Objects in JavaScript" makes me believe that delete is insufficient). Nothing seems to work.
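For what it's worth, delete removes object properties, not variables, which is likely why it has no effect here:

var obj = new CustomObj(data);
delete obj;   // false (no-op): `delete` cannot remove a declared variable
obj = null;   // dropping the reference is what lets the GC reclaim the object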
Thanks.
I am experiencing a slow memory leak in both IE and Firefox using a combination of ASP.NET AJAX and jQuery. My scenario is very similar to the one described in "Preventing AJAX memory leaks", except using jQuery and ASP.NET AJAX rather than Prototype. I have a webpage displaying data in an UpdatePanel that is refreshed every 60 seconds by a timer. In the ASP.NET AJAX pageLoad function, which is called on every "partial postback", I re-bind events because they are lost in the partial postback:
function pageLoad(sender, args) {
    $("#item").unbind();
    $("#item").hover(
        function() {
            // do something
        },
        function() {
            // do something
        });
}
This is called every 60 seconds. Could this alone be the cause of a memory leak?
Do this instead:
$(function() { // .ready shortcut...
    $("#item").live("mouseenter", function() {
        // do something
    }).live("mouseleave", function() {
        // do something
    });
});
Note that this requires jQuery 1.4.1, but it behaves entirely differently in terms of memory: it attaches one handler to the whole document and watches for the event to bubble, instead of attaching a new event handler to every object you're inserting every 60 seconds.
Yes, it could be.
The first thing to try would be to take the two functions defined there and (if possible) move them to a higher level so that they are only defined once.
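A minimal sketch of that suggestion:

// Define the handlers once, at a higher scope, so each partial postback
// reuses the same function objects instead of allocating fresh closures
// every 60 seconds.
function onItemEnter() {
    // do something
}
function onItemLeave() {
    // do something
}

function pageLoad(sender, args) {
    $("#item").unbind();
    $("#item").hover(onItemEnter, onItemLeave);
}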