ajax website memory usage accumulation / Chrome - javascript

I have an ajax/pushstate website which loads data (pages) through ajax. In Chrome (perhaps other browsers also), I have noticed that memory consumption accumulates when navigating between pages, especially pages with lots of images. You can check the website in question here:
mjau-mjau.com
I know there is a lot of information about general JavaScript memory leaks and memory management, but this issue seems directly related to pages that contain a lot of images (often large ones). It is as if the image files don't get flushed from memory after their HTML context is replaced when navigating between pages.
Is there anything I might be overlooking? Shouldn't the browser automatically reclaim memory for HTML that has been removed, including images? Could it be related to GPU layers somehow, when transitioning between pages with transforms?
Suggestions are appreciated.

I tracked down this bug and found that it only occurs in Chrome. Furthermore, it seems directly related to lazy-loaded images and ajax pages. Loaded images are not removed from memory when they are removed from the DOM after an ajax call. I managed to find a hack which somehow clears them from memory before the DOM is updated with new data.
$('#content').find("img").off().attr('src', '').remove();
I am not sure if all of the above are required, but the combo works to a fair degree. The strange thing is, the memory is not reclaimed until another page with images replaces the #content DOM element. Buggy memory management in Chrome!
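For context, here is a slightly fuller sketch of the same hack, assuming jQuery and the #content container used above (the replaceContent name is made up for illustration):

// Sketch only: blank out image sources before swapping in the new ajax page,
// so Chrome can release the decoded image data. Assumes jQuery and #content.
function replaceContent(newHtml) {
    var $content = $('#content');
    // Unbind handlers and clear src before removing the nodes.
    $content.find('img').off().attr('src', '').remove();
    // Now swap in the markup for the new page.
    $content.html(newHtml);
}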

Related

MobileSafari crashing due to excessive memory consumption

I'm currently working on an application that utilises SoundJS. I inherited the codebase after it was found not to work correctly on an iPad - the issue being that it creates a manifest of around 16 MP3 files, totalling approximately 35.7MB. This obviously causes the iPad issues, and at 16MB it crashes.
The crash log shows it's due to the Per Process Limit according to the Diagnostics and Usage Logs.
I've done some digging in to the underlying structure of SoundJS and can see it's default behaviour is to utilise WebAudio, via a XHR. This is then parsed as an ArrayBuffer (or an array of ArrayBuffers).
At the moment this means that, after preloading, we have 35.7MB of data in ArrayBuffers - not good. This is after crunching the file size down! There will only ever be one audio file playing at any one time - one file per section of the app - apart from during transitions, where two may be fading into each other.
Is there an easy way to free up the resources from the underlying structure, i.e. the ArrayBuffers? As far as I'm aware, the previous developer did try using calls to the SoundJS .removeSound() method to free up some memory, but the results weren't good.
At the moment I'm looking at creating an object acting as a registry of all the filenames and, rather than loading them through a manifest, loading them individually and removing them as soon as they are used. However, I'm expecting this to cause headaches with easing one file into another during playback. Furthermore, I expect it may actually result in a problem akin to the images one, where MobileSafari didn't release the memory allocated to an image even after deletion. (The correct fix being to reset the 'src' attribute of the image element prior to deletion.)
Does anyone know of a surefire workaround for loading such large amounts of data in a web app targeting iPad?
Testing SoundJS has shown some issues with the iPad not releasing memory properly. Unfortunately, from a library perspective, there isn't much we can do about it.
If you only ever play sounds one at a time, I would suggest loading them only as you need them and removing them after use. The biggest issue you will find with that is waiting for a sound to load, so you may want to implement a smart preload of the next sound you expect to use (meaning you always have the current and the next sound loaded). In theory this can keep you below the iPad's 16MB memory limit. However, if the iPad refuses to free up the memory, you may need to introduce some form of cache busting.
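A rough sketch of that "current + next" pattern, using SoundJS's registerSound/removeSound/loadComplete calls (the playlist array and playSection function are made up for illustration; verify the API against the SoundJS version you ship):

var playlist = ["intro.mp3", "section1.mp3", "section2.mp3"]; // hypothetical file names

function playSection(i) {
    var current = playlist[i];
    function start() {
        createjs.Sound.play(current);
        // Preload the next sound so transitions don't wait on the network.
        if (playlist[i + 1]) createjs.Sound.registerSound(playlist[i + 1], playlist[i + 1]);
        // Drop the previous sound to keep decoded buffers under the memory limit.
        if (playlist[i - 1]) createjs.Sound.removeSound(playlist[i - 1]);
    }
    if (createjs.Sound.loadComplete(current)) { start(); return; }
    createjs.Sound.addEventListener("fileload", function handleLoad(event) {
        if (event.src !== current) return;
        createjs.Sound.removeEventListener("fileload", handleLoad);
        start();
    });
    createjs.Sound.registerSound(current, current);
}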
Another solution would be to reduce the file size through lossy compression, which it sounds like has already been attempted.
A third option might be implementing some form of streaming audio, but this is not something SoundJS can help with.
Hope that helps.

HTML5 LocalStorage seems to become corrupt

I'm using BootUp.js (https://github.com/TradeMe/bootup.js) to load and store CSS and JavaScript files in HTML5 LocalStorage. The site is mobile focused, so the time saving and speed boost this creates is great! However, I've noticed the odd occasion where the CSS (never noticed it with JS) becomes corrupt in storage, so the site renders horribly until the storage is cleared and the CSS files are refetched from the server.
I've seen this happen very sporadically on Safari on an iPhone 4 (iOS 6), Chrome on a Galaxy S3 and Chrome on a Nexus 7 - so it doesn't seem to be limited to any particular device, browser or OS. Is this an issue any one has come across before? Is it possible that the data has just somehow become corrupt? Are there any known issues with WebKit (I guess) that could cause it?
I'm planning to implement a work-around by storing some kind of checksum that can be generated in JS to ensure the data is fully there. If not, clear it out and fetch from the server.
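A rough sketch of that checksum idea (this is not part of BootUp.js's API; the function names here are made up for illustration):

function simpleHash(str) {
    // Cheap 32-bit rolling hash; enough to detect truncated or garbled CSS.
    var hash = 0;
    for (var i = 0; i < str.length; i++) {
        hash = ((hash << 5) - hash + str.charCodeAt(i)) | 0;
    }
    return String(hash);
}

function saveCss(key, cssText) {
    localStorage.setItem(key, cssText);
    localStorage.setItem(key + ':hash', simpleHash(cssText));
}

function loadCss(key) {
    var css = localStorage.getItem(key);
    if (css === null || localStorage.getItem(key + ':hash') !== simpleHash(css)) {
        localStorage.removeItem(key); // corrupt or missing - refetch from the server
        return null;
    }
    return css;
}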
I'd first use this:
http://arty.name/localstorage.html
Mobile browsers tend to cut back on storage space due to obvious memory limitations. Your CSS and Javascript might be too big, even when minified.
The other thing I can think of for this behavior is that localStorage might become corrupt when a save is started and the page is refreshed at the same time. I'm not familiar with the exact workings of browsers, but I'm guessing they might stop a save in the middle.
Also, have a look here:
http://hacks.mozilla.org/2012/03/there-is-no-simple-solution-for-local-storage/

browser (javascript) resource problems

I've lately been running into odd issues, which I'm starting to think are related to resource starvation in the browser.
In FF:
I'd been testing one of our web apps and suddenly things that should disappear after a couple of seconds stopped disappearing. I tracked it back to setTimeout just flat out refusing to work. After reloading the browser it was all clear, no issues.
In IE:
I regularly see issues where IE will refuse to do transparency all of a sudden; simply reloading the page clears this up.
In both:
Though I can't say it's related for sure, I see unexplainable behavior, things along the lines of variables not being available (undefined) when they should be.
Both browsers also show a steady increase in memory usage over time (memory leaks).
The JavaScript in the web app is heavy and it is a single-load page (making those memory issues mentioned all the more painful). There is a lot of inefficiency, and various things that make one say "why would you do that?".
Has anyone encountered such things? Can you point out general resources that will help identify and resolve these issues?
You could try running your application against the Chrome Profiler (http://code.google.com/chrome/devtools/docs/overview.html). You can profile the CPU and get snapshots of the browser heap, which should help locate any rogue stuff.
If your application is designed to work with Internet Explorer, the Developer Toolbar also has a profiler.

jQuery or javascript to find memory usage of page

Is there a way to find out how much memory is being used by a web page, or by my jquery application?
Here's my situation:
I'm building a data heavy webapp using a jquery frontend and a restful backend that serves data in JSON. The page is loaded once, and then everything happens via ajax.
The UI provides users with a way to create multiple tabs within the UI, and each tab can contain lots and lots of data. I'm considering limiting the number of tabs they can create, but was thinking it would be nice to only limit them once memory usage has gone above a certain threshold.
Based on the answers, I'd like to make some clarifications:
I'm looking for a runtime solution (not just developer tools), so that my application can determine actions based on memory usage in a user's browser.
Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures.
2015 Update
Back in 2012 this wasn't possible if you wanted to support all major browsers in use. Unfortunately, right now this is still a Chrome-only feature (a non-standard extension of window.performance).
window.performance.memory
Browser support: Chrome 6+
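A minimal example of reading it (feature-detect first, since it is non-standard and only present in Chrome):

if (window.performance && performance.memory) {
    var mem = performance.memory; // non-standard, Chrome only
    console.log('used JS heap:  ' + mem.usedJSHeapSize + ' bytes');
    console.log('total JS heap: ' + mem.totalJSHeapSize + ' bytes');
    console.log('heap limit:    ' + mem.jsHeapSizeLimit + ' bytes');
}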
2012 Answer
Is there a way to find out how much memory is being used by a web page, or by my jquery application? I'm looking for a runtime solution (not just developer tools), so that my application can determine actions based on memory usage in a user's browser.
The simple but correct answer is no. Not all browsers expose such data to you, and I think you should drop the idea simply because the complexity and inaccuracy of a "handmade" solution may introduce more problems than it solves.
Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures.
If you really want to stick with your idea you should separate fixed and dynamic content.
Fixed content is not dependent on user actions (memory used by script files, plugins, etc.)
Everything else is considered dynamic and should be your main focus when determining your limit.
But there is no easy way to summarize them. You could implement a tracking system that gathers all this information; all operations should call the appropriate tracking methods, e.g. (a rough sketch follows after the list):
Wrap or overwrite jQuery.data method to inform the tracking system about your data allocations.
Wrap html manipulations so that adding or removing content is also tracked (innerHTML.length is the best estimate).
If you keep large in-memory objects they should also be monitored.
As for event binding you should use event delegation and then it could also be considered a somewhat fixed factor.
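A rough sketch of what such a tracker could look like for the first two points, assuming jQuery (the tracker object and the character-count heuristic are made up for illustration):

var tracker = { chars: 0 }; // very rough proxy for memory, counted in characters

var originalData = jQuery.fn.data;
jQuery.fn.data = function (key, value) {
    if (value !== undefined) {
        try { tracker.chars += JSON.stringify(value).length; } catch (e) { /* non-serializable value */ }
    }
    return originalData.apply(this, arguments);
};

var originalHtml = jQuery.fn.html;
jQuery.fn.html = function (content) {
    if (content !== undefined) {
        this.each(function () { tracker.chars -= this.innerHTML.length; }); // content being replaced
        tracker.chars += String(content).length;                            // content being added
    }
    return originalHtml.apply(this, arguments);
};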
Another aspect that makes it hard to estimate your memory requirements correctly is that different browsers may allocate memory differently (for Javascript objects and DOM elements).
You can use the Navigation Timing API.
Navigation Timing is a JavaScript API for accurately measuring performance on the web. The API provides a simple way to get accurate and detailed timing statistics—natively—for page navigation and load events.
window.performance.memory gives access to JavaScript memory usage data.
Recommended reading
Measuring page load speed with Navigation Timing
This question is 5 years old, and both JavaScript and browsers have evolved incredibly in this time. Since this is now possible (in at least some browsers), and this question is the first result when you Google "javascript show memory useage", I thought I'd offer a modern solution.
memory-stats.js: https://github.com/paulirish/memory-stats.js/tree/master
This script (which you can run at any time on any page) will display the current memory usage of the page:
var script=document.createElement('script');
script.src='https://rawgit.com/paulirish/memory-stats.js/master/bookmarklet.js';
document.head.appendChild(script);
I don't know of any way to actually find out how much memory is being used by the browser, but you might be able to use a heuristic based on the number of elements on the page. Using jQuery, you could do $('*').length and it will give you the count of DOM elements. Honestly, though, it's probably easier just to do some usability testing and come up with a fixed number of tabs to support.
Use the Chrome Heap Snapshot tool
There's also a Firebug tool called MemoryBug, but it doesn't seem very mature yet.
If you just want to see it for testing, there is a way in Chrome via the developer tools to track memory use, but I'm not sure how to do it in JavaScript directly.
I would like to suggest an entirely different solution from the other answers, namely to observe the speed of your application and, once it drops below defined levels, either show tips to the user to close tabs, or disable new tabs from opening. A simple class which provides this kind of information is, for example, https://github.com/mrdoob/stats.js .
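A minimal usage sketch, with method names taken from the stats.js README (check them against the version you use; it assumes the stats.js script is already included on the page):

var stats = new Stats();
stats.showPanel(0); // 0: fps panel
document.body.appendChild(stats.dom);

function animate() {
    stats.begin();
    // ... the work you want to measure goes here ...
    stats.end();
    requestAnimationFrame(animate);
}
requestAnimationFrame(animate);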
Aside from that, it might not be wise for such an intensive application to keep all tabs in memory in the first place. E.g. keeping only the user state (scroll position) for all but the last two tabs, and reloading the data when those older tabs are opened again, might be a safer option.
Lastly, WebKit developers have been discussing adding memory information to JavaScript, but they have gotten into a number of arguments about what should and should not be exposed. Either way, it's not unlikely that this kind of information will be available in a few years (although that information isn't too useful right now).
Perfect question timing with me starting on a similar project!
There is no accurate way of monitoring JS memory usage in-app, since it would require higher-level privileges. As mentioned in the comments, checking the number of all elements etc. would be a waste of time since it ignores bound events etc.
This would be an architecture issue if memory leaks manifest or unused elements persist. Making sure that closed tabs' content is deleted completely, without lingering event handlers etc., would be perfect; assuming that's done, you could just simulate heavy usage in a browser and extrapolate the results from memory monitoring (type about:memory in the address bar).
Protip: if you open the same page in IE, FF, Safari... and Chrome, and then navigate to about:memory in Chrome, it will report memory usage across all the other browsers. Neat!
What you might want to do is have the server keep track of their bandwidth for that session (how many bytes of data have been sent to them). When they go over the limit, instead of sending data via ajax, the server should send an error code which javascript will use to tell the user they've used too much data.
You can get document.documentElement.innerHTML and check its length. It gives you a rough estimate of the size of your page's markup.
This may not work in all browsers, so you can enclose all your body elements in a giant div and read innerHTML on that div. Something like <body><div id="giantDiv">...</div></body>
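For example (a rough character count of the markup, not a true memory figure; the giantDiv id is the one suggested above):

var giantDiv = document.getElementById('giantDiv');
var approxSize = giantDiv ? giantDiv.innerHTML.length : document.documentElement.innerHTML.length;
console.log('approximate markup size: ' + approxSize + ' characters');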

JavaScript being loaded asynchronously in Firefox 3 (according to Firebug)?

I'm trying to profile the performance of a web site that I'm fairly confident is being slowed down by the loading of JavaScript files on the page.
The same JavaScript files are included several times on the page, and <script /> tags are scattered throughout the page instead of being included at the bottom.
As I suspected, when looking at FireBug's "Net" tab, most of the time (not all) when JavaScript is being loaded, no other files are requested. The browser waits for the JavaScript to complete loading.
There are a few exceptions however. There are a few occasions where JavaScript is loaded, but then at the same time, other resources appear to get loaded, such as other JavaScript files and images.
I always thought that JavaScript blocks the loading of other resources on the page. Am I incorrect in thinking this, or does this behavior vary depending on the browser or browser version?
UPDATE:
To those who have explained how loading a script blocks the loading of other resources, I'm already aware of this. My question is why a script wouldn't block the loading of other resources. Firebug is showing that some JavaScript files do not block loading other resources. I want to know why this would happen.
JavaScript resource requests are indeed blocking, but there are ways around this (to wit: DOM-injected script tags in the head, and AJAX requests), which, without seeing the page myself, is likely what's happening here.
Including multiple copies of the same JS resource is extremely bad but not necessarily fatal, and is typical of larger sites that might have accreted from the work of separate teams, or just plain old bad coding, planning, or maintenance.
As for Yahoo's recommendation to place scripts at the bottom of the body, this improves perceived response times and can improve actual loading times to a degree (because all the previous resources are allowed to load asynchronously first), but it will never be as effective as non-blocking requests (though those come with a high barrier of technical capability).
Pretty decent discussion of non-blocking JS here.
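For reference, a minimal sketch of the DOM-injected script technique mentioned above (the file path is a placeholder):

var s = document.createElement('script');
s.type = 'text/javascript';
s.src = '/js/example.js'; // placeholder path
document.getElementsByTagName('head')[0].appendChild(s); // does not block HTML parsing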
I'm not entirely sure that Firebug offers a true reflection of what is going on within the browser. Its timing for resource loading seems to be good, but I am not sure that it is acting as a true reflection of exactly what is going on. I've had better luck using HTTP sniffers/proxy applications to monitor the actual HTTP requests coming from the browser. I use Fiddler, but I know there are other tools out there as well.
In short, this may be an issue with the tool and not with how resources are actually being loaded... at least worth ruling out.
I suppose you're using Firefox 3.0.10 and Firebug 1.3.3 since those are the latest releases.
The Firebug 1.4 beta has made many improvements to the Net tab, but it requires Firefox 3.5. If you want to test it in Firefox 3.0, use one of the previous 1.4 alpha versions. But even with the improvements I still struggle to understand the results. I wish the Firebug developers would document more precisely what each part of a download means. It doesn't make sense to me why queuing comes after connecting.
My conclusion was not to trust the results in Firebug, and I ended up using WebPageTest. Which was surprisingly good to come from AOL ;-)
Also, what kind of resources are being loaded at the same time as the JavaScript? Try tracing down the resources that are loaded at the same time and see if they're referenced in a CSS/iframe/HTML-ajax. I'm guessing the reason nothing else is loaded is that the browser stops parsing the current HTML when it sees a script tag (without defer). Since it can't continue parsing the HTML, it has nothing more to request.
If you could provide a link to the page you're talking about, it would help everyone give a more precise answer.
I believe that the content is downloaded, but not rendered until the JavaScript has finished loading.
This is, from the server's POV, not much of a deal, but to the user it can make a huge difference in speed.
If you think about it, a script tag has to finish processing before the browser can continue to render content. What if the tag used document.write or some other wonderfully dumb thing? Until everything within the script tag has finished running, the page can't be sure what it's going to display.
Browsers usually have a set number of connections opened to a single domain.
So, if you load all your scripts from the same domain you will usually load them one after the other.
But, if those scripts are loaded from several domains, they will be loaded in parallel.
The reason the browser is blocking during JavaScript downloading is that the browser suspects that there will be DOM nodes created inside the script.
For example, there might be "document.write()" calls inside the script.
A way to hint to the browser that the script does not contain any DOM generation is with the "defer" attribute.
So,
<script src="script.js" type="text/javascript" defer="defer"></script>
should allow the browser to continue parallelizing the requests.
References:
http://www.w3.org/TR/REC-html40/interact/scripts.html#adef-defer
http://www.websiteoptimization.com/speed/tweak/defer/
As others have stated, the script is probably loading other resources through DOM injection.
Script.aculo.us actually loads its child components/scripts itself by doing this -- injecting other <script> tags into the DOM for them.
If you want to see whether or not this is the case, use Firebug's profiler and take a look at what the scripts are doing.
Like others said, one non-blocking way is to inject <script> tags in the page head.
But Firefox can also execute loaded <script>s in parallel:
Copy the two lines below:
http://streetpc.free.fr/tmp/test.js
http://streetpc.free.fr/tmp/test2.js
Then go to this page, paste them into the input textarea, click "JavaScript", then "Load Scripts" (which builds and adds a <script> child element to the head).
Try that in FF: you'll see "test2 ok"; move the dialog box to see "test ok".
In other browsers, you should see "test ok" (with no other dialog behind it) and then "test2 ok" (except for Safari 4, which showed me test2 before test).
Firefox 3 introduced a connection parallelism feature to improve performance while loading a web page; I bet this is the source of your problem ;)
When you open a web page that has many different objects on it, like images, Javascript files, frames, data feeds, and so forth, the browser tries to download several of them at once to get better performance.
Here's the ZDNET blogpost about it.
