MobileSafari crashing due to excessive memory consumption - javascript

I'm currently working on an application that utilises SoundJS. I inherited the codebase after it was found not to work correctly on an iPad - the issue being that it creates a manifest of around 16 MP3 files, totalling approximately 35.7MB. This obviously causes the iPad problems, and at around 16MB it crashes.
The crash log shows it's due to the Per Process Limit according to the Diagnostics and Usage Logs.
I've done some digging into the underlying structure of SoundJS and can see its default behaviour is to utilise WebAudio, loaded via XHR. This is then parsed as an ArrayBuffer (or an array of ArrayBuffers).
At the moment this means that, after preloading, we have 35.7MB of data in ArrayBuffers - not good. This is after crunching the file size down! There will only ever be one audio file playing at any one time - one file per section of the app - apart from during transitions, where two may be fading into each other.
Is there an easy way to free up the resources in the underlying structure, i.e. the ArrayBuffers? As far as I'm aware, the previous developer did try using calls to the SoundJS .removeSound() method to free up some memory, but the results weren't good.
At the moment I'm looking at creating an object acting as a registry of all the filenames and, rather than loading them through a manifest, loading them individually and removing them as soon as they are used. However, I'm expecting this to cause headaches with easing one file into another during playback. Furthermore, I expect it may result in a problem akin to the known image issue, where MobileSafari didn't release the memory allocated to an image even after deletion. (The correct fix there being to reset the 'src' attribute of the image element prior to deletion.)
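For reference, a minimal sketch of that image fix (the selector here is purely illustrative):
var img = document.querySelector('#section img');
// Clearing src before removal nudges MobileSafari to release the decoded image data
img.src = '';
img.parentNode.removeChild(img);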
Does anyone know of a surefire workaround for loading such large amounts of data in a web app targeting iPad?

Testing SoundJS has shown some issues with the iPad not releasing memory properly. Unfortunately, from a library perspective, there isn't much we can do about it.
If you only ever play sounds one at a time, I would suggest loading them only as you need them and removing them after use. The biggest issue you will find with that is waiting for a sound to load, so you may want to implement a smart preload of the next sound you expect to use (meaning you always have the current and the next sound loaded). In theory this can keep you below the iPad's 16MB memory limit. However, if the iPad refuses to free up the memory, you may need to introduce some form of cache busting.
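A minimal sketch of that pattern using SoundJS's registerSound/removeSound (the paths and ids are placeholders, and - as noted above - removeSound's effectiveness on iPad is exactly what's in question):
// Keep only the current and the expected next sound loaded
function enterSection(currentSrc, currentId, nextSrc, nextId) {
  createjs.Sound.registerSound(currentSrc, currentId); // current section's audio
  createjs.Sound.registerSound(nextSrc, nextId);       // smart-preload the next one
  createjs.Sound.play(currentId);
}
function cleanUpSection(previousSrc) {
  // Ask SoundJS to drop the decoded buffer for the section we just left
  createjs.Sound.removeSound(previousSrc);
}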
Another solution would be to reduce the file size through lossy compression which, by the sound of it, has already been attempted.
A third option might be implementing some form of streaming audio, but this is not something SoundJS can help with.
Hope that helps.

Related

ajax website memory usage accumulation / Chrome

I have an ajax/pushstate website which loads data (pages) through ajax. In Chrome (and perhaps other browsers too), I have noticed that memory consumption accumulates when navigating between pages, especially pages with lots of images. You can check the website in question here:
mjau-mjau.com
I know there is a lot of information about general JavaScript memory leakage and memory management, but this issue seems in most respects directly related to pages that contain a lot of images (often large ones). It is as if the image files don't get flushed from memory after their HTML context is replaced when navigating between pages.
Is there anything I might be overlooking? Shouldn't the browser automatically reclaim memory for HTML that has been removed, including images? Could it be related to GPU layers somehow, when transitioning between pages with transforms?
Suggestions are appreciated.
I tracked down this bug and found that it only occurs in the Chrome browser. Furthermore, it seems directly related to lazy-loaded images and ajax pages. Loaded images are not being removed from memory when they are removed from the DOM after an ajax call. I managed to find a hack which somehow clears them from memory before the DOM is updated with new data.
// Unbind handlers, clear src (prompting the browser to release the decoded image data), then remove the nodes
$('#content').find("img").off().attr('src', '').remove();
I am not sure if all of the above are required, but the combo works to a fair degree. The strange thing is, the memory is not reclaimed until another page with images replaces the #content DOM element. Buggy memory management in Chrome!

HTML5 audio - currentTime attribute inaccurate?

I'm playing around a bit with the HTML5 <audio> tag and I noticed some strange behaviour that has to do with the currentTime attribute.
I wanted to have a local audio file played and let the timeupdate event detect when it finishes by comparing the currentTime attribute to the duration attribute.
This actually works pretty well if I let the song play from beginning to end - the end of the song is determined correctly.
However, changing the currentTime manually (either directly through JavaScript or by using the browser's audio controls) results in the API no longer reporting the correct currentTime value; instead it seems to be set some seconds ahead of the position that's actually playing.
(These "some seconds" ahead are based on Chrome; Firefox seems to go completely haywire, which makes the discrepancy far bigger.)
A little jsFiddle example about the problem: http://jsfiddle.net/yp3o8cyw/2/
Can anybody tell me why this happens - or did I just misunderstand what the API is supposed to do?
P.S.: I just noticed this actually only happens with MP3-encoded files; OGG files do just fine.
After hours of battling this mysterious issue, I believe I have figured out what is going on here. This is not a question of .ogg vs. .mp3; this is a question of variable vs. constant bit rate encoding on MP3s (and perhaps other file types).
I cannot take the credit for discovering this, only for scouring the interwebs. Terrill Thompson, a gentleman and scholar, wrote a detailed article about this problem back on February 1st, 2015, which includes the following excerpt:
Variable Bit Rate (VBR) uses an algorithm to efficiently compress the media, varying between low and high bitrates depending on the complexity of the data at a given moment. Constant Bit Rate (CBR), in contrast, compresses the file using the same bit rate throughout. VBR is more efficient than CBR, and can therefore deliver content of comparable quality in a smaller file size, which sounds attractive, yes?
Unfortunately, there's a tradeoff if the media is being streamed (including progressive download), especially if timed text is involved. As I've learned, VBR-encoded MP3 files do not play back with dependable timing if the user scrubs ahead or back.
I'm writing this for anyone else who runs into this syncing problem (which makes precise syncing of audio and text impossible), because if you do, it's a real nightmare to figure out what is going on.
My next step is to do some more testing, and then to figure out an efficient way to convert all my .mp3s to constant bit rate. I'm thinking FFMPEG may be able to help, but I'll explore that in another thread. Thanks also to Loilo for originally posting about this issue and to Brad for the information he shared.
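For reference, a conversion along those lines might look like the following with FFMPEG (the 128k value is just an example; with libmp3lame, a fixed -b:a produces constant bit rate output):
ffmpeg -i input.mp3 -codec:a libmp3lame -b:a 128k output-cbr.mp3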
First off, I'm not actually able to reproduce your problem on my machine, but I only have a short MP3 file handy at the moment, so that might be the issue. In any case, I think I can explain what's going on.
MP3 files (MPEG) are very simple streams and do not have absolute positional data within them. It isn't possible from reading the first part of the file to know at what byte offset some arbitrary frame begins. The media player seeks in the file by needle dropping. That is, it knows the size of the entire track and roughly how far into the track your time offset is. It guesses and begins decoding, picking up as soon as it synchronizes to the next frame header. This is an imprecise process.
Ogg is a more robust container and has time offsets built into its frame headers. Seeking in an Ogg file is much more straightforward.
The other issue is that most browsers that support MP3 do so only because the codec is already available on your system. MP3 and Ogg Vorbis playback are usually handled by completely different libraries with different APIs. While the web standards do a lot to provide a common abstraction, minor implementation details cause quirks like the one you are seeing.

HTML5 LocalStorage seems to become corrupt

I'm using BootUp.js (https://github.com/TradeMe/bootup.js) to load and store CSS and JavaScript files in HTML5 LocalStorage. The site is mobile-focused, so the time saving and speed boost this creates is great! However, I've noticed the odd occasion where the CSS (I've never seen it with JS) becomes corrupt in the storage, so the site renders horribly until the storage is cleared and the CSS files are refetched from the server.
I've seen this happen very sporadically in Safari on an iPhone 4 (iOS 6), Chrome on a Galaxy S3 and Chrome on a Nexus 7 - so it doesn't seem to be limited to any particular device, browser or OS. Is this an issue anyone has come across before? Is it possible that the data has just somehow become corrupt? Are there any known issues with WebKit (I guess) that could cause it?
I'm planning to implement a workaround by storing some kind of checksum, generated in JS, that can be used to verify the data is fully there. If not, clear it out and refetch from the server.
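As a rough sketch of that checksum idea (the djb2-style hash and the storage keys are made up for illustration; this only guards against truncation or corruption, not tampering):
function checksum(str) {
  // djb2-style hash - cheap, and good enough to detect truncated/corrupt data
  var h = 5381;
  for (var i = 0; i < str.length; i++) {
    h = ((h << 5) + h + str.charCodeAt(i)) | 0;
  }
  return String(h);
}
function saveCss(css) {
  localStorage.setItem('site-css', css);
  localStorage.setItem('site-css-hash', checksum(css));
}
function loadCss() {
  var css = localStorage.getItem('site-css');
  if (css !== null && localStorage.getItem('site-css-hash') === checksum(css)) {
    return css; // data verified intact
  }
  // Checksum missing or mismatched: clear it and signal the caller to refetch
  localStorage.removeItem('site-css');
  localStorage.removeItem('site-css-hash');
  return null;
}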
I'd first use this:
http://arty.name/localstorage.html
Mobile browsers tend to cut back on storage space due to obvious memory limitations. Your CSS and JavaScript might be too big, even when minified.
The other thing I can think of for this behaviour is that localStorage might become corrupt when a save is started and the page is refreshed at the same time. I'm not familiar with the exact inner workings of browsers, but I'm guessing they might stop a save in the middle.
Also, have a look here:
http://hacks.mozilla.org/2012/03/there-is-no-simple-solution-for-local-storage/

Javascript memory and leak problems

My site is a pretty standard ecom site; it isn't a JS-backed standalone app or anything, it's just a site which uses JS for standard stuff, as well as some jQuery plugins to do a few things.
I'm trying to do some JS memory evaluation on my site. I've done this by looking at the Chrome Task Manager and through Heap Snapshots.
Initially, my site on first load sits between 35MB (i.e. 35,000K) and 40MB in the task manager. This is the largest of any tab if I have several tabs of other websites open at the same time.
If I refresh the page it jumps up to 55-60MB; another refresh sees it jump to 65-70MB.
On a normal page in a workflow, it fluctuates between 45-65MB (sometimes 75MB depending on what you're doing). Clicking around and doing the workflow from page to page sees the memory jump to 85-100MB, and it increases as you continue through the site.
I've tried to do a few things like check for:
detached nodes
heap snapshots & looking at the deltas
amix's MemoryLeakChecker checking size of objects
I'd need a deeper dive to look for circular references or closure problems.
Heap snapshots don't reveal much; most of the top entries are (array), (string), (system). The snapshots come in at 4.8MB, 5.1MB, 5.8MB, 6.8MB and keep increasing.
I've got a few questions as a result:
How do I reconcile the different metrics between snapshot memory and task manager memory?
Are there any good tutorials (apart from the ones on the Google Developers site)?
How much memory is considered acceptable, given that in the task manager my site is always the highest?
Do I have a memory leak? Apart from the steps I've described above (which haven't turned up anything concrete), are there any other ways I can find leaks?
Can you suggest any tools apart from the Chrome Dev Tools? (A lot of the tools mentioned on Google for Firefox are not compatible with the latest version, e.g. Leak Monitor for FF.)
As a side note, most of my functions are low-key operations that don't exceed 200ms (based on a CPU profile). What is a good benchmark I should be aiming for? Is 200ms high?
What you are describing is not a memory leak; it's garbage that Chrome knows about and will remove whenever it decides it's time to do so. To explain this, let's have a closer look at the scenario you have described.
Making memory 'leak'
First, let's open up a new incognito window (just to be sure that browser extensions are not affecting our results) and navigate to google.com.
Then, let's open the Task Manager and enable the "JavaScript Memory" column (by right-clicking on the Task Manager window). We need this column to be sure that the memory we will be 'leaking' is, in fact, being allocated by JavaScript. We end up with something like this:
Now, as you suggested, we should reload the page a couple of times and observe the memory of our tab going up:
So far, so good - everything works exactly as you described it.
Wait a second...
However, leave your cursor inactive for half a minute, or go to another tab, and you will observe a huge memory usage drop on our 'Tab: google'. Why is that? What happened there? Who cleaned up our 'leaked' memory for us?
The Memory Usage Drop
To investigate that, let's repeat what we have done so far, so that 'Tab: google' uses a lot of memory again. Then, let's open Chrome Developer Tools and start recording on the 'Timeline' tab. After that, let's switch tabs for a couple of seconds and, when the memory drops, stop recording on the 'Timeline'. You should end up with this:
In the last couple of seconds of our recording, mysterious 'GC Events' appeared - at exactly the same time the memory was released. Coincidence? Nope.
GC Events
GC stands for Garbage Collector. It's a mechanism that "attempts to reclaim garbage, or memory occupied by objects that are no longer in use by the program". So it turns out that the memory of our tab was polluted by garbage, and the GC was capable of getting rid of it the whole time (you can even force garbage collection using the button at the bottom of the 'Timeline' tab). So why did it decide not to? Why did it wait for us to stop interacting with the page, or to change tabs?
Lazy Garbage Collector
The short answer is that garbage collection has to 'freeze' the execution of all scripts before any work can be done. It can also take a significant amount of CPU time to execute, which can result in lag, choppy animations, unresponsive controls, etc. That's why Chrome waits for the right moment to run the garbage collection. And the best moment to do it is when the user is not looking.
In addition, please note that 'GC Events' come in series; there are always a couple of them with short breaks in between. These breaks are meant for 'normal' JavaScript to execute, making the garbage collection less noticeable.
Live Objects
Take another look at the "JavaScript Memory" column in the top two screenshots in this post. You will notice that this column contains two numbers. The first one is the memory "reserved for JavaScript VM heap"; the other one is "how much memory live (reachable) objects comprise" (source). When benchmarking your applications you should worry only about the second value; all the rest will be handled by the GC.
An example of a leak
A real JavaScript leak can happen, e.g., in a web chat application. If, over time, it uses more and more 'live' memory while only ever displaying the last 10 messages, then we can talk about a leak. Such a leak will eventually crash the tab (or the browser).
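A contrived sketch of that kind of leak, for illustration (the names and element id are made up):
var allMessages = []; // grows without bound - this array is the leak
function onChatMessage(msg) {
  allMessages.push(msg); // old messages are never released
  // The UI only ever shows the last 10 messages...
  var last10 = allMessages.slice(-10);
  document.getElementById('chat').innerHTML = last10
    .map(function (m) { return '<p>' + m + '</p>'; })
    .join('');
  // ...but 'live' (reachable) memory keeps growing with every message.
}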
Conclusion
For scripts running on the page, reloading the page (or navigating to another location) is the equivalent of restarting your computer while your ANSI C app is running. After that, you should think of all the memory allocated by your scripts as wiped out. The only reason why, in practice, this may not happen immediately after reloading the page is that the browser is waiting for the right moment to clean up. And you, as a web developer, should not be concerned about it.
If you still think that your page is leaking, you can use the answer from this question to track down the leaked objects.

jQuery or javascript to find memory usage of page

Is there a way to find out how much memory is being used by a web page, or by my jquery application?
Here's my situation:
I'm building a data-heavy webapp using a jQuery frontend and a RESTful backend that serves data in JSON. The page is loaded once, and then everything happens via ajax.
The UI provides users with a way to create multiple tabs within the UI, and each tab can contain lots and lots of data. I'm considering limiting the number of tabs they can create, but was thinking it would be nice to only limit them once memory usage has gone above a certain threshold.
Based on the answers, I'd like to make some clarifications:
I'm looking for a runtime solution (not just developer tools), so that my application can determine actions based on memory usage in a user's browser.
Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures.
2015 Update
Back in 2012 this wasn't possible if you wanted to support all major browsers in use. Unfortunately, this is still a Chrome-only feature (a non-standard extension of window.performance).
window.performance.memory
Browser support: Chrome 6+
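A quick illustration of reading it (Chrome only; the values are in bytes, and Chrome quantizes them unless launched with --enable-precise-memory-info):
if (window.performance && performance.memory) {
  console.log('used JS heap:  ' + performance.memory.usedJSHeapSize);
  console.log('total JS heap: ' + performance.memory.totalJSHeapSize);
  console.log('JS heap limit: ' + performance.memory.jsHeapSizeLimit);
}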
2012 Answer
Is there a way to find out how much memory is being used by a web page, or by my jquery application? I'm looking for a runtime solution (not just developer tools), so that my application can determine actions based on memory usage in a user's browser.
The simple but correct answer is no. Not all browsers expose such data to you, and I think you should drop the idea, simply because the complexity and inaccuracy of a "handmade" solution may introduce more problems than it solves.
Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures.
If you really want to stick with your idea you should separate fixed and dynamic content.
Fixed content is not dependent on user actions (memory used by script files, plugins, etc.)
Everything else is considered dynamic and should be your main focus when determining your limit.
But there is no easy way to summarize them. You could implement a tracking system that gathers all this information; all operations would call the appropriate tracking methods, e.g.:
Wrap or overwrite jQuery.data method to inform the tracking system about your data allocations.
Wrap HTML manipulations so that adding or removing content is also tracked (innerHTML.length is the best estimate).
If you keep large in-memory objects they should also be monitored.
As for event binding, you should use event delegation, and then it can also be considered a somewhat fixed factor; a rough sketch of the tracking idea follows.
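As a very rough sketch of such a wrapper around jQuery's data method (the tracker object and the JSON.stringify size estimate are made up for illustration):
var tracker = { bytes: 0 }; // hypothetical running total for dynamic allocations
var originalData = jQuery.fn.data;
jQuery.fn.data = function (key, value) {
  if (arguments.length >= 2 && value !== undefined) {
    try {
      // Very rough size estimate of the stored value
      tracker.bytes += JSON.stringify(value).length;
    } catch (e) {
      // Circular structures etc. - skip the estimate
    }
  }
  return originalData.apply(this, arguments);
};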
Another aspect that makes it hard to estimate your memory requirements correctly is that different browsers may allocate memory differently (for JavaScript objects and DOM elements).
You can use the Navigation Timing API.
Navigation Timing is a JavaScript API for accurately measuring performance on the web. The API provides a simple way to get accurate and detailed timing statistics—natively—for page navigation and load events.
window.performance.memory gives access to JavaScript memory usage data.
Recommended reading
Measuring page load speed with Navigation Timing
This question is 5 years old, and both JavaScript and browsers have evolved incredibly in that time. Since this is now possible (in at least some browsers), and since this question is the first result when you Google "javascript show memory usage", I thought I'd offer a modern solution.
memory-stats.js: https://github.com/paulirish/memory-stats.js/tree/master
This script (which you can run at any time on any page) will display the current memory usage of the page:
// Inject memory-stats.js via its hosted bookmarklet script
var script = document.createElement('script');
script.src = 'https://rawgit.com/paulirish/memory-stats.js/master/bookmarklet.js';
document.head.appendChild(script);
I don't know of any way to find out how much memory is actually being used by the browser, but you might be able to use a heuristic based on the number of elements on the page. Using jQuery, $('*').length will give you a count of the DOM elements. Honestly, though, it's probably easier just to do some usability testing and come up with a fixed number of tabs to support.
Use the Chrome Heap Snapshot tool
There's also a Firebug tool called MemoryBug, but it seems it's not very mature yet.
If you just want to check while testing, Chrome's developer tools offer a way to track memory use, but I'm not sure how to do it directly in JavaScript.
I would like to suggest an entirely different solution from the other answers, namely to observe the speed of your application and, once it drops below defined levels, either show tips to the user to close tabs, or disable new tabs from opening. A simple class which provides this kind of information is, for example, https://github.com/mrdoob/stats.js.
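For example, with a recent build of stats.js the typical setup looks roughly like this (panel 0 is FPS; panel 2 is a Chrome-only memory panel):
var stats = new Stats();
stats.showPanel(0); // 0: fps, 1: ms per frame, 2: MB (Chrome only)
document.body.appendChild(stats.dom);
function animate() {
  stats.begin();
  // per-frame application work goes here
  stats.end();
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);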
Aside from that, it might not be wise for such an intensive application to keep all tabs in memory in the first place. E.g. keeping only the user state (scroll position) for all but the last two tabs, and reloading the data when those tabs are reopened, might be a safer option.
Lastly, WebKit developers have been discussing adding memory information to JavaScript, but they have gotten into a number of arguments about what should and should not be exposed. Either way, it's not unlikely that this kind of information will be available in a few years (although that information isn't too useful right now).
Perfect question timing with me starting on a similar project!
There is no accurate way of monitoring JS memory usage in-app, since it would require higher-level privileges. As mentioned in the comments, checking the number of all elements etc. would be a waste of time, since it ignores bound events etc.
This would be an architecture issue if memory leaks manifest or unused elements persist. Making sure that closed tabs' content is deleted completely, without lingering event handlers etc., would be ideal; assuming that's done, you could just simulate heavy usage in a browser and extrapolate the results from memory monitoring (type about:memory in the address bar).
Protip: if you open the same page in IE, FF, Safari... and Chrome, and then navigate to about:memory in Chrome, it will report memory usage across all the other browsers. Neat!
What you might want to do is have the server keep track of the bandwidth for that session (how many bytes of data have been sent to the client). When they go over the limit, instead of sending data via ajax, the server should send an error code which the JavaScript can use to tell the user they've used too much data.
You can get document.documentElement.innerHTML and check its length. That gives you the size of your page's markup in characters - a rough proxy for memory at best.
This may not work in all browsers, so you can enclose all your body elements in a giant div and call innerHTML on that div. Something like <body><div id="giantDiv">...</div></body>
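A minimal sketch of that heuristic, assuming the giantDiv wrapper described above:
// Length of the wrapper's markup in characters - a rough proxy, not true memory usage
var approxSize = document.getElementById('giantDiv').innerHTML.length;
console.log('Approximate page markup size: ' + approxSize + ' characters');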
