Memory leak in canvas with getImageData and/or putImageData - JavaScript

I am creating a realtime video editor that allows you to add certain video effects using the canvas. You'll notice in my code, I actually have two canvas elements. The process is:
The <video> begins to play, hidden using CSS.
The video frames are drawn onto the visible <canvas>.
To add the realtime effects, I need another, hidden <canvas> that processes each video frame and then returns the processed frame to the visible <canvas>. This is where I use getImageData() and putImageData().
I am seeing a memory leak in Chrome right now. I have found many threads discussing this, but I haven't been able to find a solution.
I removed the "video effects" code for testing purposes and for the fiddle. It still memory leaks without the effect processing.
FIDDLE: http://jsfiddle.net/o8z4ocLd/
Watch the memory usage in Chrome's Task Manager (hamburger menu -> More Tools -> Task Manager).
You will notice the memory climb and then fall back down, only to rise higher than the last time. I have watched it climb well into 1+GB of memory usage to eventually crash the tab. I have tried different variations of setTimeout and requestAnimationFrame. All seem to result in a memory leak.
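For reference, a minimal sketch of the two-canvas pipeline described above (the element ids and the invert effect here are illustrative assumptions, not the asker's actual fiddle code):

```javascript
// Pure pixel effect: invert RGB, leave alpha alone.
function invertPixels(data) {
  for (var i = 0; i < data.length; i += 4) {
    data[i] = 255 - data[i];         // R
    data[i + 1] = 255 - data[i + 1]; // G
    data[i + 2] = 255 - data[i + 2]; // B
  }
  return data;
}

// DOM wiring, guarded so the pure helper above can run anywhere.
if (typeof document !== "undefined") {
  var video = document.getElementById("source-video");   // hypothetical ids
  var hidden = document.getElementById("hidden-canvas");
  var visible = document.getElementById("visible-canvas");
  var hctx = hidden.getContext("2d");
  var vctx = visible.getContext("2d");

  function drawFrame() {
    // Draw the current video frame to the hidden canvas, process its
    // pixels, then push the result to the visible canvas.
    hctx.drawImage(video, 0, 0, hidden.width, hidden.height);
    var frame = hctx.getImageData(0, 0, hidden.width, hidden.height);
    invertPixels(frame.data);
    vctx.putImageData(frame, 0, 0);
    requestAnimationFrame(drawFrame);
  }
  requestAnimationFrame(drawFrame);
}
```

Note that each getImageData() call allocates a fresh ImageData buffer, which is exactly the allocation pressure the question is about.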
Edit:
One thing worth mentioning is that this seems to happen only in Chrome. Firefox seems to handle the garbage collection better; I haven't seen Firefox go above 500 MB.

So my Chrome updated itself to version 45 this morning. Version 45 came out September 1st, 2015, and this issue appears to be fixed! I am no longer seeing a memory leak. This update has also fixed another issue I was having in this thread:
Canvas is stretching using drawImage

Related

Chrome tab crashes when loading a lot of images in Javascript

I have a Javascript image sequence object that uses one <canvas> tag in the DOM, calling clearRect and drawImage quickly to play the sequence. There are 3 different sequences consisting of 1,440 images each, only one sequence needs to be loaded at a time, but having them all queued up will make the experience faster and smoother.
The images are pretty big in dimension, 8680x1920 each, about 1.5 MB each as JPG. I have buttons that load each set individually instead of all at once. Everything is fine loading the first sequence set, but the second one crashes (the "Aw, Snap!" page) in Chrome 51 on Windows 7 Business.
Dev is happening on my Mac Pro and works perfectly, letting me load all 3 sequences just fine. The specs of my Mac Pro are far lower than the PC. The PC is an i7 quad core, 32gb RAM, 2x M5000 Nvidia Quadro cards with a Sync card. My understanding is that Chrome isn't even utilizing most of those advanced pieces of hardware, but we need them for other parts.
I have tried setting the existing image objects to an empty source and then setting them to null before loading in the next sequence, and I have also tried removing the <canvas> tag from the DOM, but nothing seems to help. I also find that watching Chrome's Network tab shows the crashes always happen just after 1.5 GB has been transferred. Chrome's Task Manager has the tab hovering around 8 GB of memory usage on both Windows and Mac with one sequence loaded.
This is an obscure, one-off installation that will be disconnected from the internet, so I'm not concerned so much about security concerns or best practices, just getting it to work through any means necessary.
UPDATED to reflect that I had recently changed the <img> tag to a <canvas> tag for performance reasons
You should not be loading the entire sequence at once. You're most likely running out of RAM. Load only a few frames ahead in memory using JavaScript, then assign each image to your image tag. Be sure to clear that look-ahead cache by overwriting the variables or using the delete operator.
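A minimal sketch of that look-ahead idea, assuming a hypothetical "frame0.jpg", "frame1.jpg", ... naming scheme and a playback index:

```javascript
// Pure helper: which frame indices should stay in memory right now?
function framesToKeep(current, ahead, total) {
  var keep = [];
  for (var i = current; i <= current + ahead && i < total; i++) {
    keep.push(i);
  }
  return keep;
}

// Browser-only wiring: keep the cache in sync with the playback position.
if (typeof Image !== "undefined") {
  var cache = {};
  function syncCache(current, ahead, total) {
    var wanted = framesToKeep(current, ahead, total);
    // Drop frames outside the window so the GC can reclaim them.
    Object.keys(cache).forEach(function (k) {
      if (wanted.indexOf(Number(k)) === -1) delete cache[k];
    });
    // Start loading frames inside the window that aren't cached yet.
    wanted.forEach(function (i) {
      if (!cache[i]) {
        var img = new Image();
        img.src = "frame" + i + ".jpg"; // hypothetical naming scheme
        cache[i] = img;
      }
    });
  }
}
```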
Secondly, changing the src attribute can cause the entire DOM to redraw. When the src attribute changes, the browser assumes the image may have changed size, so every element after it might have shifted and needs redrawing.
It's a better idea to set the image as the background of a <div> and update the background-image style. You can also write the image to a <canvas>. In both cases only one element needs redrawing.
Finally, a <video> tag would probably be your best option, since it's designed to handle frame sequences efficiently. To make it possible to scrub to individual frames without lag, you can either encode with a keyframe on every frame, or simply encode the video in an uncompressed format that doesn't use keyframes. A keyframe is like a snapshot at a particular interval in a video; all subsequent frames only redraw the parts that have changed since the last keyframe. So if keyframes are far apart, seeking to a particular frame requires that the keyframe be rendered, then all the frames in between be applied to it to get the final image of the frame you're on. Putting a keyframe on every frame makes the video larger, since it can't use the differential compression, but it will seek much faster.
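If you go the <video> route, seeking to an exact frame is just a currentTime assignment; a tiny sketch (the 30 fps figure is an assumption, use your sequence's actual rate):

```javascript
// Pure helper: convert a frame index to a timestamp in seconds.
function frameToTime(frame, fps) {
  return frame / fps;
}

// Browser-only: seek a <video> element to a given frame.
if (typeof document !== "undefined") {
  var vid = document.querySelector("video");
  function seekToFrame(frame) {
    vid.currentTime = frameToTime(frame, 30); // assuming 30 fps
  }
}
```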

What's the workflow for determining if a bug is the browser's fault, or my own?

I'm working on a project right now and I've written some code using Pixi.js that produces strange results in Google Chrome. Specifically, drawing a sprite with a texture seems to be creating a strange issue where multiple loaded textures are drawn on top of each other, when only one was requested. (e.g. I say "load a cat, load a dog, draw a cat" and for some reason I see a cat on top of a dog.)
I don't see this issue in Firefox or in Safari. I'm not sure if this is my own bug, a bug in Pixi.js, or a bug in the browser. It doesn't really matter, because that's not really what this question is about; I'm just telling this story for context.
My question is: what is the general workflow for determining whether or not a bug is my own, or a problem with the browser? Is there some standard process for debugging browsers?
I'm not sure if there's a standard process, but from my experience with PIXI, I've found that when the bug is in my code it generally shows up in all the browsers.
Browsers often have differences in HTML/CSS, but they seem to display the canvas the same. So if the issue is with the general layout of the entire canvas or other DOM elements, I would assume it's a browser issue.
But if the problem is with rendering a PIXI component on the stage, it is more likely either a PIXI bug or a bug in your code. Keep in mind PIXI.js renders using WebGL if available, otherwise it falls back to the HTML5 canvas. So I would check that first by turning WebGL on and off within the same browser and seeing if it makes a difference.
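One way to A/B the two renderers, assuming the PIXI v3-era API where autoDetectRenderer picks WebGL when available and CanvasRenderer forces the canvas path (the query-string toggle is my own convention):

```javascript
// Pure helper: decide which renderer to request.
function pickRenderer(forceCanvas, webglAvailable) {
  if (forceCanvas || !webglAvailable) return "canvas";
  return "webgl";
}

// Browser-only wiring, guarded so the helper above runs anywhere.
if (typeof PIXI !== "undefined") {
  // e.g. load the page as index.html?force-canvas to test the fallback
  var forceCanvas = /force-canvas/.test(location.search);
  var renderer = forceCanvas
    ? new PIXI.CanvasRenderer(800, 600)   // canvas fallback path
    : PIXI.autoDetectRenderer(800, 600);  // WebGL if available
  document.body.appendChild(renderer.view);
}
```

If the dog-under-cat artifact only appears on one of the two paths, you've narrowed the bug down considerably.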
If you're curious to know why your dog didn't render correctly, go ahead and post some code :)

Chrome canvas becomes blank after waking up from standby mode

Our application downloads about 15 MB of images and displays them in an HTML canvas. We are doing a bit of stress testing and have found that with about 10 tabs open, if we put the computer into sleep mode, the canvas is blank when it comes back: it just shows plain white (this doesn't happen every time, but very frequently).
We hold the images in JavaScript Image objects, and I have inspected the memory in those and they still appear to be valid. I've tried to use Chrome's memory analysis by taking a snapshot before and after the error occurs; in some cases less memory is being used, in other cases more, so that didn't seem to tell me much.
I am curious if anyone has seen this before, and even if not, does anyone have pointers about debugging something similar. It would be perfectly sufficient if there was a way for us to determine if the error had occurred so we could trigger a reload of the images, but I'm afraid until I figure out what is causing it, I won't even know what to try and inspect.
@rtpg - the draw loop was continuing to run, but it would not display anything.
For some reason the canvas would no longer update. I was unable to determine when the problem occurred, but I did figure out how to get the canvas to start displaying the images again. It required resizing the canvas (I change the width by one pixel) and then redrawing (it's embarrassingly hackish). I currently have it set to run every 30 seconds through setTimeout, but will probably change it to window.onfocus once I can verify that gets called when coming out of sleep mode.
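A sketch of that workaround (the canvas id and the redraw() function are placeholders for the app's own code):

```javascript
// Pure helper: alternate the width by one pixel so the browser treats
// the canvas backing store as changed.
function nudgedWidth(width, flip) {
  return flip ? width + 1 : width - 1;
}

// Browser-only wiring: nudge and redraw when the window regains focus.
if (typeof document !== "undefined") {
  var canvas = document.getElementById("main-canvas"); // hypothetical id
  var flip = false;
  window.onfocus = function () {
    flip = !flip;
    canvas.width = nudgedWidth(canvas.width, flip);
    redraw(); // re-renders all images; defined elsewhere in the app
  };
}
```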
There are major canvas issues in Chrome 29.
You may want to check out and star this issue:
https://code.google.com/p/chromium/issues/detail?id=280153
(This stress test is failing too and may be related)

HTML5 canvas performance on small vs. large files

I seem to be experiencing varying performance using an HTML5 canvas based on the memory footprint of the page... perhaps the number of images (off-screen canvases) that are loaded. How do I best locate the source of the performance problem? Or does anyone know whether there is in fact a performance issue when a lot of data is loaded, even if it isn't all being used at once?
Here's an example of good performance. I have a relatively simple map. It's between 700 and 800 KB. You can drag to scroll around this map relatively smoothly.
There's another file (which you may not want to look at due to its large size).
It's about 16 MB and contains dozens, maybe on the order of a hundred images/canvases. It draws a smaller view so it should go faster. But it doesn't. Many of the maps lag quite severely compared to the simpler demo.
I could try to instrument the code to start timing things, but I have not done this in JavaScript before, and could use some help. If there are easier ways to locate the source of performance problems, I'd be interested.
In Google Chrome and Chromium, you can open the developer tools (tools->developer tools) and then click on "Profiles". Press the circle at the bottom, let the canvas redraw and then click on the circle again. This gives you a profile that shows how much time was spent where.
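If you'd rather instrument by hand than use the profiler, a tiny timing wrapper is enough to bracket suspect draw calls; this sketch falls back to Date.now where performance.now is unavailable:

```javascript
// Pick the highest-resolution clock available.
var now = (typeof performance !== "undefined" && performance.now)
  ? function () { return performance.now(); }
  : function () { return Date.now(); };

// Run fn, report how long it took, and pass its result through.
function timeIt(label, fn) {
  var start = now();
  var result = fn();
  var ms = now() - start;
  console.log(label + ": " + ms.toFixed(1) + " ms");
  return { result: result, ms: ms };
}
```

Usage would look like `timeIt("draw map", function () { drawMap(ctx); })` around each candidate hot spot.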
I've been working on some complex canvas stuff where rendering performance mattered to me.
I wrote some test cases in jsperf and came to the conclusion that a rule of thumb is that a source offscreen canvas should never be more than 65536 pixels.
I haven't yet come to a conclusion about why this is, but likely a data structure or data type has to be changed when dealing with large source canvases.
putImageData showed similar results.
destination canvas size didn't seem to matter.
Here are some tests I wrote that explore this performance limitation:
http://jsperf.com/magic-canvas/2
http://jsperf.com/pixel-count-matters/2
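A small helper can check a source canvas against that 65536-pixel rule of thumb; note the threshold is the answer's empirical number, not a documented limit:

```javascript
// Empirical limit from the jsperf tests above, not a spec value.
var MAX_SOURCE_PIXELS = 65536;

// Does a source canvas of this size fall over the rule of thumb?
function exceedsRuleOfThumb(width, height) {
  return width * height > MAX_SOURCE_PIXELS;
}

// How many horizontal strips would keep each source under the limit,
// if you split a big off-screen canvas before blitting from it?
function stripCount(width, height) {
  var rowsPerStrip = Math.max(1, Math.floor(MAX_SOURCE_PIXELS / width));
  return Math.ceil(height / rowsPerStrip);
}
```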

Unloading Resources on HTML with JavaScript

I'm working on a HTML 5 game, it is already online, but it's currently small and everything is okay.
Thing is, as it grows, it's going to be loading many, many images, music, sound effects and more. After 15 minutes of playing the game, at least 100 different resources might have been loaded already. Since it's an HTML5 App, it never refreshes the page during the game, so they all stack in the background.
I've noticed that every resource I load (on WebKit at least, using the Web Inspector) remains there once I remove the <img>, the <link> to the CSS, and so on. I'm guessing it's still in memory, just not being used, right?
This would eventually consume a lot of RAM and lead to degraded performance, especially on iOS and Android mobiles (which I already notice slightly in the current version), whose resources are more limited than desktop computers'.
My question is: Is it possible to fully unload a Resource, freeing space in the RAM, through JavaScript? Without having to refresh the whole page to "clean it".
Worst scenario: Would using frames help, by deleting a frame, to free that frame's resources?
Thank you!
Your description implies you have fully removed all references to the resources. The behavior you are seeing, then, is simply the garbage collector not yet having been invoked to reclaim the space, which is common in JavaScript implementations until it becomes "necessary". Setting the variable to null or calling delete will usually do no better.
As a common case, you can typically call CollectGarbage() (an IE-specific function) during scene loads/unloads to force the collection process. This is typically the best place to do it when data is loaded for game "stages", since loading screens are not time-critical. You usually do not want the collector to run during gameplay unless the game is not very real-time.
Frames are usually a difficult solution if you want to keep certain resources around for common game controls. You need to consider whether you are refreshing entire resources or just certain resources.
All you can do is rely on JavaScript's built-in garbage collection mechanism.
This kicks in whenever there is no reference to your image.
So, assuming you keep a reference to each image, you can use:
img.destroy() (library-specific; not a standard DOM method)
or
img.parentNode.removeChild(img)
Worth checking out: http://www.ibm.com/developerworks/web/library/wa-memleak/
Also: Need help using this function to destroy an item on canvas using javascript
EDIT
Here is some code that allows you to load an image into a var.
<script>
var heavyImage = new Image();
heavyImage.src = "heavyimagefile.jpg";
// ...
heavyImage = null; // removes the reference so the image can be collected
</script>
This is better than using jQuery's .load() because it gives you more control over image references, and the images will be removed from memory once the reference is gone (null).
Taken from: http://www.techrepublic.com/article/preloading-and-the-javascript-image-object/5214317
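The same pattern, wrapped in a tiny cache so releasing a reference is explicit (the cache shape and names here are made up for illustration):

```javascript
// Minimal preload cache: hold Image references by key, drop them on demand.
function ImageCache() {
  this.images = {};
}
ImageCache.prototype.preload = function (key, url) {
  // Plain object stand-in lets the logic run outside a browser too.
  var img = (typeof Image !== "undefined") ? new Image() : { src: "" };
  img.src = url;
  this.images[key] = img;
  return img;
};
ImageCache.prototype.release = function (key) {
  // Dropping the reference lets the GC reclaim the decoded image.
  delete this.images[key];
};
ImageCache.prototype.has = function (key) {
  return Object.prototype.hasOwnProperty.call(this.images, key);
};
```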
Hope it helps!
There are 2 better ways to load images besides a normal <img> tag, which Google brilliantly discusses here:
http://www.youtube.com/watch?v=7pCh62wr6m0&list=UU_x5XG1OV2P6uZZ5FSM9Ttw&index=74
Loading the images through an HTML5 <canvas>, which is much faster. I would really watch that video and implement these methods for more speed. I would imagine garbage collection with canvas works better because it's decoupled from the DOM.
Embedded data URLs, where the src attribute of an image tag is the actual binary data of the image (yeah, it's a giant string). It starts like this: src="data:image/jpeg;base64,/9j/MASSIVE-STRING ... ". After using this, you would of course want to remove the node as discussed in the other answers. (I don't know how to generate this base64 string; try Google or the video.)
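One way to generate the base64 data URL mentioned above is to draw the image onto a canvas and call toDataURL (this requires the image to be same-origin; the helper names are illustrative):

```javascript
// Pure helper: build the data-URL header for a given MIME type.
function dataUrlPrefix(mime) {
  return "data:" + mime + ";base64,";
}

// Browser-only: render an already-loaded image to a canvas and encode it.
if (typeof document !== "undefined") {
  function imageToDataUrl(img, mime) {
    var canvas = document.createElement("canvas");
    canvas.width = img.naturalWidth;
    canvas.height = img.naturalHeight;
    canvas.getContext("2d").drawImage(img, 0, 0);
    return canvas.toDataURL(mime || "image/jpeg");
  }
}
```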
You said: "Worst scenario: Would using frames help, by deleting a frame, to free those frames' resources?"
Frames can work for this. Yes, deleting a frame does free up that frame's resources.
All right, so I've made my tests by loading 3 different HTML pages into an <article> tag. Each page had many huge images, roughly 15 per "page".
So I used jQuery's .load() function to insert each one into the tag. I also had an extra HTML page with only an <h1>, to see what happened when a page with no images replaced the previous page.
Well, it turns out the RAM usage grows as you start scrolling, and shoots up when going through a particularly big image (big in dimensions, not just file size). But once you leave that behind and lighter images come on screen, the RAM consumption actually goes down. And whenever I replaced the page content using JS, the RAM consumption dropped sharply when it had gotten too high. Virtual memory remained high throughout and rarely went down.
So I guess the browser is quite smart about handling Resources. It does not seem to unload them if you leave it there for a long while, but as soon you start loading other pages or scrolling, it starts loading / freeing up.
I guess I don't have anything to worry about after all...
Thanks everyone! =)
