This question already has an answer here: Memory leaks when manipulating images in Chrome (1 answer). Closed 6 years ago.
I think I have a problem related to:
Systematically updating src of IMG. Memory leak
I don't have enough rep to comment on answers, but https://stackoverflow.com/a/34085389/3270244 describes exactly my case.
var canvasElement = $('canvas', camContainer);
var ctx = canvasElement[0].getContext('2d');
var image = new Image();

image.onload = function() {
    ctx.drawImage(this, 0, 0);
    image.src = '';
};

// for every getCamImg I receive exactly 1 image
socket.on('getCamImg', function(err, data) {
    if (data) {
        var dataImg = data.substring(data.indexOf(';') + 1);
        image.src = dataImg;
    }
    socket.emit('getCamImg');
});

socket.emit('getCamImg');
I change img.src every 100ms (JPEGs from a camera) and I can watch the browsers consume more and more memory: Firefox levels off around 500MB, Edge around 100MB, and for Chrome I stopped testing near 1GB. If I remove the img.src assignment, everything runs smoothly (without an image, of course).
I found a lot of (at least I think so) related issues:
https://bugs.chromium.org/p/chromium/issues/detail?id=36142
https://bugs.chromium.org/p/chromium/issues/detail?id=337425
memory leak while drawing many new images to canvas
Memory leaks when manipulating images in Chrome
Somewhere someone mentioned (sorry for the vague reference :D) that maybe the cache is getting spammed because the old images are kept around. I don't think it's a GC problem, because Chrome has a tool to force a collection and nothing changes.
Can someone reproduce this or point me in the right direction?
Update:
socket.on('getCamImg', function(err, data) {
    if (data) {
        var dataImg = data.substring(data.indexOf(';') + 1);
        var image = document.createElement("img");
        image.onload = function() {
            ctx.drawImage(this, 0, 0, ctx.canvas.width, ctx.canvas.height);
            socket.emit('getCamImg');
            image.src = '';
        };
        image.src = dataImg;
    }
});
This works well in Firefox (the image.src = '' is important). Chrome still leaks.
I'm doing nearly the same thing in my current project. As this is too much for a comment, I'll just share my observations in an answer. This is how I do it:
var canvas = document.getElementById("canvas"),
    ctx = canvas.getContext("2d"),
    onNewImgData = function (data) {
        // create a new image element for every base64 src
        var img = document.createElement("img");
        // bind the onload event handler to the image
        img.onload = function () {
            // draw the image on the canvas
            ctx.drawImage(this, 0, 0);
            // do some stuff
            // ...
            // done, request next image
        };
        // update the image source with given base64 data
        img.src = "data:image/bmp;base64," + data;
    };
I don't clean up anything and I can't see a memory leak in my application (no matter which browser). But I did have a memory leak before, when I was logging all the base64 data to the browser console :) That caused the exact same issue you've described.
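If you ever do want explicit cleanup rather than relying on GC, one possible alternative (a sketch, not part of my setup: it assumes the frames arrive as Blobs and that the browser supports createImageBitmap) is to decode each frame into an ImageBitmap, which gives you an explicit close():

```javascript
// Sketch: decode a frame into an ImageBitmap and release it explicitly.
// "blob" is assumed to be the raw image data (e.g. from a fetch or socket).
function drawFrame(ctx, blob) {
    return createImageBitmap(blob).then(function (bitmap) {
        ctx.drawImage(bitmap, 0, 0);
        bitmap.close(); // frees the decoded pixel data immediately
    });
}
```

That way the lifetime of the decoded pixels is under your control instead of the garbage collector's.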
Related
I have a problem when I load local (or online) images. I use an HTML5 canvas and draw a new image on it, but the image never loads the first time; I need to refresh the page once. On the first visit the images are loaded, but not in time, so they are not shown. When I refresh, the images are already in my browser's cache (Firefox 50), so it works.
I saw on a forum that I need to wait with:
myimage.onload = function() {
    context.drawImage(myimage, 0, 0, myimage_size, myimage_size);
};
myimage.src = mysrc;
But this doesn't work either; the images are not shown. So I tried another way:
var elementHTML = document.createElement('img');
var canvas = document.createElement('canvas');
canvas.width = my_size;
canvas.height = my_size;
elementHTML.appendChild(canvas);
context = canvas.getContext('2d');

base_image = new Image();
base_image.src = base_image_src;
base_image.addEventListener('load', test(this.owner));

function test(owner) {
    // here is the trick: with this, everything works, but the program needs to work without it!
    while (window.alert("done")) {}
    context.drawImage(base_image, 0, 0, my_size, my_size);
    if (owner == "1") {
        // I modified some colors of my image
        recolorRGBImage(context, lineTuleColorRGB, playerLineTuleColorRGB_1, lineTrigger);
        recolorRGBImage(context, fillTuleColorRGB, playerFillTuleColorRGB_1, fillTrigger);
    }
    else {
        recolorRGBImage(context, lineTuleColorRGB, playerLineTuleColorRGB_2, lineTrigger);
        recolorRGBImage(context, fillTuleColorRGB, playerFillTuleColorRGB_2, fillTrigger);
    }
}
That works, but obviously the alerts must never show. How can I do what I did without the alert? (The alert acts like onload, because the onload methods don't work for me.)
Thanks!
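A note on why the alert "helps": addEventListener('load', test(this.owner)) calls test immediately and registers its return value, so the drawing runs before the image has loaded; the alert just stalls long enough for loading to finish. A sketch of the intended pattern (loadAndRecolor is a hypothetical wrapper; the recolorRGBImage calls from the code above would go where the comment is):

```javascript
// Sketch: pass a function (not the result of calling one) to the load event,
// and only draw once it fires.
function loadAndRecolor(src, size, context, owner) {
    var base_image = new Image();
    base_image.addEventListener('load', function () {
        // this runs only after the image data has actually arrived
        context.drawImage(base_image, 0, 0, size, size);
        // recolorRGBImage(context, ...) calls go here, picked by owner
    });
    base_image.src = src; // set src AFTER attaching the listener
}
```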
I'm in the process of creating an nw.js application that needs to show loads of PDFs. The PDFs are downloaded the first time you start the application. In the initialization phase I also need to create a thumbnail for each PDF, to be shown in lists.
The thumbnail generation itself didn't seem to be an issue when we had only a few PDFs. It works by creating a canvas element, having PDF.js draw the first page, and then saving the canvas to a PNG.
The issue is that PDF.js doesn't seem to unload the PDF between runs. Loading 20 1MB PDF files usually leads to nw.js using around 500MB RAM. We will have 100+, maybe even thousands of PDFs, so we need to figure out how to free the RAM between each thumbnail: at around 80 or so PDFs, nw.js already uses 2GB of RAM and freezes my laptop as it runs out of memory.
I've made a simple test that shows this issue:
var fs = require("fs");
var Q = require("q");
var glob = require("glob");

var canvas = document.createElement("canvas");
var ctx = canvas.getContext('2d');

PDFJS.workerSrc = "pdf.worker.js";

function pdf(pdfFile) {
    return new Q.Promise(function (fulfill, reject) {
        PDFJS.getDocument(pdfFile).then(function (pdf) {
            pdf.getPage(1).then(function (page) {
                var viewport = page.getViewport(0.5);
                canvas.height = viewport.height;
                canvas.width = viewport.width;
                var renderContext = {
                    canvasContext: ctx,
                    viewport: viewport
                };
                page.render(renderContext).then(function () {
                    // set to draw behind current content
                    ctx.globalCompositeOperation = "destination-over";
                    // set background color
                    ctx.fillStyle = "#ffffff";
                    // draw background rect on the entire canvas
                    ctx.fillRect(0, 0, canvas.width, canvas.height);
                    var img = canvas.toDataURL("image/png");
                    img = img.replace(/^data:image\/png;base64,/, "");
                    fs.writeFile(pdfFile + ".png", img, 'base64', function (err) {
                        console.log("Done thumbnail for: " + pdfFile);
                        fulfill();
                    });
                });
            });
        });
    });
}

glob("pdf/*.pdf", function (err, files) {
    if (err) {
        console.log(err);
    } else {
        function generate(file) {
            console.log("Generating thumb for: " + file);
            pdf(file).then(function () {
                if (files.length > 0) next();
            });
        }
        function next() {
            var file = files.pop();
            generate(file);
        }
        next();
    }
});
I've never done anything like this before. I've tried to reuse the same canvas for all thumbs but that didn't seem to change a thing.
I've tried to do a heap snapshot in developer tools to see what takes up all the RAM, but guess what? It seems to trigger a garbage collection before taking the snapshot, so nw.js goes from 500MB to around 100MB first. This makes me believe that the objects are actually marked for deletion, but that the GC never gets a chance to run before the computer runs out of RAM. Loading 20 files and then just waiting doesn't trigger a GC, though, and neither does running out of RAM.
I've tried to check the API and documentation of PDF.js, but I could not find anything mentioning how to unload a PDF before loading the next one.
Any ideas on how I should proceed? One idea was to call some external tool, or to make a C/C++ lib that I'd call using node-ffi, but I'd have to use PDF.js to show the PDFs at a later stage anyway, and I imagine I'd just run into the same issue again.
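One avenue worth sketching: the document object that PDF.js hands back (the PDFDocumentProxy) exposes a destroy() method, at least in the versions I've looked at, which releases the worker's resources. A sketch of the same loop, processing one file at a time and destroying each document before the next (pdfjs and render stand in for the PDFJS global and the rendering step above; treat the exact API as an assumption to verify against your PDF.js version):

```javascript
// Sketch: process the PDFs strictly one at a time and call destroy() on each
// document before loading the next, so only one parsed document is alive.
function makeThumbnails(pdfjs, files, render) {
    function next() {
        var file = files.shift();
        if (!file) return Promise.resolve();
        return pdfjs.getDocument(file).then(function (pdf) {
            return render(pdf).then(function () {
                return pdf.destroy(); // release the parsed document
            });
        }).then(next);
    }
    return next();
}
```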
I'm still new to coding and HTML5.
I have an HTML5 stage with multiple canvases. I need to get the base64 image from that stage, resize it, and then get the base64 image of the resized canvas.
This is the code I use:
stage.toDataURL({
    callback: function(dataUrl) {
        var tmp_img = new Image();
        tmp_img.src = dataUrl;
        // resize image to 45x75
        var canvas = document.getElementById('hidden_canvas');
        canvas.width = 45;
        canvas.height = 75;
        var ctx = canvas.getContext("2d");
        ctx.drawImage(tmp_img, 0, 0, canvas.width, canvas.height);
        dataUrl = canvas.toDataURL();
    }
});
The dataUrl is correct in Chrome, but when I run the code in Firefox it seems that Firefox doesn't generate the correct base64 string.
I would really appreciate any help.
You've stumbled upon a common problem.
tmp_img.src loads the dataUrl into your new image, but that takes time. So JavaScript continues with the code below it even before the image is loaded. The result is that you are sometimes trying to ctx.drawImage before tmp_img has fully loaded and is ready to use.
The fix is to refactor your code to use tmp_img.onload. The .onload triggers a callback function when tmp_img is finally ready to use. The refactoring might look like this:
var tmp_img = new Image();
tmp_img.onload = function() {
    // resize image to 45x75
    var canvas = document.getElementById('hidden_canvas');
    canvas.width = 45;
    canvas.height = 75;
    var ctx = canvas.getContext("2d");
    ctx.drawImage(tmp_img, 0, 0, canvas.width, canvas.height);
    dataUrl = canvas.toDataURL();
};
tmp_img.src = dataUrl;
// more javascript
JavaScript will execute the above code in this order:
Create a new Image object
See tmp_img.onload and make a note that it should run that code when the image is fully loaded
Set tmp_img.src and start loading the image.
Continue with "more javascript"
When the image is fully loaded, execute everything in the .onload callback function.
It looks like you're using KineticJS--are you?
If so, then instead of stage.toDataURL you could use stage.toImage. That way you already have the image loaded when the stage.toImage callback is executed (you don't have to worry about waiting for the image to load). Then just .drawImage the image that KineticJS provides in the callback.
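A sketch of that alternative, assuming KineticJS's stage.toImage (the callback hands you an already-loaded image element, so there's nothing to wait for; stageToThumb is a hypothetical helper wrapping your resize code):

```javascript
// Sketch: stage.toImage hands the callback a fully loaded image,
// so drawImage is safe to call right away.
function stageToThumb(stage, canvas, done) {
    stage.toImage({
        callback: function (img) {
            canvas.width = 45;
            canvas.height = 75;
            var ctx = canvas.getContext('2d');
            ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
            done(canvas.toDataURL()); // the resized base64 string
        }
    });
}
```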
I want to pre-load a directory of images into my canvas so that they are ready to be displayed when I need them.
I found this question on stackoverflow
and the accepted answer was extremely helpful.
Their function iterates through an array of image locations and then (I presume) preloads them for the canvas.
I am having difficulties actually displaying them once they are preloaded using this function. I'm well aware of the single-image loading method of:
img = new Image();
img.src = imgLocation;
img.onload = function() {
    context.drawImage(img, 0, 0);
};
but since they are already loaded, I am assuming I don't need to use img.onload or any of the above code except for context.drawImage to actually display the image.
When I try to draw an image from (what I can only assume is) the outputted array with all the images in it, I get a type error. When I check what's inside the array (with a console.log()) at any index, it says [object, Object].
My code looks like this:
var framesArr = new Array();
var framesAmnt = 300; // we would normally get this number from the PHP side of things

function ldBtn() {
    // load all frame locations into framesArr
    var i = 0;
    for (i = 0; i <= (framesAmnt - 1); i++) {
        framesArr[i] = "frames/Frame" + i + ".png";
    }
    // preload them
    preloadImages(framesArr, preloadDone);
}

var deferreds = [];
var deferred;

function preloadImages(paths, callback) {
    $.each(paths, function (i, src) {
        deferred = $.Deferred();
        img = new Image();
        deferreds.push(deferred);
        img.onload = deferred.resolve;
        img.src = src;
    });
    $.when.apply($, deferreds).then(callback);
}

function preloadDone() {
    // any old index number will do. *Shrugs*
    context.drawImage(deferreds[2], 0, 0); // this will cause a type error.
}
How can I display images onto the canvas that were loaded by the preloadImages function?
I suppose I don't fully understand what happens when img = new Image(); img.src = imgLocation; img.onload = function() runs. What does JavaScript/jQuery store the image as? If the answer is 'an object', then why won't it draw the objects in this array?
Thank you in advance for any advice you can provide.
First of all, you need to pass an actual image element to the drawImage method. Right now you're passing in a deferred.
Next, there's a bug in Chrome with the native element constructor (new Image()).
So simply use document.createElement("img"), which does exactly the same thing, without the bug.
See the related bugs:
About Canvas:
https://code.google.com/p/chromium/issues/detail?id=238071
The root cause: https://code.google.com/p/chromium/issues/detail?id=245296
I created a reduced example here: http://jsfiddle.net/YVDpD/
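Putting both points together, here's a sketch of how the corrected preloader could look: keep the loaded <img> elements in their own array and hand that array to the callback (the names follow the question's code; the images parameter on preloadDone is new):

```javascript
// Sketch: store the image elements themselves, resolve one deferred per load,
// and draw from the images array once everything has finished.
function preloadImages(paths, callback) {
    var images = [];
    var deferreds = paths.map(function (src) {
        var deferred = $.Deferred();
        var img = document.createElement("img"); // sidesteps the new Image() bug
        img.onload = deferred.resolve;
        img.src = src;
        images.push(img);
        return deferred;
    });
    $.when.apply($, deferreds).then(function () {
        callback(images); // actual <img> elements, ready for drawImage
    });
}
```

preloadDone would then receive the array and call, say, context.drawImage(images[2], 0, 0) on a real element instead of a deferred.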
Systematically updating src of IMG. Memory leak.
I am currently updating an image every x sec. A few ways I thought of doing this were as follows:
Take One:
var url ="...";
$('#ImageID').attr('src', url);
Now this works perfectly and changes the image, but it causes a memory leak.
Take Two:
So, since it comes down to creating DOM elements, I attempted the following.
<div id="ImageHolder">
</div>
var image = "..."; // obviously an actual image in the working code
$('#ImageHolder').empty();
$('#ImageHolder').html(image);
Now this works, but it causes a flicker when the image changes, which is undesirable. Using two images and swapping them at intervals works fine, but I want to stay as low on bandwidth as possible.
Edit 1:
My Code:
<form name="selected">
    <input type="hidden" name="map" />
</form>
<img id="MyMaps" src="http://localhost/web/blank.png" alt="" />
<script type="text/javascript">
    var locate = window.location;
    var uri = document.selected.map.value;
    var MyHost = window.location.hostname;

    function delineate2(name) {
        totheleft = uri.indexOf("maps/") + 5;
        totheright = uri.lastIndexOf("&");
        return (uri.substring(totheleft, totheright));
    }

    function loadImage() {
        var CurrentMap = delineate2(name);
        var url = 'http://' + MyHost + '/map/' + CurrentMap + '/image?' + new Date().getTime();
        $('#MyMaps').attr('src', url);
        setTimeout(loadImage, 10000);
    }
</script>
Has anyone done something similar and found a working solution, or how can I go about preventing the memory leak / flickering when the image updates?
I believe that your "take one" should work. There should be no memory leak: you're overwriting the src attribute every time, and if you hold no other references to the old images, they should get garbage collected. Still, I'm seeing this problem in FF and Chrome. Chrome tells me that JS memory usage is constant, so the memory must be lost somewhere else.
I have opened a Chrome bug:
https://code.google.com/p/chromium/issues/detail?id=309543
In case you want to put in your weight as well and maybe star the bug :)
I have used different methods to solve this problem and none of them works. It seems that memory leaks when img.src = base64string, and that memory can never be released. Here is my solution: write each frame to disk and point the src at the file instead.
var fs = require('fs');
var img_step = 0;

fs.writeFile('img0.jpg', img_data, function (err) {
    // console.log("saved img!");
});
document.getElementById("my-img").src = 'img0.jpg?' + img_step;
img_step += 1;
My Electron app updates the img every 50ms, and memory doesn't leak.
Forget about the disk usage; Chrome's memory management pisses me off.
I have never thought of doing it like your first method... interesting. I can imagine that it causes a memory leak because every single image is kept in memory, since nothing is actually removed. That's just a guess, though.
I would recommend sticking to the second method but modifying it to solve the flicker, like fading between images. A good jQuery plugin to look at would be the jQuery Cycle Plugin.
If that plugin doesn't do it for you or you want to keep the code small, jQuery also has some animation functions built in. fadeIn() and fadeOut() may be of interest.
Something like this might work better.
<div id="ImageHolder">
</div>
var image = "..."; // obviously an actual image in the working code

function loadImage() {
    // choose your image however you want to, preferably a preloaded image
    $('#ImageHolder').fadeOut('fast');
    $('#ImageHolder').html(image);
    $('#ImageHolder').fadeIn('fast');
    setTimeout(loadImage, 10000);
}
I believe a shorter way to do this might be (the delay() is optional; I just put it there in case you need it):
$('#ImageHolder').fadeOut('fast').html(image).delay('100').fadeIn('slow');
Additionally, there may be a delay while the image loads if it hasn't been preloaded. I'm not 100% sure how to do that off the top of my head, so a quick Google search came up with this:
http://engineeredweb.com/blog/09/12/preloading-images-jquery-and-javascript
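For completeness, a minimal preload sketch in the same spirit as that link (imageUrls is a hypothetical array of the URLs you'll swap in later):

```javascript
// Sketch: create detached Image objects up front so the browser fetches and
// caches them; later swaps of #ImageHolder's content then don't hit the network.
function preload(imageUrls) {
    return imageUrls.map(function (url) {
        var img = new Image();
        img.src = url; // starts the download immediately
        return img;
    });
}
```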
A 5-year-old question, yet it's still 'hot' for me; I want to share the very same problem I just faced.
The "take one" approach may be the very first approach every programmer uses to change an image source, but even now (5 years after this question was posted) the problem still occurs: change the <img> src frequently and you can watch in the Windows task manager as your browser becomes greedy.
The "take two" approach creates flicker, which is rarely acceptable.
Fortunately, HTML5 comes with <canvas>, so I tried using <canvas> to overcome this problem.
var canvas = document.getElementById("mycanvas");
var ctx = canvas.getContext("2d");

img = new Image();
img.onload = function () {
    ctx.drawImage(img, 0, 0);
    img.src = "";
};
img.src = "data:image/png;base64," + stringSource;
Then I found a new problem: unlike <img>, <canvas> will not automatically resize the image to 'fit'. We have to manually resize the image to fit the canvas while keeping the aspect ratio. Below is my code to solve that:
var canvas = document.getElementById("mycanvas");
var ctx = canvas.getContext("2d");

img = new Image();
img.onload = function () {
    var imageWidth = canvas.width;
    var imageHeight = canvas.height;
    var xPosition = 0;
    var yPosition = 0;
    var proportion = 1;
    var padding = 2;

    // scale down by whichever dimension overflows the canvas the most
    if ((img.width / canvas.width) > (img.height / canvas.height)) {
        proportion = img.width / canvas.width;
    }
    else {
        proportion = img.height / canvas.height;
    }

    imageWidth = img.width / proportion;
    imageHeight = img.height / proportion;

    // center the scaled image on the canvas
    xPosition = (canvas.width - imageWidth) / 2;
    yPosition = (canvas.height - imageHeight) / 2;

    ctx.drawImage(img, 0, 0, img.width, img.height, xPosition + padding, yPosition + padding, imageWidth - 2 * padding, imageHeight - 2 * padding);
    img.src = "";
};
img.src = "data:image/png;base64," + stringSource;
Hope this helps anyone who faces the same problem.