I have written a "slide show" that displays images sequentially and quickly, much like a stop-frame movie.
Once an image is displayed, I have no further use for it and would like to clear it from memory to free space for new images. However, while monitoring Google Chrome Helper in Activity Monitor, I see that the memory continues to increase steadily until the browser crashes.
I noticed a Chrome garbage-collection issue that was filed as a bug, and I'm wondering if maybe I'm experiencing it.
In any case, here is one example of a trick I tried, based on this post, to get Chrome to trash my old image data:
function ClearChunk()
{
    imageSet1 = null; // drop all references so the old images can be collected
    imageSet1 = [];   // start a fresh array for the next chunk
}
function LoadNewChunk()
{
    for (var i = start_of_chunk; i < end_of_chunk; i++) // var keeps i out of the global scope
    {
        imageSet1[i - start_of_chunk] = new Image();
        imageSet1[i - start_of_chunk].src = img[i];
    }
}
This clears first and then loads in the background, all while another array of images is being displayed. It seemed like a good idea at the time, but on my machine memory still climbs steadily to about 3 GB and... Aw, Snap.
How can I mitigate this rampant memory consumption in the first place?
Any conversational or code-based feedback would be appreciated.
Try creating only one new Image() and reusing it, instead of creating many in the loop. This helped me solve the same issue.
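For instance, a minimal sketch of that idea (the function and callback names here are made up for illustration):
// Reuse one Image object for every frame instead of allocating a new one per frame.
var frameImg = new Image();
function showFrame(sources, idx, onReady)
{
    frameImg.onload = function(){
        onReady(frameImg); // draw or display the decoded frame
    };
    frameImg.src = sources[idx]; // the previous frame's pixel data becomes collectable
}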
Web workers, perhaps?
==============
recently saw something about the difference between declaring a variable, like...
var tempa = 0;
...versus not actually declaring the variable, just assigning to it:
var tempa = 0;   // declared with var: delete cannot remove it
temp_no_var = 0; // undeclared: there is a "delete" ability that removes the variable from memory, or rather reduces memory use
Perhaps after the image is used, simply assign it null or 0, instead of leaving the image data in a variable and hopefully waiting for garbage collection.
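If it helps, here is a quick illustration of that delete behaviour (non-strict mode):
var a = 1; // declared with var: a non-configurable binding
b = 2;     // undeclared assignment: a deletable property of the global object
console.log(delete a); // false - a stays
console.log(delete b); // true  - the property is removed and its value can be collected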
==================
Re-use variables instead of making huge arrays that just keep growing and growing.
==================
Check out ImageMagick and see about creating an animated GIF, if the information in the movie clip (err, set of images) does not change.
Related
Sorry if this is a duplicate, but I couldn't find my exact case. I'm playing around with Web Workers and it's pretty interesting. I was testing different cases and hit this.
Main :
var myWorker = new Worker("WK.js");
for (var i = 0; i <= 1000000000; i++) {
    myWorker.postMessage(i);
}
myWorker.onmessage = function (e) {
    alert(e.data);
};
Worker :
var sum = 0;
self.onmessage = function (e) {
    sum += e.data;
    if (e.data == 1000000000) { postMessage("done" + sum); } // report after the last value is added
};
On the worker script, I'm just summing up the passed values and posting back the sum once done. The problem I face is that the above code crashes my browser (all of them) for this number (~1000000000); however, if I move that loop into the worker script, it works fine. So is there a limit on the number of postMessage calls per duration? Please note I do know this is bad code, just for testing.
Your browser may not have actually crashed. The problem is that you have a large for loop that gets executed in the main UI thread of the browser. So basically the browser is busy running your code and can't respond to any user input, i.e. it is busy, and generally this results in a 'Not responding' message in Windows. The reason you don't get this in a worker is simply because that code executes in a completely separate (non-UI) thread.
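For comparison, a sketch of the variant the asker says works, where the loop lives in the worker and the main thread posts a single start message:
Main :
var myWorker = new Worker("WK.js");
myWorker.onmessage = function (e) {
    alert(e.data);
};
myWorker.postMessage(1000000000); // one message: the upper bound
Worker :
self.onmessage = function (e) {
    var sum = 0;
    for (var i = 0; i <= e.data; i++) { // the billion iterations run off the UI thread
        sum += i;
    }
    postMessage("done" + sum);
};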
It could be a memory / garbage-collection issue. You're posting a billion messages, each at least the size of a number, which I think in JavaScript is stored like all other numbers, so 8 bytes. Ignoring any extra per-message overhead, this means it needs to allocate at least 8 GB of memory.
I have to admit a level of ignorance of the garbage collector, but it might not be able to keep up with a billion objects using 8 GB of memory allocated in a short amount of time.
So is there a limit for the number of postMessage calls per duration
I suspect yes, although perhaps it's not clear what you mean by "duration" here.
I have this piece of canvas animation that is exhibiting some weird characteristics:
http://jsbin.com/olasol/2/edit
I'm on the latest version of Chrome. I'm using Chrome's inbuilt FPS monitor (you can activate it by going to about:flags).
I have marked the line in the JavaScript section which I think is the potential culprit:
fallingctx.clear();
This line does nothing special. It calls a function which in turn calls clearRect().
The "weird" things I notice are:
The clear() function causes a very noticeable FPS drop on my laptop (Core 2 Duo), but not on my desktop (i5 2500k).
Removing that line alone is sufficient to produce 60fps on my laptop as well. As expected, the canvas doesn't clear after each frame, but still, it produces stable 60fps.
The FPS drop happens only when my Chrome window is on the larger side! When I shrink the window and reload, it doesn't happen! (Is it more expensive to clear a larger rectangle?).
I tried replacing the clear() with a drawImage() of a full white JPEG to cover the canvas. The JavaScript is able to do 200 drawImage() executions each cycle for the smaller image particles (evident from the second point). However, when I add one single drawImage for the overall canvas, it lags again! (Make sure the output occupies the entire screen in order to reproduce the result.)
Why is all this happening? How do I fix it?
It really depends on the hardware, but think about what the invocation of clearRect has to do: something essentially must zero out a piece of memory large enough to hold the canvas contents. That can be costly. Think about how much memory has to hold RGBA at HD resolutions... that's over two million pixels of data, around 8 MB. Admittedly, that's not all that much these days in general, but if there are any bandwidth or caching issues related to pushing memory to the video card, and it's something you are doing 60 times a second... well, expect problems.
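As a rough back-of-the-envelope (assuming a 1920x1080 canvas):
var bytesPerFrame = 1920 * 1080 * 4;     // RGBA, 4 bytes per pixel: about 8.3 MB
var bytesPerSecond = bytesPerFrame * 60; // cleared 60 times a second: about 500 MB/s of memory traffic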
What I've heard often works is to clear just around where the image was formerly drawn. See http://jsbin.com/olasol/6/edit
I made the following changes for you.
for (var i = 0; i < noOfDrops; ++i)
{
    // clear a 1px margin around where each drop was previously drawn
    fallingctx.clearRect(
        fallingDrops[i].x - 1,
        fallingDrops[i].y - 1,
        fallingDrops[i].image.width + 2,
        fallingDrops[i].image.height + 2);
}
for (var i = 0; i < noOfDrops; ++i)
{
    fallingDrops[i].y += fallingDrops[i].speed; // advance by the falling speed
    fallingctx.drawImage(fallingDrops[i].image, fallingDrops[i].x, fallingDrops[i].y);
}
There's probably a good reason that I need to clearRect slightly around where the image was rendered, but a simple explanation escapes me. (It is something to do with anti-aliasing: things are rendered not quite at the exact pixel specified, so clearing a one-pixel margin catches the spill... I forget exactly.)
You also need to do something about the fact that you are starting the render loop before the image has loaded (also in the jsbin), so I added:
var imgSource = "http://lorempixel.com/20/20/sports/";
var imgObj = new Image();
and replaced superinit
function superinit()
{
    imgObj.onload = function(){
        flowerfallsetup();
        requestAnimFrame(flowerfall);
    };
    imgObj.onerror = function(){
        alert("could not load image");
    };
    imgObj.src = imgSource; // set src last, after the handlers are attached
}
Edit: I forgot to mention that, because of the prior image setup, I changed this line in your flowerfallsetup:
fallingDr["image"] = imgObj;
There are many ways to handle the asynchronous loading of images, I just chose one that was easy for this example.
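For instance, one alternative is a small Promise-based loader (a sketch; Promises postdate the original answer but are standard now):
function loadImage(src)
{
    return new Promise(function (resolve, reject) {
        var img = new Image();
        img.onload = function(){ resolve(img); };
        img.onerror = function(){ reject(new Error("could not load " + src)); };
        img.src = src;
    });
}
// Same flow as superinit above, expressed with the loader:
loadImage(imgSource).then(function (img) {
    flowerfallsetup();
    requestAnimFrame(flowerfall);
});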
Edit: I have to confess, there might be a bit more to this. It works fine on desktop browsers, but on the iPhone, there are clipping issues. If I can figure out what's causing the problem I'll try to post an update.
I'd like to continuously execute a piece of JavaScript code on a page, spending all the CPU time I can on it, but allowing the browser to stay functional and responsive at the same time.
If I just run my code continuously, it freezes the browser's UI and the browser starts to complain. Right now I pass a zero timeout to setTimeout, which does a small chunk of work and then loops back via setTimeout. This works, but does not seem to utilize all the available CPU. Any better ways of doing this that you can think of?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to MooTools classes, so just take this as an outline):
var Main = function(){
    this.queue = [];
    this.actions = {};
    this.loop = this.loop.bind(this); // bind once so "this" survives being passed as a callback
    requestAnimationFrame(this.loop);
}
Main.prototype.loop = function(){
    while (this.queue.length){
        var action = this.queue.shift(); // FIFO: handle events in arrival order
        this.executeAction(action);
    }
    // do your rendering here
    requestAnimationFrame(this.loop);
}
Main.prototype.addToQueue = function(e){
    this.queue.push(e);
}
Main.prototype.addAction = function(target, event, callback){
    if (this.actions[target] === void 0) this.actions[target] = {};
    if (this.actions[target][event] === void 0) this.actions[target][event] = [];
    this.actions[target][event].push(callback);
}
Main.prototype.executeAction = function(e){
    if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0){
        for (var i = 0; i < this.actions[e.target][e.type].length; i++){
            this.actions[e.target][e.type][i](e); // invoke each registered callback
        }
    }
}
So basically you'd use this class to handle everything that happens on the page. Every event handler would be onclick='Main.addToQueue(event)' or however you want to add your events to the page; you just point them at adding the event to the queue, and use Main.addAction to direct those events to whatever you want them to do. This way every user action gets executed as soon as your canvas has finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent framerate, your app should remain responsive.
EDIT: forgot the "this" in requestAnimationFrame(this.loop). Note that the method also needs to be bound (done once in the constructor above), or "this" inside loop would be wrong when the browser invokes it.
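A hypothetical usage sketch (the element id, event shape, and handler are made up) showing how a DOM event would be funneled through the queue:
var main = new Main();
// Register what should happen for this target/event pair.
main.addAction("playButton", "click", function (e) {
    console.log("play clicked");
});
// DOM handlers only enqueue; the rAF loop executes actions between frames.
document.getElementById("playButton").onclick = function () {
    main.addToQueue({ target: "playButton", type: "click" });
};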
Web workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Establish a parameter which controls the amount of work being done and try various values.
You might also try to set the timeout before you do the processing. That way the time spent processing should count towards any minimum the browsers set.
One technique I use is to keep a counter in my processing loop, counting iterations. Then set up an interval of, say, one second; in that function, display the counter and reset it to zero. This provides a rough performance value with which to measure the effects of changes you make.
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
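Putting those pieces together, a sketch (the names and numbers are illustrative, and doUnitOfWork stands in for your real per-chunk job):
var BATCH = 100;    // tunable: how many work units to do per timeout slice
var iterations = 0; // counted for the rough performance readout
function tick()
{
    setTimeout(tick, 0); // schedule first, so processing time counts toward any minimum delay
    for (var i = 0; i < BATCH; i++) {
        doUnitOfWork(); // hypothetical: one small chunk of the real job
        iterations++;
    }
}
// Once a second, report throughput and reset, to measure the effect of tuning BATCH.
setInterval(function(){
    console.log(iterations + " units/sec");
    iterations = 0;
}, 1000);
tick();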
One can use window.postMessage() to overcome the limitation on the minimum amount of time setTimeout enforces. See this article for details. A demo is available here.
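The trick looks roughly like this (a sketch of the well-known pattern; the message name is an arbitrary sentinel):
var timeouts = [];
var messageName = "zero-timeout-message";
// Like setTimeout(fn, 0), but without the browser's minimum-delay clamping.
function setZeroTimeout(fn)
{
    timeouts.push(fn);
    window.postMessage(messageName, "*");
}
window.addEventListener("message", function (event) {
    if (event.source === window && event.data === messageName) {
        event.stopPropagation();
        if (timeouts.length > 0) {
            timeouts.shift()();
        }
    }
}, true);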
On a web page I have a quite large list of items (say, product cards, each containing an image and text): about 1000 of them. I want to filter this list on the client (only items that pass the filter should be shown), but there is a rendering performance problem. When I apply a very narrow filter so that only 10-20 items remain, then cancel it (so all items have to be shown again), the browser (Chrome on a very nice machine) hangs for a second or two.
I re-render the list using the following routine:
for (var i = 0, l = this.entries.length; i < l; i++) {
    $(this.cls_prefix + this.entries[i].id).css("display", this.entries[i].id in dict ? "block" : "none");
}
dict is the hash of allowed items' ids
This function itself runs instantly; it's the rendering that hangs. Is there a more optimal way to re-render than changing the "display" property of DOM elements?
Thanks for your answers in advance.
Why load 1000 items? First, you should consider something like pagination, showing around 30 items per page. That way you are not loading that much.
Then, if you really are set on looping over a lot of items, consider using timeouts. Here's a demo I had once that illustrates the consequences of looping: it blocks the UI and causes the browser to lag, especially on long loops. But when using timers, you delay each iteration, allowing the browser to breathe once in a while and do something else before the next iteration starts.
Another thing to note is that you should avoid repaints and reflows, which means avoiding moving elements around and changing styles more often than necessary. Also, another tip is to remove from the DOM the nodes that are not actually visible. If you don't need to display something, remove it: why waste memory on something that isn't actually seen?
You can use the setTimeout trick, which breaks the loop up into separate tasks so the main thread is never blocked for long and the client doesn't freeze. I suspect that the total processing – from start to finish – would last the same amount of time, but at least this way the interface can still be used and the result is a better user experience:
var self = this; // keep a reference: "this" inside the timeout callback is not the outer object
for (var i = 0, l = this.entries.length; i < l; i++) {
    (function (i) { // capture the current i; with var, every callback would otherwise see the final value
        setTimeout(function () {
            $(self.cls_prefix + self.entries[i].id).css("display", self.entries[i].id in dict ? "block" : "none");
        }, 0);
    })(i);
}
Dude - the best way to handle "large amounts of DOM elements" is to NOT do it on the client, and/or DON'T use JavaScript if you can avoid it.
If there's no better solution (and I'm sure there probably is!), then at LEAST partition your working set down to what you actually need to display at that moment (instead of the whole, big, honkin' enchilada!)
A while ago I created a small card-game web app for fun. The player plays against the computer, and mostly it works fine. Sometimes, though, the computer player gets into a loop. The point of the game is to lose all your cards, and if you don't have a card to play you take the pile. Sometimes the computer plays x,y,z, takes the pile, plays x,y,z, takes the pile, etc.
I keep track of the moves I've made, so at any point I have an array that looks something like : [C2,D5,H2,S4,C5,H2,S4,C5,H2,S4,C5]
In this case I can see that I've gotten into a loop of playing H2,S4,C5, then taking the pile and then repeating.
So, the generalized problem is: what's the best way to detect repeating patterns in a list? I could probably whip something up with a simple for loop: find the card I'm about to play, and if I find it at position x, check whether the pattern from x to n repeats at positions x-(n-x) to x. But this seems like the kind of problem that might have a nice algorithm. How would you code it, given the following function signature:
function findLoops(previousMoves, nextMove, maxPatternLength) {
    // Return [loopLength, loopCount] or null if there are no loops
}
p.s. this is not a homework assignment, the game exists and is at http://www.idiot-cardgame.com if anyone is interested :)
First the general question: Your suggested method
trying to find the card I'm about to play and if I find that in position x then I could check whether the pattern from x to n repeats at position x-(n-x) to x,
looks really good. I would suggest basically the same. It is O(n), needs a fixed amount of storage, and is simple: what more could you wish for?
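A sketch of that scan, under the signature from the question (the exact windowing details are my own interpretation):
function findLoops(previousMoves, nextMove, maxPatternLength)
{
    var moves = previousMoves.concat([nextMove]);
    var n = moves.length;
    // Try each candidate loop length, shortest first.
    for (var len = 1; len <= maxPatternLength && 2 * len <= n; len++) {
        var count = 1; // the final window itself
        // Count how many consecutive windows of size len, ending at the tail, match.
        while ((count + 1) * len <= n) {
            var match = true;
            for (var k = 0; k < len; k++) {
                if (moves[n - len + k] !== moves[n - (count + 1) * len + k]) {
                    match = false;
                    break;
                }
            }
            if (!match) break;
            count++;
        }
        if (count > 1) return [len, count];
    }
    return null; // no repeating pattern ends at nextMove
}
For the example above, findLoops(["C2","D5","H2","S4","C5","H2","S4","C5","H2","S4"], "C5", 5) returns [3, 3].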
Second: you can check for repetition in games generally if you keep a hash table of all previous game states (the complete state, nothing left out). Every time you reach a new state, look it up in the hash table; if it is already there, your game state is looping.
In JavaScript you have built-in hash tables, so this is very easy to do with something like this:
var allstates = {}; // an Object used as a hash table of every state seen so far

new_state = next_move(old_state);
new_encoded_state = encode(new_state); // make it into a string
if (allstates[new_encoded_state]) {
    // we are looping!
} else {
    allstates[new_encoded_state] = 1;
    // no looping
}
The variable allstates is not an Array but an Object. You get array-like access with string keys, and this uses the Object as a hash table.
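The encode step just needs to map equal states to equal strings; something like this would do (the state fields here are assumptions about the card game):
function encode(state)
{
    // JSON gives a canonical string as long as the field order is fixed
    return JSON.stringify([state.playerHand, state.computerHand, state.pile]);
}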