Unloading Resources on HTML with JavaScript - javascript

I'm working on an HTML5 game. It is already online, but it's currently small and everything is okay.
Thing is, as it grows, it's going to be loading many, many images, music tracks, sound effects and more. After 15 minutes of playing the game, at least 100 different resources might have been loaded already. Since it's an HTML5 app, it never refreshes the page during the game, so they all stack up in the background.
I've noticed that every resource I load - on WebKit at least, using the Web Inspector - remains there even after I remove the <img>, the <link> to the CSS, and so on. I'm guessing it's still in memory, just not being used, right?
This would end up consuming a lot of RAM eventually and lead to degraded performance, especially on iOS and Android devices (which I already notice slightly in the current version), whose resources are more limited than those of desktop computers.
My question is: is it possible to fully unload a resource, freeing up RAM, through JavaScript, without having to refresh the whole page to "clean it"?
Worst-case scenario: would using frames help, by deleting a frame, to free that frame's resources?
Thank you!

Your description implies you have fully removed all references to the resources. The behavior you are seeing, then, is simply the garbage collector not yet having been invoked to reclaim the space, which is common in JavaScript implementations: collection is deferred until it is "necessary". Setting references to null or calling delete will usually do no better.
A common approach is to call CollectGarbage() during scene loads/unloads to force the collection process (note that this function is non-standard and only exists in some engines, such as IE's JScript). This is typically the best place to do it when data is loaded per game "stage", since that moment is not time-critical. You usually do not want the collector to run during gameplay unless the game is not very real-time.
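For example, a stage transition can simply drop every reference it holds and then ask for a collection where the engine allows it. This is only a minimal sketch, assuming your resources live in a container object (the names here are made up); CollectGarbage() is feature-checked because it only exists in some engines.
var stageResources = { images: [], sounds: [] };

function unloadStage() {
    stageResources.images.length = 0;   // drop every reference so the GC can reclaim the data
    stageResources.sounds.length = 0;
    if (typeof CollectGarbage === "function") {
        CollectGarbage();               // non-standard; available in IE's JScript engine only
    }
}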
Frames are usually an awkward solution if you want to keep certain resources around for common game controls. You need to consider whether you are refreshing all of the resources or just some of them.

All you can do is rely on JavaScript's built-in garbage collection mechanism.
This kicks in whenever there is no longer any reference to your image.
So, assuming you keep a reference to each image, you can use:
img.destroy() // only if your framework provides such a method; it is not a native DOM call
or
img.parentNode.removeChild(img)
Worth checking out: http://www.ibm.com/developerworks/web/library/wa-memleak/
Also: Need help using this function to destroy an item on canvas using javascript
EDIT
Here is some code that allows you to load an image into a var.
<script>
var heavyImage = new Image();
heavyImage.src = "heavyimagefile.jpg";
// ... use the image ...
heavyImage = null; // removes the reference and lets the memory be reclaimed
</script>
This is better than using jQuery's .load() because it gives you more control over image references, and the images will be removed from memory once the reference is gone (set to null).
Taken from: http://www.techrepublic.com/article/preloading-and-the-javascript-image-object/5214317
Hope it helps!

There are 2 better ways to load images besides a normal <img> tag, which Google brilliantly discusses here:
http://www.youtube.com/watch?v=7pCh62wr6m0&list=UU_x5XG1OV2P6uZZ5FSM9Ttw&index=74
1. Loading the images through an HTML5 <canvas>, which is way, way faster. I would really watch that video and implement these methods for more speed. I would imagine garbage collection with canvas works better because it breaks away from the DOM.
2. Embedded data URLs, where the src attribute of an image tag is the actual binary data of the image (yeah, it's a giant string). It starts like this: src="data:image/jpeg;base64,/9j/MASSIVE-STRING ... ". After using this, you would of course want to remove the node as discussed in the other answers. (I don't know how to generate this base64 string; try Google or the video.) A sketch of both ideas follows below.
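Here is a rough sketch of both ideas, assuming a same-origin image file and a canvas element that already exist on the page (the file name and element id are made up):
var canvas = document.getElementById("game-canvas");
var ctx = canvas.getContext("2d");

var img = new Image();
img.onload = function () {
    ctx.drawImage(img, 0, 0);                      // draw directly onto the canvas, no <img> in the DOM
    var dataUrl = canvas.toDataURL("image/jpeg");  // one way to get the base64 string mentioned above (needs a same-origin image)
    img = null;                                    // drop the reference once the pixels are on the canvas
};
img.src = "sprite.jpg";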

You said: "Worst scenario: Would using frames help, by deleting a frame, to free those frames' resources?"
Yes, using frames can help: deleting a frame frees up the resources that were loaded inside it.
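A rough sketch of that approach (the file name is made up): load a stage inside a frame, then remove the frame when the stage ends.
var frame = document.createElement("iframe");
frame.src = "stage2.html";             // the stage's images and sounds load inside the frame
document.body.appendChild(frame);

// later, when the stage is over:
frame.parentNode.removeChild(frame);   // removing the frame lets the browser release its resources
frame = null;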

All right, so I've run my tests by loading 3 different HTML files into an <article> tag. Each HTML file had many huge images, roughly 15 per "page".
So I used the jQuery .load() function to insert each one into the tag. I also had an extra HTML file containing only an <h1>, to see what happened when a page with no images replaced the previous one.
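Roughly, the test looked like this (the file names are made up):
$("article").load("page1.html");   // a "page" with ~15 huge images
// ... scroll around, watch memory in the Web Inspector ...
$("article").load("empty.html");   // the extra file containing only an <h1>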
Well, it turns out RAM usage grows as you start scrolling, and shoots up when you pass a particularly big image (big in dimensions, not just file size). But once you leave that behind and lighter images come on screen, RAM consumption actually goes down. And whenever I replaced the content of the page with JS, RAM consumption dropped sharply whenever it had been getting too high. Virtual memory remained high throughout and rarely went down.
So I guess the browser is quite smart about handling resources. It does not seem to unload them if you just leave the page sitting there for a while, but as soon as you start loading other pages or scrolling, it starts loading and freeing things up.
I guess I don't have anything to worry about after all...
Thanks everyone! =)

Related

JS: Prevent images loading lazy after setting to display:block

On my website I need to toggle the visibility of complete articles.
When I make them visible (display:block), the text appears very fast while the space where the image should be stays white. After half a second or so the image appears all at once (it was already loaded from the server before, so that's not the problem xD).
Now maybe there is a solution where I can keep the image in RAM.
I don't even know what to call the problem, so I couldn't find much on Google.
(It's important to take the article out of the DOM tree, so setting opacity or visibility to 0 is not a solution.)
It's up to the browser how it stores and fetches images from its cache. There are a lot of factors, including what else the browser is doing, how many images there are, how big they are, etc. If it's taking that long, it's possible they are being forced out of the cache, or they are too big, or there is some other problem. Have you checked to make sure they are indeed being cached (again, this may be somewhat browser-dependent)? Also make sure you don't have caching disabled (in your dev console or similar).
There are a lot of potential options for managing the image data; the best one really depends on what you are doing.
This SO answer explains it clearly. In short:
If you render the HTML on the page, even if it's hidden, it's going to load. If you want images to load only when they're needed, you're going to have to dynamically set the source (src) on the image tag in JavaScript.
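A minimal sketch of that idea, assuming each hidden <img> keeps its real URL in a data-src attribute (the attribute and id are made up):
var article = document.getElementById("article-1");
var images = article.querySelectorAll("img[data-src]");
for (var i = 0; i < images.length; i++) {
    images[i].src = images[i].getAttribute("data-src");  // assign the real source only when showing the article
}
article.style.display = "block";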

Slow javascript execution in Iframe only in IE

The Problem:
I've developed a web application. It is embedded in a site with the help of an iFrame.
If I run the application stand-alone (IE9) on, say, www.example.com/webapp, it loads in about ten seconds flat (it's a rather large application). Chrome and FF are much faster.
If it's embedded in an iframe, however, IE completely loses it, with JavaScript execution times up to 40-60 seconds until the app is done loading. Once the application is loaded, however, there are no issues and it runs flawlessly.
Recap: stand-alone: OK; in iframe: not OK.
In the web application a few XML files are loaded, in particular a very large one of about 8 MB. The XML files are parsed and the content is created using KnockoutJS. However, this is not very relevant, as I've narrowed the problem down to the XML parsing, which is done with jQuery.
Stand-alone, the parsing takes about 10 seconds in IE9. Embedded, it's around 40-60. I've logged the status messages and timestamps to the console, and I can see that the JavaScript runs incredibly slowly when embedded. Every trace-out takes 4-6 times as long, which corresponds to the increased overall load time.
Firefox and Chrome are immune and show no slowdown, or so little slowdown that it's unnoticeable.
I've tried iframe and object embedding. Same results.
The question
Do you know why plain JavaScript execution (XML parsing, when the XML is already loaded and in memory) would take 4-6 times longer when embedded in an iframe than when running stand-alone?
Bonus info
I'm not talking about page load here. Everything loads fine, even the host page. This is not yet another "page hangs until the iframe is ready" problem; the problem is that execution inside the iframe is slow. I've tried embedding on the same domain, a foreign domain, internally, externally. Same problem everywhere. As soon as I iframe the damn thing, load performance goes to hell. Once it's loaded, everything is fine and runs very well.
PS: I hope the bolding of what I think are the keywords is OK. It's supposed to be a help, not be annoying. I personally have trouble focusing on large amounts of text.
Performance Monitor while it's loading (IE9):
http://imgur.com/iYdMuPe
I found that setting element size with jQuery's .height(n) and .width(n) can be extremely slow; you can use .css("width", x) and .css("height", x) instead.
First, hit F12 and confirm the document mode is the same in both instances. If not, change the document mode of the outer frame to match.
If they are already the same, try instead to load the iFrame script dynamically after the outer page is complete. Older versions of IE handle resource allocation oddly and could be part of the problem.
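For example, the frame could be created only once the outer page has finished loading (the container id and URL are placeholders):
window.onload = function () {
    var frame = document.createElement("iframe");
    frame.src = "http://www.example.com/webapp";               // the embedded application
    document.getElementById("app-container").appendChild(frame);
};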
Granted, not the answer to your question but bringing 8 MB of XML to the client is quite inefficient. Can any of this be stripped out or entirely processed server side?
Lastly, IE is slow to move and add DOM elements (compared to Chrome). Your best bet is to add them all at once. So if you are updating the UI as you parse the XML (instead of all at once after parsing), that will slow you down considerably.
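One common way to add everything at once is to build the generated nodes in a DocumentFragment and append it in a single operation. A sketch, assuming xml already holds the parsed document and the element names are made up:
var fragment = document.createDocumentFragment();
$(xml).find("item").each(function () {
    var li = document.createElement("li");
    li.textContent = $(this).text();
    fragment.appendChild(li);                               // build everything off-DOM first
});
document.getElementById("list").appendChild(fragment);      // one insertion into the live page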
Similar to what @ern0 said, if you are manipulating height and width in your script and are experiencing slowness, then changing from jQuery's .height() and .width() methods to vanilla JS could yield a significant performance improvement.
Getters
Here is a performance test for reading the element's current height. It shows that the vanilla JS property offsetHeight is significantly faster than the .height(), .css("height") and .style.height techniques.
The difference is so significant that it is not even a competition.
Setters
Here is a performance test for setting the element's current height. It shows that the vanilla JS property .style.height is significantly faster than the .height() and .css("height") methods.
Again, the difference is so significant that it is not even a competition.
Summary
The .style.height property excels in both getting and setting by an incredible margin, as compared to the jQuery methods. The read-only offsetHeight property is significantly faster than the style.height property for getting, but (as it is read-only) it cannot be used for setting the height. As such, it may be easier to just change the code to use .style.height, if it still achieves the desired effect.
The height and width properties and methods should be pretty much the same. If you want to add performance benchmarks for them too, that is fine, but you should get the same outcome, with the width properties and methods finishing in the same place as their corresponding height counterparts.
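For concreteness, here is the same read and write expressed both ways ("#panel" is just a placeholder selector):
var hjQuery = $("#panel").height();        // jQuery getter
$("#panel").height(300);                   // jQuery setter

var el = document.getElementById("panel");
var hNative = el.offsetHeight;             // fast, but read-only
el.style.height = "300px";                 // fast setter; note the units in the string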
Apparently IE had a serious problem with getting attributes of an XML node through jQuery in a deeply nested loop. Changing this to pure JS reduced the load time to about 15 seconds. Still not great, but much, much better!
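The fix was roughly of this shape (a sketch only; the element and attribute names are made up, and xmlDoc stands for the already-parsed XML document):
var nodes = xmlDoc.getElementsByTagName("item");
for (var i = 0; i < nodes.length; i++) {
    var id = nodes[i].getAttribute("id");   // native call per node, no jQuery wrapper in the hot loop
    // ... build the view model from the attributes ...
}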

How to free used memory after loading different pages using AJAX? [duplicate]

I have a very basic ajax slideshow on my website. On every scroll, the new images and response content continually increase the amount of memory used by the browser.
I've done my research and tried all suggestions to reset the XHR object on each new request, but this does absolutely nothing to help.
The slideshows are basic but may contain hundreds of slides. I want a user to be able to navigate the slideshow indefinitely without crashing their browser. Is this even possible?
Thanks, Brian
Increasing memory usage is normal. You are, after all, loading more data each time - the HTML from your AJAX response, as well as the images being displayed. Unless you're using Adobe PageMill-generated HTML, that's only going to be a few hundred bytes of HTML/text. It's the images that will take up the most space. Everything gets stuffed into the browser's cache.
Since you're not doing anything fancy with the DOM directly (building sub-trees and whatnot), just replacing a chunk of HTML repeatedly, eventually the browser will do a cleanup, chuck some of the unused/old/stale image data from memory/cache, and reclaim some of that memory.
Now, if you were doing some highly complex DOM manipulations and generating lots of new nodes on the fly, and were leaking some nodes here and there, THEN you'd have a memory problem, as those leaked nodes will eventually bury the browser.
But just increasing memory usage by loading images is nothing to worry about; it's like a normal extended surfing session, except you're loading some new pictures.
If it's a slideshow, are you only showing one image at a time? If you only show one at a time and never get rid of the last one you showed, memory use will keep increasing. If you remove the slides that are not being shown, it should help.
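A rough sketch of that idea (the container id is made up): clear the previous slide before appending the next one, so only one image stays referenced at a time.
function showSlide(url) {
    var container = document.getElementById("slideshow");
    while (container.firstChild) {
        container.removeChild(container.firstChild);   // drop the old slide so it can be reclaimed
    }
    var img = new Image();
    img.src = url;
    container.appendChild(img);
}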

html page size problem with no of dom elements increase

Recently we redesigned one of our pages and the page size suddenly increased from 1 MB to 1.98 MB.
I compared the number of DOM elements and it increased from 1600 to 2300. I found the number of elements with the command below:
document.getElementsByTagName('*').length
We did a load test and found the load time also increased from 1.1 to 2 seconds. Is this the reason for all the problems?
I think the above line won't count any inline CSS and JS, right, since they are not DOM elements?
Can you please suggest?
Without knowing exactly what you redesigned, it's impossible to know what change caused the increase. But even a 1MB page is pretty large. JavaScript (and particularly jQuery) can change the number of DOM objects... consider this:
$('p').append('<span>Blah</span> <span>blah</span> <span>blah</span>');
That will add 3 DOM objects for each p tag on the page (which could be a lot!) and yet it adds only 71 bytes to your page. jQuery can similarly remove DOM objects. So I don't think the number of DOM objects is really much of a consideration.
The JavaScript that runs can manipulate the DOM and create new nodes, which would affect your count. However, it shouldn't make the page load any slower, as it's rendered on the client side.
I think you need to include more information if you expect to get a better answer.
Also, you should look into browser plugins (for Firefox) like YSlow, or Firebug (Net tab), which show you all the files being loaded and how long they take to load.
Anytime you have more information crossing the wire, it will take longer. Therefore, with more DOM elements in the page, the loading time will be slower. I hope this answers your question, because I'm not really sure what you are actually asking.
