Memory-efficient js image management - javascript

Intro
I'd like to build a webpage using html, css, javascript (jQuery), and almost 1000 images.
The html page is very long, perhaps about 5000px.
When the user scrolls through the page, I would like the images to play like a flipbook as the background of the page. For instance if there are 1000 images and the user has scrolled 32% of the page, they will be looking at the 320th image. If they have scrolled the whole way, they are looking at the 1000th image.
How ought I go about handling this efficiently? The file size for each image is roughly 150kb and I would most importantly like to avoid compressing these images any further.
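For concreteness, the scroll-to-frame mapping I have in mind is roughly this (a sketch; showFrame is a placeholder for whatever display strategy I end up with):
var totalFrames = 1000;
$(window).on('scroll', function() {
    var maxScroll = $(document).height() - $(window).height();
    var fraction = maxScroll > 0 ? $(window).scrollTop() / maxScroll : 0; // 0..1
    var frame = Math.max(1, Math.min(totalFrames, Math.ceil(fraction * totalFrames)));
    showFrame(frame); // placeholder: display frame N as the page background
});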
What I've Tried
Never removing images from memory
As the user scrolls more images are loaded, and previous images are set to display: none;. This approach works, but becomes very laggy as the user scrolls further and further.
Only keeping 40 images in memory at a time
As the user scrolls the current image is loaded, and 20 images ahead and behind the current image are loaded. Any images more than 20 flips away are "removed from memory". Still gets laggy (probably because I'm not handling memory properly - I'll get to that)
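Concretely, the loading half of that window looks something like this (a sketch; loaded, flip, and the frames/ URL scheme are stand-in names):
var loaded = {}; // frame number -> true once its <img> has been created
function syncWindow(imageNum) {
    for (var n = Math.max(1, imageNum - 20); n <= Math.min(1000, imageNum + 20); n++) {
        if (!loaded[n]) {
            loaded[n] = true;
            $('<img>', { src: 'frames/' + n + '.jpg', 'data-page-number': n }).appendTo(flip);
        }
    }
    // the eviction half is shown under "Current approach" below
}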
Question 1
What technique do you guys recommend I use on this project?
I feel as if my option 2 should be effective, but I'd love to hear more suggestions.
Question 2
How do I properly remove images not only from the DOM, but from memory?
I'm fairly certain if I want this project to be successful I will need to wind up removing images.
Current approach
Here's my current approach:
var imageNum = -The current image number that ought to be showing-;
var flip = -the container element holding all the images-;
flip.children().each(function(i, e) {
    var jE = $(e);
    var childInd = parseInt(jE.data('page-number'));
    if (childInd < imageNum - 20 || childInd > imageNum + 20) {
        jE.removeAttr('src');
        jE.remove();
        jE = null;
        e = null;
    }
});
This part in particular is where I am unsure I am correctly freeing memory:
jE.removeAttr('src');
jE.remove();
jE[0] = null;
e = null;
I put those lines together after a bit of googling. There are no event-handlers on the images, so I'm fairly sure that is not a concern.
How do I enable the image memory to be collected by the GC? Is what I'm doing sufficient to allow the GC to free the memory from these images?

You can use image sprites: combine multiple images into a single image, then fetch each one by its coordinates.
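For instance, if the frames were packed left to right, ten per row, a frame could be selected by shifting the background position (a sketch; the dimensions and helper are illustrative, not from the question):
var FRAME_W = 100, FRAME_H = 100, PER_ROW = 10; // hypothetical sprite layout
function showSpriteFrame($el, n) { // n is zero-based
    var x = -(n % PER_ROW) * FRAME_W;
    var y = -Math.floor(n / PER_ROW) * FRAME_H;
    $el.css('background-position', x + 'px ' + y + 'px');
}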

Related

How to prevent partially loaded images from displaying?

If you have, let's say, a 3MB image in an img tag, it will take a few seconds to load. While the image is loading, the browser sort of "prints" it - it shows the top part first, then the middle, and then the bottom. How do I prevent this from happening?
I'd rather have the image hidden and then shown after a second or two - once it is fully loaded.
One way would be to give them a class that gives them opacity: 0 so they don't show:
<img src="/path/to/image" class="loading">
And in CSS:
.loading {
    opacity: 0;
}
In head, we override that if JavaScript is disabled (so we're not unfriendly to non-JavaScript visitors):
<noscript>
    <style>
        .loading {
            opacity: 1;
        }
    </style>
</noscript>
...and then in code at the bottom of your page, find all your images and remove the class when they've loaded, and...(see comments):
(function() {
    // Get an array of the images
    var images = Array.prototype.slice.call(document.querySelectorAll("img.loading"));

    // Hook their load and error events, even though they may have already fired
    images.forEach(function(image) {
        image.addEventListener("load", imageDone.bind(null, image));
        image.addEventListener("error", imageDone.bind(null, image)); // Could handle errors differently
    });

    // Check to see if any images are already complete
    checkImages();

    function imageDone(img) {
        img.classList.remove("loading");
        images = images.filter(function(entry) { return entry != img; });
    }

    function checkImages() {
        images.forEach(function(image) {
            if (image.complete) {
                imageDone(image);
            }
        });
        if (images.length) {
            // Check back in a second
            setTimeout(checkImages, 1000);
        }
    }
})();
That's a belt-and-braces approach. It proactively checks to see if images have finished loading, and also reactively handles the load and error event of images. In theory, we shouldn't need the setTimeout, and you might do testing without it, but...
Notice how once an image is complete, we remove the class so it's visible.
Old school:
To avoid the partial display of an image as it renders, save your large images as progressive, rather than baseline jpgs.
a progressive jpg renders as a series of scans of increasing quality
a baseline jpg renders top to bottom (what you described as “printing”).
The progressive option is considered more user friendly than both the sudden appearance of the image or the slow top to bottom rendering you dislike. The progressive file variant can even be smaller than its baseline counterpart.
For more about this read: The Return of the Progressive JPEG.
I think everyone here gave you some good answers and I just want to add in. 3MB is fairly big for a web image. Don't use something that large for a logo or layout graphic; that much pixel data is only worth keeping when you are loading a nice, large-scale real-life photo whose quality you want to preserve (or when providing a download to a high-quality graphic of something). Beyond that, if you do a Google search you will find tons of solutions for loading images. For larger images, a jQuery/ajax solution is something nice I would use.

Skrollr Image Flicker. Firefox preload issue

I'm in the process of developing a 'flipbook-style' animation using Skrollr, triggering background-image changes when the user scrolls to indicated positions on the page. The issue I'm having is that in the browser the image changes are delayed, creating what can only be described as a 'flicker' of white between the frames.
<div class="section" style="background: url('frame1.png')"
     data-560-top="background-image:!url('frame1.png');"
     data-440-top="background-image:!url('frame2.png');">
</div>
The HTML is simple; it basically states that at 560 pixels from the top of the div (in relation to the browser window), the background should be at frame 1, then as the user scrolls closer to the div (440 pixels from the top of the div) the background image changes to frame 2. I plan to use up to around 20 frames and the images are quite large.
I have created a JSBin here which includes a very simplified sample with images from placehold.it. This includes the Skrollr script and an example layout of a section of my project. The key difference being that the images in my project are of much larger scale.
(function($) {
    var cache = [];
    // Arguments are image paths relative to the current page.
    $.preLoadImages = function() {
        var args_len = arguments.length;
        for (var i = args_len; i--;) {
            var cacheImage = document.createElement('img');
            cacheImage.src = arguments[i];
            cache.push(cacheImage);
        }
    };
})(jQuery);

jQuery.preLoadImages(
    'http://www.placehold.it/300x200.png',
    'http://www.placehold.it/300x200.png'
);
The above snippet seems to be working in Chrome, however the flicker issue remains in Firefox. Based on my research, Firefox handles cached images differently from Chrome (e.g. when Firefox decides an image is not currently needed, it trashes it?).
I would like to know how I could force all browsers to preload the images efficiently, to avoid the background-image flicker upon change. I am still quite new to JavaScript/jQuery.
I hope I have provided a clear explanation. All assistance appreciated.
Dan
You can preload images using CSS only, no need for JS. Check out this article for more info. Another interesting way to do it is in the comment section of the article. Basically you assign the background image to a pseudo-element so that it is cached and ready to be used whenever. See this code for an example:
#something:before {
    content: url("./img.jpg");
    width: 0;
    height: 0;
    visibility: hidden;
}

Force mobile browser zoom out with Javascript

In my web app, I have some thumbnails that open a lightbox when clicked. On mobile, the thumbnails are small and the user typically zooms in. The problem is that when they click to play it, the lightbox is outside of the viewable area (they have to scroll to the lightbox to see the video). Is it possible to force a mobile browser to zoom out so they can see the whole page?
Making the page more responsive is not an option right now; it is a fairly large web application and it would take a huge amount of time to refactor.
Dug through a lot of other questions trying to get something to zoom out to fit the entire page. This question was the most relevant to my needs, but had no answers. I found this similar question which had a solution, although implemented differently, and not what I needed.
I came up with this, which seems to work in Android at least.
initial-scale=0.1: Zooms out really far. Should reveal your whole website (and then some)
width=1200: Overwrites initial-scale, sets the device width to 1200.
You'll want to change 1200 to be the width of your site. If your site is responsive then you can probably just use initial-scale=1. But if your site is responsive, you probably don't need this in the first place.
function zoomOutMobile() {
    var viewport = document.querySelector('meta[name="viewport"]');
    if (viewport) {
        viewport.content = "initial-scale=0.1";
        viewport.content = "width=1200";
    }
}
zoomOutMobile();
Similar to Radley Sustaire's solution I managed to force unzoom whenever the device is turned in React with
zoomOutMobile = () => {
    const viewport = document.querySelector('meta[name="viewport"]');
    if (viewport) {
        viewport.content = 'initial-scale=1';
        viewport.content = 'width=device-width';
    }
}
and inside my render
this.zoomOutMobile();
One edge case I found: this did not work in the Firefox mobile browser.
I ran in a similar problem, rather the opposite, I guess, but the solution is most certainly the same. In my case, I have a thumbnail that people click, that opens a "popup" where users are likely to zoom in to see better and once done I want to return to the normal page with a scale of 1.0.
To do that I looked around quite a bit until I understood what happens and could then write the correct code.
The viewport definition in the meta data is a live value. When changed, the system takes the new value in consideration and fixes the rendering accordingly. However, the "when changed" is detected by the GUI and while the JavaScript code is running, the GUI thread is mostly blocked...
With that in mind, it meant that doing something like this would fail:
viewport = jQuery("meta[name='viewport']");
original = viewport.attr("content");
force_scale = original + ", maximum-scale=1";
viewport.attr("content", force_scale); // IGNORED!
viewport.attr("content", original);
So, since the only way I found to fix the scale is to force it by making a change that I do not want to keep, I have to reset back to the original. But the intermediary changes are not viewed and acted upon (great optimization!), so how do we resolve that issue? I used the setTimeout() function:
viewport = jQuery("meta[name='viewport']");
original = viewport.attr("content");
force_scale = original + ", maximum-scale=1";
viewport.attr("content", force_scale);
setTimeout(function() {
    viewport.attr("content", original);
}, 100);
Here I sleep 100ms before resetting the viewport back to what I consider normal. That way the viewport takes the maximum-scale=1 parameter in account, then it times out and removes that parameter. The scale was changed back to 1 in the process and restoring my original (which does not have a maximum-scale parameter) works as expected (i.e. I can scale the interface again.)
WARNING 1: If you have a maximum-scale parameter in your original, you probably want to replace it instead of just appending another value at the end like in my sample code. (i.e. force_scale = original.replace(/maximum-scale=[^,]+/, "maximum-scale=1") would do the replace--but that works only if there is already a maximum-scale, so you may first need to check to allow for either case.)
WARNING 2: I tried with 0ms instead of 100ms and it fails. This may differ from browser to browser, but the Mozilla family runs immediately-timed-out timer code back to back, meaning that the GUI process would never get a chance to reset the scale back to 1 before executing the function that resets the viewport. Also, I do not know of a way to tell whether the current viewport values have been acted on by the GUI... (i.e. this is a hack, unfortunately.)
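As an aside (my own variation, not part of the original answer), instead of guessing a delay you can wait until the browser has painted at least one frame with the forced scale, using a nested requestAnimationFrame; whether the viewport change is applied with that paint still varies by browser:
viewport.attr("content", force_scale);
requestAnimationFrame(function() {
    // runs just before the next paint; nest once more so one full frame
    // with the forced scale has definitely been rendered before restoring
    requestAnimationFrame(function() {
        viewport.attr("content", original);
    });
});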
This one works for me
let sw = window.innerWidth;
let bw = $('body').width();
let ratio = sw / bw - 0.01;
$('html').css('zoom', ratio);
$('html').css('overflow-x', 'hidden');
It fits the html to the screen and prevents scrolling. But this is not a good idea, and it does not work everywhere.
var zoomreset = function() {
    var viewport = document.querySelector("meta[name='viewport']");
    viewport.content = "width=650, maximum-scale=0.635";
    setTimeout(function() {
        viewport.content = "width=650, maximum-scale=1";
    }, 350);
}
setTimeout(zoomreset, 150);
Replace 650 with the width of your page.

Questions about adding 100+ Images from a Sprite and then garbaging them (long)

I'm creating a short game in Html5. I'm trying to figure out the best way to do the Hero selection.
Basically there are 113 heroes. I created a spritesheet that is 1320x1320 with each hero img being 120x120. The first picture is actually just a box that says 'Click to pick hero' in it.
My first question is, since it loads my spritesheet at the beginning to load the first image, later on when it loads the rest of the heroes it won't have to reload the image right? Because
setting 'heroPics[i].style.backgroundImage = "url(Heroes.jpg)";' each time makes me feel uneasy.
Second and important question to me. Back when I worked on games for mobiles, I found out that if you loaded an image that's 570 it'd use resources for a 1024x1024 and that it'd be better to remake the image to 512 and just scale it up, saving loads of resources. Is it the same here? My image being 1320 would it use resources as a 2048? Or since I'm loading images 120x120 it's only using resources for 128?
Now on to the real question. When the person clicks on 'Click to pick hero', I want all the hero images to appear. When they pick a hero I'd like to garbage all the variables and the div I just created, because they will not be picking a new hero too often, so it's better to garbage it, right? Or since the spritesheet is already loaded it's worth it to just hide the div containing the images instead? It'd still have all those variables loaded tho? Anyway that's one of my major question.
Second one is, how do I create a scrollbar inside a div dynamically? I believe I could do it if I set all the properties manually but I want to create tags and a search for the heroes, so the scrollbar has to adjust to whatever is currently being searched active, any advice on this one is greatly appreciated.
And last of all, is there a way to create the image at half its size from the beginning? I tried .style.width = "50%" and height auto, but it doesn't work since it's a spritesheet =(. So I use the webkit transform to scale down the div, but I'd prefer another option if possible.
Thanks for reading this far and sorry for all the questions, here is what I've done so far:
function selectHero() {
    var gg = 1;
    var bg = 0;
    for (var i = 1; i < 114; i++) {
        heroPics[i] = new Image();
        heroPics[i].style.backgroundImage = "url(newHeroes.jpg)";
        heroPics[i].style.width = "120px";
        heroPics[i].style.height = "120px";
        heroPics[i].style.backgroundPosition = (-(120 * i)) + "px" + " " + (-((Math.floor(i / 11)) * 120)) + "px";
        heroPics[i].style.position = "absolute";
        heroPics[i].style.left = -90 + (75 * gg) + "px";
        heroPics[i].style.top = -30 + (75 * bg) + "px";
        heroPics[i].style.webkitTransform = 'scale(0.6, 0.6)';
        heroPics[i].draggable = false;
        someDiv.appendChild(heroPics[i]);
        //heroPics[i].addEventListener( "click", heroChosen, false );
        gg++;
        if (gg > 17) {
            gg = 1;
            bg++;
        }
    }
}
I heard Math.floor uses way too many resources; should I find a different solution, even if it's uglier, since right now it's calling Math.floor 113 times? Thanks once again.
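For what it's worth, 113 calls to Math.floor is negligible, but a common substitute is bitwise truncation, which behaves like Math.floor for non-negative numbers (a one-line sketch):
var row = (i / 11) | 0; // identical to Math.floor(i / 11) for i >= 0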
Edit:
Found a solution to my last question about resizing images:
background-size: 792px 792px;
Just scaled 1320x1320 down by 60% in the CSS class and then changed the image size from 120 to 72 and it worked.
Also thanks for the useful tip of creating a class that holds the majority of the properties and using JS only when needed. Still need help with the scrollbar and a few others!
Basically there are 113 heroes. I created a spritesheet that is 1320x1320 with each hero img being 120x120. The first picture is actually just a box that says 'Click to pick hero' in it. My first question is, since it loads my spritesheet at the beginning to load the first image, later on when it loads the rest of the heroes it won't have to reload the image right? Because setting 'heroPics[i].style.backgroundImage = "url(Heroes.jpg)";' each time makes me feel uneasy.
Yes, but you would probably be better off doing this via CSS.
Second and important question to me. Back when I worked on games for mobiles, I found out that if you loaded an image that's 570 it'd use resources for a 1024x1024 and that it'd be better to remake the image to 512 and just scale it up, saving loads of resources. Is it the same here? My image being 1320 would it use resources as a 2048? Or since I'm loading images 120x120 it's only using resources for 128?
First I have heard of that, and it is likely to be browser dependent even if true. On second thought, I did hear that iOS had some issues with loading images that were beyond a certain size, but I'm not certain. The largest image I think I currently use is 1440x570 or so. I'd have to check the sprites, but most of them are much smaller.
Now on to the real question. When the person clicks on 'Click to pick hero', I want all the hero images to appear. When they pick a hero I'd like to garbage all the variables and the div I just created, because they will not be picking a new hero too often, so it's better to garbage it, right? Or since the spritesheet is already loaded it's worth it to just hide the div containing the images instead? It'd still have all those variables loaded tho? Anyway that's one of my major questions.
If you are doing filtering etc, you might try something like using classes on the children of your div. So you would have code like:
<div id="heroselection">
    <div class="hero1 fighter male"></div>
    <div class="hero2 wizard female"></div>
</div>
Then as you select filters, you can easily go through and hide the ones you don't need. First, hide them all. Then show the ones that match your filters, so if they checkbox "female" then your javascript (I'm using jQuery here, but feel free to pick another):
$('#heroselection > div').hide();
$('#heroselection > div.female').show();
Second one is, how do I create a scrollbar inside a div dynamically? I believe I could do it if I set all the properties manually but I want to create tags and a search for the heroes, so the scrollbar has to adjust to whatever is currently being searched active, any advice on this one is greatly appreciated.
Sounds like you want overflow:auto or perhaps overflow-y: auto on the div.
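A minimal sketch of that, using the #heroselection container from above (the height is illustrative):
$('#heroselection').css({
    height: '400px',   // fixed height so the list has something to overflow
    overflowY: 'auto'  // scrollbar appears only when the filtered list overflows
});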
And last of all, is there a way to create the image at half its size from the beginning? I tried .style.width = "50%" and height auto but it doesn't work since it's a spritesheet =(. So I use the webkit to scale down the div but I'd prefer another option if possible.
Sounds like you are looking for background-size
You are creating too many properties using JavaScript. A better solution is to create one parent class with the common properties, apply this class to all the divs, and modify the remaining properties with JavaScript.
#parent > div {
    background: url('newHeroes.jpg');
    width: 120px;
    height: 120px;
}
If you are familiar with the SASS style of writing CSS, you can write SASS and compile it to CSS for all the child div elements:
@for $i from 1 through 114 {
    div:nth-child(#{$i}) {
        /* example: width: 100% / #{$i} */
    }
}

iPad/iPhone browser crashing when loading images in Javascript

I'm trying to build an image gallery in Safari that mimics the iPad photo app. It works perfectly, except that once I load more than 6MB or so worth of images either by adding them to the DOM or creating new Image objects, new images either stop loading or the browser crashes. This problem is widespread enough (with everyone else hitting up against the same limit) that I've ruled out my Javascript code as the culprit.
Given that you can stream much more than a few MB in a <video> element or through the in-browser media player, this limit seems unnecessary, and there should be some kind of workaround available. Perhaps by freeing up memory or something else.
I also came across this reference for UIWebView.
"JavaScript allocations are also limited to 10 MB. Safari raises an exception if you exceed this limit on the total memory allocation for JavaScript."
Which matches what I'm seeing fairly well. Is it possible to deallocate objects in Javascript, or does Safari/UIWebView keep a running total and never lets go? Alternately, is there any workaround to load in data another way that doesn't eat up this 10MB?
Update: I think there's an even easier way to do this, depending on your application. Instead of having multiple images, if you simply have one <img> element or Image object (or maybe two, like a 'this' image and a 'next' image if you need animations or transitions) and simply update the .src, .width, .height and so on, you should never get near the 10MB limit. If you wanted to do a carousel application, you'd have to use smaller placeholders first. You might find this technique might be easier to implement.
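For illustration, that single-element technique might look like this (a sketch; the element id and function name are hypothetical):
var viewer = document.getElementById('viewer'); // one persistent <img> in the page
function showPhoto(url, w, h) {
    viewer.width = w;
    viewer.height = h;
    viewer.src = url; // reusing one element keeps total image allocations low
}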
I think I may actually have found a work-around to this.
Basically, you'll need to do some deeper image management and explicitly shrink any image you don't need. You'd normally do this by using document.removeChild(divMyImageContainer) or $("myimagecontainer").empty() or what have you, but on Mobile Safari this does absolutely nothing; the browser simply never deallocates the memory.
Instead, you need to update the image itself so it takes up very little memory; and you can do that by changing the image's src attribute. The quickest way I know of to do that is to use a data URL. So instead of saying this:
myImage.src="/path/to/image.png"
...say this instead:
myImage.src="data:image/gif;base64,AN_ENCODED_IMAGE_DATA_STRING"
Below is a test to demonstrate it working. In my tests, my large 750KB image would eventually kill the browser and halt all JS execution. But after resetting src, I've been able to load in instances of the image over 170 times. An explanation of how the code works is below as well.
var strImagePath = "http://path/to/your/gigantic/image.jpg";
var arrImages = [];
var imgActiveImage = null;
var strNullImage = "data:image/gif;base64,R0lGODlhEAAOALMAAOazToeHh0tLS/7LZv/0jvb29t/f3//Ub//ge8WSLf/rhf/3kdbW1mxsbP//mf///yH5BAAAAAAALAAAAAAQAA4AAARe8L1Ekyky67QZ1hLnjM5UUde0ECwLJoExKcppV0aCcGCmTIHEIUEqjgaORCMxIC6e0CcguWw6aFjsVMkkIr7g77ZKPJjPZqIyd7sJAgVGoEGv2xsBxqNgYPj/gAwXEQA7";
var intTimesViewed = 1;
var divCounter = document.createElement('h1');
document.body.appendChild(divCounter);

var shrinkImages = function() {
    var imgStoredImage;
    for (var i = arrImages.length - 1; i >= 0; i--) {
        imgStoredImage = arrImages[i];
        if (imgStoredImage !== imgActiveImage) {
            imgStoredImage.src = strNullImage;
        }
    }
};
var waitAndReload = function() {
    this.onload = null;
    setTimeout(loadNextImage, 2500);
};
var loadNextImage = function() {
    var imgImage = new Image();
    imgImage.onload = waitAndReload;
    document.body.appendChild(imgImage);
    imgImage.src = strImagePath + "?" + (Math.random() * 9007199254740992);
    imgActiveImage = imgImage;
    shrinkImages();
    arrImages.push(imgImage);
    divCounter.innerHTML = intTimesViewed++;
};
loadNextImage();
This code was written to test my solution, so you'll have to figure out how to apply it to your own code. The code comes in three parts, which I will explain below, but the only really important part is imgStoredImage.src = strNullImage;
loadNextImage() simply loads a new image and calls shrinkImages(). It also assigns an onload event which is used to begin the process of loading another image (bug: I should be clearing this event later, but I'm not).
waitAndReload() is only here to allow the image time to show up on the screen. Mobile Safari is pretty slow at displaying big images, so it needs time after the image has loaded to paint the screen.
shrinkImages() goes through all previously loaded images (except the active one) and changes the .src to the dataurl address.
I'm using a file-folder image for the dataurl here (it was the first dataurl image I could find). I'm using it simply so you can see the script working. You'll probably want to use a transparent gif instead, so use this data url string instead: data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACH5BAEAAAAALAAAAAABAAEAAAICRAEAOw==
The 6.5MB (iPad) / 10MB (iPhone) download limits are calculated based on the number of image elements used to set an image through its src property. Mobile Safari doesn't seem to differentiate between images loaded from cache and via the network. It also doesn't matter whether the image is injected into the DOM or not.
The second part of the solution is that Mobile Safari seems to be able to load an unlimited number of images via the "background-image" CSS property.
This proof of concept uses a pool of precachers which set the background-image properties once the image is successfully downloaded. I know that it's not optimal and doesn't return the used Image downloader to the pool, but I'm sure you get the idea :)
The idea is adapted from Rob Laplaca's original canvas workaround http://roblaplaca.com/blog/2010/05/05/ipad-safari-image-limit-workaround/
<!DOCTYPE html>
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>iPad maximum number of images test</title>
    <script type="text/javascript">
        var precache = [
            new Image(),
            new Image(),
            new Image(),
            new Image()
        ];

        function setImage(precache, item, waiting) {
            precache.onload = function () {
                item.img.style.backgroundImage = 'url(' + item.url + ')';
                if (waiting.length > 0) {
                    setImage(precache, waiting.shift(), waiting);
                }
            };
            precache.src = item.url;
        }

        window.onload = function () {
            var total = 50,
                url = 'http://www.roblaplaca.com/examples/ipadImageLoading/1500.jpg',
                queue = [],
                versionUrl,
                imageSize = 0.5,
                mb,
                img;

            for (var i = 0; i < total; i++) {
                mb = document.createElement('div');
                mb.innerHTML = ((i + 1) * imageSize) + 'mb';
                mb.style.fontSize = '2em';
                mb.style.fontWeight = 'bold';

                img = new Image();
                img.width = 1000;
                img.height = 730;
                img.style.width = '1000px';
                img.style.height = '730px';
                img.style.display = 'block';

                document.body.appendChild(mb);
                document.body.appendChild(img);

                queue.push({
                    img: img,
                    url: url + '?ver=' + (i + +new Date())
                });
            }

            for (var p = 0; p < precache.length; p++) {
                if (queue.length > 0) {
                    setImage(precache[p], queue.shift(), queue);
                }
            }
        };
    </script>
</head>
<body>
    <p>Loading (roughly half MB) images with the <strong>img tag</strong></p>
</body>
</html>
So far I've had luck using <div> tags instead of <img> tags and setting the image as the div's background-image.
All in all, it's crazy. If the user is making an affirmative request for more image content, then there's no reason why Safari shouldn't allow you to load it.
I've had luck starting with the suggestion of Steve Simitzis, and Andrew.
My project:
PhoneGap-based app with 6 main sections, and about 45 subsections which have a jquery cycle gallery of between 2 and 7 images, each 640 x 440 (215+ images altogether). At first I was using ajax to load page fragments, but I've since switched to a one-page site, with all sections hidden until needed.
Initially, after going through about 20 galleries, I was getting memory warning 1, then 2, then the crash.
After making all the images into divs with the image applied as a background, I could get through more galleries (about 35) in the app before a crash, but after going to previously visited galleries, it would eventually fail.
The solution that seems to be working for me, is to store the background image URL in the div's title attribute, and setting all of the background images to be a blank gif. With 215+ images, I wanted to keep the url someplace in the html for sake of ease and quick reference.
When a subnavigation button is pressed, I rewrite the css background image to the correct source which is contained in the div's title tag, for ONLY the gallery that is showing. This saved me from having to do any fancy javascript to store the correct source image.
var newUrl = $(this).attr('title');
$(this).css('background-image', 'url('+newUrl+')');
When a new subnavigation button is pressed, I rewrite the background image of the last gallery divs to be blank gifs. So, aside from interface gfx, I only have 2-7 images 'active' at all times. With anything else I add that contains images, I just use this "ondemand" technique to swap the title with the background-image.
Now it seems I can use the app indefinitely with no crashes. Don't know if this will help anyone else, and it may not be the most elegant solution, but it provided a fix for me.
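For illustration, the reverse step (blanking the gallery being left) is the same idea in the other direction, using the transparent-gif data URL mentioned elsewhere on this page (lastGallery is a hypothetical reference to the previous gallery's container):
var BLANK = 'data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACH5BAEAAAAALAAAAAABAAEAAAICRAEAOw==';
$(lastGallery).find('div').css('background-image', 'url(' + BLANK + ')'); // real URLs stay in the title attributes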
On a rails app, I was lazy loading hundreds of mid-size photos (infinite scroll) and inevitably hit the 10Mb limit on the iphone. I tried loading the graphics into a canvas (new Image, src=, then Image.onload) but still hit the same limit. I also tried replacing the img src and removing it (when it went out of viewable area) but still no cigar. In the end, switching out all the img tags w/ div's w/ the photo as background did the trick.
$.ajax({
    url: "/listings/" + id + "/big",
    async: true,
    cache: true,
    success: function(data, textStatus, XMLHttpRequest) {
        // detect iOS
        if (navigator.userAgent.match(/iPhone/i) || navigator.userAgent.match(/iPod/i) || navigator.userAgent.match(/iPad/i)) {
            // load html into data
            data = $(data);
            // replace img w/ div w/ css bg
            data.find(".images img").each(function() {
                var src = $(this).attr("src").replace(/\s/g, "%20");
                var div = $("<div>");
                div.css({width: "432px", height: "288px", background: "transparent url(" + src + ") no-repeat"});
                $(this).parent().append(div);
                $(this).remove();
            });
            // remove graphic w/ dynamic dimensions
            data.find(".logo").remove();
        }
        // append element to the page
        page.append(data);
    }
});
I can now load well over 40Mb of photos on one page w/o hitting the wall. I encountered an odd issue, though, with some of the css background graphics failing to show up. A quick js thread fixed that. Set the div's css bg property every 3 sec's.
setInterval(function() {
    $(".big_box .images div.img").each(function() {
        $(this).css({background: $(this).css("background")});
    });
}, 3000);
You can see this in action at http://fotodeck.com. Check it out on your iphone/ipad.
I was unable to find a solution for this. Here are a couple of methods I tried, and all of them failed:
Simply changed the background of a DIV using div.style.backgroundImage = "url("+base64+")"
Changed the .src of an image using img.src = base64
Removed the old and added the new image using removeChild( document.getElementById("img") ); document.body.appendChild( newImg )
The same as above but with a random height on the new image
Removing and adding the image as a HTML5 canvas object. Also doesn't work, since a new Image(); has to be created, see *
On launch, created a new Image() object, let's call it container. Displayed the image as <canvas>, every time the image changed, I would change container's .src and redraw the canvas using ctx.drawImage( container, 0,0 ).
The sames as the previous, but without actually redrawing the canvas. Simply changing the Image() object's src uses up memory.
A strange thing I noticed: The bug occurs even if the image isn't displayed! For example, when doing this:
var newImg = new Image( 1024, 750 );
newImg.src = newString; // A long base64 string
Doing just that every 5 seconds (no loading or displaying of the image, of course wrapped up in an object) also exhausts the memory after some time!
I encountered an out of memory with Javascript on the iPad when we were trying to refresh an image very often, like every couple of seconds. It was a bug to refresh that often, but Safari crashed out to the home screen. Once I got the refresh timing under control, the web app functioned fine. It seemed as if the Javascript engine couldn't keep up with garbage collection quickly enough to discard all the old images.
There are issues with memory, and the way to solve this problem is very simple.
1) Put all your thumbnails in canvas. You will be creating a lot of new Image objects and drawing them into canvas, but if your thumbnails are very small you should be fine.
2) For the container where you will be displaying the real-size image, create only one Image object, reuse it, and make sure to also draw it into a canvas. So, every time a user clicks a thumbnail, you will update your main Image object.
3) Do not insert IMG tags in the page. Insert CANVAS tags instead, with the correct width and height of the thumbnails and the main display container. iPad will cry foul if you insert too many IMG tags, so avoid them and insert only canvas.
You can then find the canvas object from the page and get the context. So every time the user clicks a thumbnail, you will get the src of the main image (the real-size image) and draw it to the main canvas, reusing the main Image object and firing the events, clearing the events every time at the beginning:
mainDisplayImage.onload = null;
mainDisplayImage.onerror = null;
...
mainDisplayImage.onload = function() { ... Draw it to main canvas }
mainDisplayImage.onerror = function() { ... Draw the error.gif to main canvas }
mainDisplayImage.src = imgsrc_string_url;
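To make the drawing step concrete, here is a minimal sketch of the main-canvas update (the element id and function name are illustrative):
var mainCanvas = document.getElementById('mainDisplay'); // a <canvas> already in the page
var ctx = mainCanvas.getContext('2d');
var mainDisplayImage = new Image(); // the single, reused Image object

function showHero(imgsrc_string_url) {
    mainDisplayImage.onload = function() {
        ctx.clearRect(0, 0, mainCanvas.width, mainCanvas.height);
        ctx.drawImage(mainDisplayImage, 0, 0, mainCanvas.width, mainCanvas.height);
    };
    mainDisplayImage.src = imgsrc_string_url;
}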
I have created 200 thumbnails, and each is about 15kb. The real images are about 1MB each.
I also had similar problems while rendering large lists of images on iPhones.
In my case displaying even 50 images in the list was enough to either crash the browser or occasionally the entire operating system. For some reason any images rendered onto the page weren't garbage collected, even when pooling and recycling just a few onscreen DOM elements or using the images as background-image property. Even displaying the images directly as Data-URIs is enough to count towards the limit.
The solution ended up being rather simple: using position: absolute on the list items allows them to be garbage collected fast enough to not run into a memory limit. This still involved having only about 20-30 images in the DOM at any moment; creating and removing the items' DOM nodes by scroll position finally did the trick.
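A sketch of that scroll-position windowing (ITEM_H, BUFFER, and syncDom are illustrative names):
var ITEM_H = 100; // fixed height of each absolutely positioned list item
var BUFFER = 10;  // extra items kept above and below the viewport
window.addEventListener('scroll', function() {
    var first = Math.max(0, Math.floor(window.pageYOffset / ITEM_H) - BUFFER);
    var last = first + Math.ceil(window.innerHeight / ITEM_H) + 2 * BUFFER;
    syncDom(first, last); // hypothetical: create nodes in [first, last], remove the rest
});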
It seems to be particularly dependent on having -webkit-transform: scale3d() applied to any ancestor of the images in the DOM. Relatively flowing a very tall DOM and rendering it on the GPU provokes a memory leak in the WebKit renderer, I guess?
I'm running in a similar issue in Chrome too, developing an extension that loads images in the same page (the popup, actually) replacing old images with new ones.
The memory used by the old images (removed from the DOM) is never freed, consuming all the PC memory in a short time.
Have tried various tricks with CSS, without success.
Using hardware with less memory than a PC, like the iPad, this problem arises earlier, naturally.
I filed a bug with jQuery, as jQuery tries to handle memory leaks... so I'd consider this a bug. Hopefully the team can come up with some concise and clever way of handling this problem in Mobile Safari soon.
http://dev.jquery.com/ticket/6944#preview
