If you have, let's say, a 3MB image in an img tag, it will take a few seconds to load. While the image is loading, the browser sort of "prints" it - it shows the top part first, then the middle, then the bottom. How do I prevent this from happening?
I'd rather have the image hidden and then shown a second or two later, once it is fully loaded.
One way would be to give them a class that gives them opacity: 0 so they don't show:
<img src="/path/to/image" class="loading">
And in CSS:
.loading {
    opacity: 0;
}
In head, we override that if JavaScript is disabled (so we're not unfriendly to non-JavaScript visitors):
<noscript>
    <style>
        .loading {
            opacity: 1;
        }
    </style>
</noscript>
...and then in code at the bottom of your page, find all your images and remove the class when they've loaded, and...(see comments):
(function() {
    // Get an array of the images
    var images = Array.prototype.slice.call(document.querySelectorAll("img.loading"));
    // Hook their load and error events, even though they may have already fired
    images.forEach(function(image) {
        image.addEventListener("load", imageDone.bind(null, image));
        image.addEventListener("error", imageDone.bind(null, image)); // Could handle errors differently
    });
    // Check to see if any images are already complete
    checkImages();
    function imageDone(img) {
        img.classList.remove("loading");
        images = images.filter(function(entry) { return entry !== img; });
    }
    function checkImages() {
        images.forEach(function(image) {
            if (image.complete) {
                imageDone(image);
            }
        });
        if (images.length) {
            // Check back in a second
            setTimeout(checkImages, 1000);
        }
    }
})();
That's a belt-and-braces approach. It proactively checks whether images have finished loading, and also reactively handles the load and error events of images. In theory we shouldn't need the setTimeout, and you might do your testing without it, but it's a cheap safety net.
Notice how once an image is complete, we remove the class so it's visible.
Old school:
To avoid the partial display of an image as it renders, save your large images as progressive, rather than baseline, JPEGs.
A progressive JPEG renders as a series of scans of increasing quality.
A baseline JPEG renders top to bottom (what you described as “printing”).
The progressive option is considered more user-friendly than either the sudden appearance of the image or the slow top-to-bottom rendering you dislike. The progressive variant can even be smaller than its baseline counterpart.
For more about this read: The Return of the Progressive JPEG.
I think everyone here gave you some good answers, and I just want to add to them. 3MB is fairly big for a web image. Don't use something that large for a logo or layout graphic. That much pixel data only makes sense when you are loading a nice, large-scale real-life image whose quality you want to preserve (or providing a download of a high-quality graphic). Beyond that, a Google search turns up tons of solutions for loading images; for larger images, a jQuery/AJAX solution is a nice option.
Related
I searched many similar questions on SO, but none seems to work for my situation.
I have several different background images. For now, I have manually created a gradient based on each background image and set it as the default, and I load the real background image over the default gradient after 3 seconds.
But I don't want this kind of approach. I want to display the images only if the user has a good connection. Otherwise, don't display the image at all and just leave that background's gradient in place.
Code example: CSS
.test1 {
    background-image: -webkit-gradient(linear, 0 0, 86 148, color-stop(0.011, #898568));
    background-image: -webkit-linear-gradient(300.1600608380871deg, #898568 1.1%);
    background-image: -moz-linear-gradient(300.1600608380871deg, #898568 1.1%);
    background-image: -o-linear-gradient(300.1600608380871deg, #898568 1.1%);
    background-image: linear-gradient(149.83993916191292deg, #898568 1.1%);
}
.test1-img {
    background-image: url("img1.jpg");
}
Code Sample: JS
setTimeout(function() {
    $('#test1-div').toggleClass('test1-img');
}, 3000);
So is it possible to do this in jQuery or plain JS? Or is there any plugin for this?
PS: I don't want a lazy-loading approach, because that loads images once they come into the viewport even if the user has a low-speed connection. I don't want to display the background image at all on a slow connection.
PS2: I am new to JS, so working solutions with explanations would help me understand.
If you're OK with attempting to load a single image, you could try something like the following:
window.speedyConnection = false;
var image = document.createElement("img");
image.onload = function () {
    window.speedyConnection = true;
    // Load your images here
};
image.src = "img1.jpg";
setTimeout(function () {
    if (!window.speedyConnection) {
        image.remove();
    }
}, 750);
The idea being that you attempt to load the first image, but if it doesn't load in 750ms (or whatever timeout you prefer) you assume the user has a bad connection and stop loading it. If it does load, you go ahead and load the rest of the images.
This would, however, require at least sending a request for an image and receiving some data (even on slow connections), but it would only be for a single image and you would cancel it after the timeout.
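For completeness, here is a rough sketch of how that probe could be wired into the markup from the question; the #test1-div element, the test1-img class and img1.jpg are taken from the question's code, and the 750ms threshold is just an example:
var probeLoaded = false;
var probe = new Image();
probe.onload = function () {
    probeLoaded = true;
    // Connection was fast enough: reveal the real background image.
    $('#test1-div').addClass('test1-img');
};
probe.src = 'img1.jpg';
setTimeout(function () {
    if (!probeLoaded) {
        // Too slow: drop the handler so the class is never added and the gradient stays.
        probe.onload = null;
    }
}, 750);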
Try navigator.connection.effectiveType; it gives you pretty much what you want. It reports 'slow-2g', '2g', '3g' or '4g'. Check browser compatibility too and see if you're okay with it:
https://developer.mozilla.org/en-US/docs/Web/API/NetworkInformation
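Here is a minimal sketch of how that could drive the same class toggle used in the question; treating only '4g' as a "good" connection is just one possible choice:
var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;
if (connection && connection.effectiveType === '4g') {
    // Connection looks good: swap the gradient for the real background image.
    $('#test1-div').addClass('test1-img');
}
// On slower connections (or in browsers without the API) the gradient stays.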
I'm in the process of developing a 'flipbook-style' animation using Skrollr by triggering background image changes when the user scrolls to indicated positions on the page. The issue I'm having is that in the browser the image changes are delayed, creating what can only be described as a 'flicker' of white between the frames.
<div class="section" style="background: url('frame1.png')"
data-560-top="background-image:!url('frame1.png');"
data-440-top="background-image:!url('frame2.png');">
The HTML is simple; it basically states that at 560 pixels from the top of the div (in relation to the browser window), the background should be at frame 1, then as the user scrolls closer to the div (440 pixels from the top of the div) the background image changes to frame 2. I plan to use up to around 20 frames and the images are quite large.
I have created a JSBin here which includes a very simplified sample with images from placehold.it. This includes the Skrollr script and an example layout of a section of my project. The key difference being that the images in my project are of much larger scale.
(function($) {
    var cache = [];
    // Arguments are image paths relative to the current page.
    $.preLoadImages = function() {
        var args_len = arguments.length;
        for (var i = args_len; i--;) {
            var cacheImage = document.createElement('img');
            cacheImage.src = arguments[i];
            cache.push(cacheImage);
        }
    };
})(jQuery);

jQuery.preLoadImages(
    'http://www.placehold.it/300x200.png',
    'http://www.placehold.it/300x200.png'
);
The above snippet seems to be working in Chrome, however the flicker issue remains in Firefox. Based on my research, Firefox seems to handle cached images differently from Chrome (e.g. when an image is not considered needed by Firefox at a given time, it is trashed?).
I would like to know how I could force all browsers to preload the images efficiently, to avoid the background image flicker when the image changes. I am still quite new to JavaScript/jQuery.
I hope I have provided a clear explanation. All assistance appreciated.
Dan
You can preload images using CSS only, no need for JS. Check out this article for more info. Another interesting way to do it is in the comment section of the article: you assign the background image to a pseudo-element so that it is cached and ready to be used whenever you need it. See this code for an example:
#something:before {
    content: url("./img.jpg");
    width: 0;
    height: 0;
    visibility: hidden;
}
I am a skilled database / application programmer for the PC. I am also an ignorant html / javascript / web programmer.
I am creating some documentation about some .Net assemblies for our intranet. Ideally I would like to display an image full size if the browser window can fit it. If not then I would like to reduce it and toggle between a small version and full size version by a click. It is a dependency chart and can be different sizes for different pages. I would prefer a single function to handle this but being it is for our use none of the requirements I mentioned is set in stone. I would like to make it work well but nothing is mandatory.
I read a lot of stuff but couldn't find anything that matched what I wanted. First I tried this (after a few iterations):
<img src='Dependancy Charts/RotairAORFQ.png' width='100%' onclick='this.src="Dependancy Charts/RotairAORFQ.png";this.width=this.naturalWidth;this.height=this.naturalHeight;' ondblclick='this.src="Dependancy Charts/RotairAORFQ.png";this.width="100%";'>
It has problems. First, it enlarges a small image and it looks funny. Second, I would have to put the code in every page. Third, it requires a double click to restore it. I was going to live with those shortcomings, but the double click fails: I can't figure out how to restore it.
So I tried to get fancy. I couldn't figure out how to get past problem 1, but solved 2 and 3 by creating a function in a separate file. Then I ran into what appeared to be the same problem. This was my second attempt:
function ImageToggle(Image)
{
    if (ImageToggle.FullSize == 'undefined')
        ImageToggle.FullSize = false;
    if (ImageToggle.FullSize)
    {
        Image.width = '100%';
        ImageToggle.FullSize = false;
    }
    else
    {
        Image.width = Image.naturalWidth;
        ImageToggle.FullSize = true;
    }
    return 0;
}
And in my page:
<img src='Dependancy Charts/RotairAORFQ.png' width='100%' onclick='ImageToggle(this)'>
Can what I want be done? It doesn't sound impossible. If it is a large amount of effort would be required then alternate suggestions are acceptable.
You're probably interested in the max-width: 100% CSS property, rather than a flat-out width:100%. If you have a tiny image, it'll stay tiny. If you have a huge image, it gets resized to the width of the containing element.
For example: http://jsbin.com/kabepo/1/edit uses a small and a huge image, both with max-width:100%. As you can see, the small image is untouched, the huge image is resized to something sensible.
I would recommend that you set the max-width: 100% CSS property for the image.
This will prevent the image's width from expanding to be greater than the container's width.
You can also do the same with max-height: 100% if you are having problems with the image overflowing vertically.
Please see this JSFiddle for an example.
(Note: If you set both a width and a height attribute on the <img> tag directly or in your CSS file your image will not be scaled proportionally.)
Does it have to be a toggle or would a mouseover work for you as well?
<style>
    .FullSize { width: 100px; height: auto; }
    .FullSize:hover { width: 90%; height: auto; }
</style>
<img src="Dependancy Charts/RotairAORFQ.png" class="FullSize">
Note: when the image is made larger IN the page, the surrounding content will be displaced around it, depending on how you have set up the layout.
Also, if you have any body margins or table or div paddings, using an image width of 100% will make the page scroll. To check, just change 90% to 100% and work your way up or down.
You could also force the image to be a specific size until the browser gets made smaller by the user / has a smaller resolution.
<style>
    .FullSize { width: 1000px; max-width: 100%; height: auto; }
</style>
<img src="Dependancy Charts/RotairAORFQ.png" class="FullSize">
A tip: use the largest image you have - say a minimum width of 1200 pixels (if that is the forced image size you use). That way, regardless of the size it is displayed at, it will stay clearer than a small image scaled up to a large one. Since it's an intranet, file size shouldn't be an issue.
Thanks all for your help. Rob and Mike both pointed me to an excellent solution. I now have my page load with an image that fits the browser window, resizes with the browser and if the user is interested they can expand the image and scrollbars appear if necessary. I got this to work in a function so minimal code is needed for each page.
To load the image:
<p style="overflow:auto;">
    <img src='Dependancy Charts/RotairAORFQ.png' width="100%" onclick='ImageToggle(this)'>
</p>
And the function:
function ImageToggle(Image)
{
    if (typeof ImageToggle.FullSize == 'undefined')
        ImageToggle.FullSize = false;
    if (ImageToggle.FullSize)
    {
        Image.style = "max-width: 100%";
        ImageToggle.FullSize = false;
    }
    else
    {
        Image.style = "max-width: none";
        Image.width = Image.naturalWidth;
        ImageToggle.FullSize = true;
    }
    return 1;
}
If you want to get the current browser window size, and you want to do it on a click event, try this in jQuery or JavaScript:
<script>
    $("#myButton").click(function(){
        var x = window.innerHeight; // put current window size in x (ie. 400)
    });
</script>
I'm applying a repeated background image from a canvas to a div via javascript like this:
var img_canvas = document.createElement('canvas');
img_canvas.width = 16;
img_canvas.height = 16;
img_canvas.getContext('2d').drawImage(canvas, 0, 0, 16, 16);
var img = img_canvas.toDataURL("image/png");
document.querySelector('#div').style.backgroundImage = 'url(' + img + ')';
I have to update it quite frequently. The problem is it flickers upon change, it doesn't appear to happen in Chrome but it's really bad in Firefox and Safari. Is it possible to stop this? I didn't think it would happen since it's a dataurl and therefore doesn't need to be downloaded.
Solution:
// create a new Image object
var img_tag = new Image();
// when preload is complete, apply the image to the div
img_tag.onload = function() {
    document.querySelector('#div').style.backgroundImage = 'url(' + img + ')';
};
// setting 'src' actually starts the preload
img_tag.src = img;
Try preloading the image resource by including the image in the DOM, as in the following HTML code. The flicker may come up because the image resource still needs to be loaded, which takes some time.
<img src="imageToPreload.png" style="display:none;" alt="" />
You may also prefer to use sprite images. By using sprites, your application needs fewer HTTP requests to load all resources into your page. Also add the following CSS styles if you are using CSS animations; they will prevent background flickering on mobile devices:
-webkit-backface-visibility: hidden;
-moz-backface-visibility: hidden;
-ms-backface-visibility: hidden;
Preload your image like this; there is no need to include an <img> with display: none:
<link rel="preload" href="/images/bg-min.png" as="image">
Try adding this css to your background element:
-webkit-backface-visibility: hidden;
-moz-backface-visibility: hidden;
-ms-backface-visibility: hidden;
It should help with flickering..
You can also "force" hardware acceleration by adding this to your background element:
-webkit-transform: translate3d(0, 0, 0);
Another option is to use an image instead of a DIV and change only the image URL.
I struggled with this for a bit, tried preloading, appending the image to the document, etc.
In the end, I resaved the JPEG without the "Progressive" option.
That fixed the rolling flicker when the img src was swapped.
In my case, changing height: 1080px; (the background height) to height: fit-content; fixed it.
I think preloading all the images is essential in any case. What I found is that browsers behave differently when the background image is changed dynamically: in Firefox, for example, it flickers when the change is frequent, whereas in Chrome and Safari it doesn't.
The best solution I came up with so far is drawing the image inside a child canvas that fills the space of the whole parent div.
In all cases, the images you are using must be optimized as it affects the rendering performance.
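As a rough illustration of that child-canvas idea (the #div selector and the img_canvas source are taken from the question's code; the rest is an illustrative sketch, not a drop-in fix):
var parent = document.querySelector('#div');
var child = document.createElement('canvas');
child.width = parent.clientWidth;
child.height = parent.clientHeight;
parent.appendChild(child);
var ctx = child.getContext('2d');

function redraw() {
    // Tile the small source canvas across the child canvas instead of swapping
    // background-image data URLs, which is what flickers in Firefox and Safari.
    ctx.fillStyle = ctx.createPattern(img_canvas, 'repeat');
    ctx.fillRect(0, 0, child.width, child.height);
}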
My JavaScript code that works now looks like this:
const pic = new Image();
const pic2 = new Image();
pic.src="../images/settings_referrals_anim.gif";
pic2.src="../images/settings_referrals_still.png";
I don't actually reference those Image objects in the code that sets the background; for example, I use
document.querySelector(".button_Settings_referrals").addEventListener("mouseover", function() {
    myDiv.style.backgroundImage = "url('../images/settings_referrals_anim.gif')";
});
But it seems to work. If I replace the long URL with const pic, for example, it doesn't work; yet as long as I declare the Image objects and assign their locations up front, the flickering stops.
This does not address all of the specifics noted by the OP, but might be useful for others. Tested in Chrome 97, Firefox 96, Android 11, iOS 15.
I have a div that includes these CSS parameters...
#div_image {
    background-image: url( [Path to low-res image] );
    background-size: cover;
}
I have a corresponding class that looks like this...
.div_image_highres {
    background-image: none !important;
}
The corresponding class has a pseudo-element defined as follows:
.div_image_highres::before {
    position: absolute;
    left: 0;
    top: 0;
    width: 100%;
    height: 100%;
    content: " ";
    background-image: url( [Path to highres image] );
    background-repeat: no-repeat;
    background-position: 50% 0;
    background-size: cover;
    opacity: 1;
    display: block;
}
I have an img element that also points to the high-res image...
<img id="img_highres_preload" src=" [Path to high-res image ] ">
The img element has a corresponding style which allows the image to be displayed (ensuring that image file loads) but not seen...
#img_highres_preload {
    width: 1px;
    height: 1px;
}
Two notes: (1) I realize a lot of people use other methods of pre-loading (e.g., programmatically), but I have a personal preference for this method. (2) See the addendum about the reliability of the load event.
Last but not least, I have some Javascript (jQuery) that adds the "high-res" class to "div_image" once the high-res file is loaded...
$(document).ready(function() {
    $("#img_highres_preload").off().on("load", function() {
        $("#div_image").addClass("div_image_highres");
    });
});
This could easily be vanilla JS, but since I use jQuery throughout my code, I like having a consistency.
Here's a summary of what's happening...
Presumably, the low-res image is loaded first and becomes the background image for the div. Even if that does not occur, everything will work as intended (i.e., the high-res image will be displayed).
When the high-res image loads into the img element (i.e., Javascript confirms that the high-res file is loaded), the "div_image_highres" class is applied to "div_image".
As result, the div switches to the high-res image without flashing. In my experience, if anything, it shifts a little to the left; but that often doesn't occur and, if it does, it's not inelegant.
And here's the primary reason I use this approach when required: In my application, there are multiple panels the user can navigate, which results in one panel sliding out of view and the new one into view. If I don't use a pseudo-element (as described above) for displaying a high-res image, the image flickers when its div is hidden and re-displayed. With the above-described technique, I can slide the div in and out of view without any flickering.
Regarding the Load Event
You can't depend on the load event firing. For instance, it typically does not fire when the browser has cached an image. So to make a long post even longer, here's the enhancement I have in my code to accommodate that reality...
I modify the document.ready event (shown above) to look like this:
$(document).ready(function() {
    positionOnPage(true);
    $("#img_highres_preload").off().on("load", function() {
        checkImage();
    });
});
checkImage = function() {
    var image = $("#img_highres_preload")[0];
    if (!image.complete || (typeof image.naturalWidth != "undefined" && image.naturalWidth == 0)) {
        console.log("Waiting for high-res image.");
    }
    else if (!$("#div_home").hasClass("div_home_highres")) {
        $("#div_home").addClass("div_home_highres");
        $("#img_highres_preload").remove();
    }
};
The checkImage function examines the image element to see whether an image has in fact been loaded. In this code example, it is a little redundant — that is, the img element has confirmed the load, so there's usually no need to check it (unless there is some reason to believe the file is being misloaded).
I might do it as shown because I also call checkImage from other places in my code, so if I have more of a programmatic response (unlike the simple version shown), I want all of that code in the same place and written just once. The checkImage function might be called when triggered by a timer or when the section displaying the intended image is about to be displayed. Perhaps something like this...
if (sectionName == "[whatever]" && $("#img_highres_preload").length === 1) {
    checkImage();
}
In this example, I look for the presence of the preload img element because I know that my previous function removes the element after it has fulfilled its purpose.
This post has a stripped-down version to illustrate the concept. As written above, it only accommodates a single known img element, so the code could be extended to call checkImage with some parameters (e.g., the name of an image or the element itself) and checkImage could look for the existence of the preload element, so that check occurs in one place. It can be fairly fancy, so I went with the simplest example for this post.
In many cases, this stripped-down version is all I need because typically I only use a high-res photo for a window background image. I either start with the display of a low-res image and switch it out as soon as the high-res file is loaded, or I have some animation that gets triggered after I confirm the presence of the high-res image.
A good case for a more generalized version is when I need a series of images loaded at the outset and don't want to start until all of them are ready. In those cases, the web page might begin with some welcome text that stays displayed until all images have been confirmed.
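For reference, here is one way that parameterized version might look; the selector argument and the onReady callback are illustrative additions rather than my actual code:
checkImage = function(imgSelector, onReady) {
    var $img = $(imgSelector);
    if ($img.length === 0) {
        return; // already removed after a previous successful check
    }
    var image = $img[0];
    if (!image.complete || (typeof image.naturalWidth != "undefined" && image.naturalWidth == 0)) {
        console.log("Waiting for image: " + imgSelector);
    }
    else {
        $img.remove();
        onReady();
    }
};
Called as, say, checkImage("#img_highres_preload", function() { $("#div_image").addClass("div_image_highres"); }), it can also be reused for a whole set of images by counting how many onReady callbacks have fired before kicking things off.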
Hey guys, I know this is an older question, but if you are still seeing flicker after all this, you can simply put the final version behind your background div. The flicker is you seeing behind the image you currently have, so if what's behind it is the final image, the change will look smooth.
I'm trying to build an image gallery in Safari that mimics the iPad photo app. It works perfectly, except that once I load more than 6MB or so worth of images either by adding them to the DOM or creating new Image objects, new images either stop loading or the browser crashes. This problem is widespread enough (with everyone else hitting up against the same limit) that I've ruled out my Javascript code as the culprit.
Given that you can stream much more than a few MB in a <video> element or through the in-browser media player, this limit seems unnecessary, and there should be some kind of workaround available, perhaps by freeing up memory or something else.
I also came across this reference for UIWebView.
"JavaScript allocations are also limited to 10 MB. Safari raises an exception if you exceed this limit on the total memory allocation for JavaScript."
Which matches what I'm seeing fairly well. Is it possible to deallocate objects in Javascript, or does Safari/UIWebView keep a running total and never lets go? Alternately, is there any workaround to load in data another way that doesn't eat up this 10MB?
Update: I think there's an even easier way to do this, depending on your application. Instead of having multiple images, if you simply have one <img> element or Image object (or maybe two, like a 'this' image and a 'next' image if you need animations or transitions) and simply update the .src, .width, .height and so on, you should never get near the 10MB limit. If you wanted to do a carousel application, you'd have to use smaller placeholders first. You might find this technique easier to implement.
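A minimal sketch of that single-element idea, where the element id and photo paths are placeholders:
var viewer = document.getElementById('viewer'); // one persistent <img> element
var photos = ['photo1.jpg', 'photo2.jpg', 'photo3.jpg']; // placeholder paths
var current = 0;

function showPhoto(index) {
    current = index;
    // Reusing the same element keeps the total image allocation roughly constant,
    // so the ~10MB budget should never be approached.
    viewer.src = photos[index];
}

function showNext() {
    showPhoto((current + 1) % photos.length);
}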
I think I may actually have found a work-around to this.
Basically, you'll need to do some deeper image management and explicitly shrink any image you don't need. You'd normally do this by using document.removeChild(divMyImageContainer) or $("myimagecontainer").empty() or what have you, but on Mobile Safari this does absolutely nothing; the browser simply never deallocates the memory.
Instead, you need to update the image itself so it takes up very little memory; and you can do that by changing the image's src attribute. The quickest way I know of to do that is to use a data URL. So instead of saying this:
myImage.src="/path/to/image.png"
...say this instead:
myImage.src="data:image/gif;base64,AN_ENCODED_IMAGE_DATA_STRING"
Below is a test to demonstrate it working. In my tests, my large 750KB image would eventually kill the browser and halt all JS execution. But after resetting src, I've been able to load in instances of the image over 170 times. An explanation of how the code works is below as well.
var strImagePath = "http://path/to/your/gigantic/image.jpg";
var arrImages = [];
var imgActiveImage = null;
var strNullImage = "data:image/gif;base64,R0lGODlhEAAOALMAAOazToeHh0tLS/7LZv/0jvb29t/f3//Ub//ge8WSLf/rhf/3kdbW1mxsbP//mf///yH5BAAAAAAALAAAAAAQAA4AAARe8L1Ekyky67QZ1hLnjM5UUde0ECwLJoExKcppV0aCcGCmTIHEIUEqjgaORCMxIC6e0CcguWw6aFjsVMkkIr7g77ZKPJjPZqIyd7sJAgVGoEGv2xsBxqNgYPj/gAwXEQA7";
var intTimesViewed = 1;
var divCounter = document.createElement('h1');
document.body.appendChild(divCounter);

var shrinkImages = function() {
    var imgStoredImage;
    for (var i = arrImages.length - 1; i >= 0; i--) {
        imgStoredImage = arrImages[i];
        if (imgStoredImage !== imgActiveImage) {
            imgStoredImage.src = strNullImage;
        }
    }
};
var waitAndReload = function() {
    this.onload = null;
    setTimeout(loadNextImage, 2500);
};
var loadNextImage = function() {
    var imgImage = new Image();
    imgImage.onload = waitAndReload;
    document.body.appendChild(imgImage);
    imgImage.src = strImagePath + "?" + (Math.random() * 9007199254740992);
    imgActiveImage = imgImage;
    shrinkImages();
    arrImages.push(imgImage);
    divCounter.innerHTML = intTimesViewed++;
};
loadNextImage();
This code was written to test my solution, so you'll have to figure out how to apply it to your own code. The code comes in three parts, which I will explain below, but the only really important part is imgStoredImage.src = strNullImage;
loadNextImage() simply loads a new image and calls shrinkImages(). It also assigns an onload event which is used to begin the process of loading another image (bug: I should be clearing this event later, but I'm not).
waitAndReload() is only here to allow the image time to show up on the screen. Mobile Safari is pretty slow and displaying big images, so it needs time after the image has loaded to paint the screen.
shrinkImages() goes through all previously loaded images (except the active one) and changes the .src to the dataurl address.
I'm using a file-folder image for the dataurl here (it was the first dataurl image I could find). I'm using it simply so you can see the script working. You'll probably want to use a transparent gif instead, so use this data url string instead: data:image/gif;base64,R0lGODlhAQABAIAAAP///wAAACH5BAEAAAAALAAAAAABAAEAAAICRAEAOw==
The 6.5MB (iPad) / 10MB (iPhone) download limits are calculated based on the number of image elements used to set an image through their src property. Mobile Safari doesn't seem to differentiate between images loaded from cache and images loaded via the network. It also doesn't matter whether the image is injected into the DOM or not.
The second part of the solution is that Mobile Safari seems to be able to load an unlimited number of images via the background-image CSS property.
This proof of concept uses a pool of precachers which set the background-image properties once the download succeeds. I know that it's not optimal and doesn't return the used Image downloader to the pool, but I'm sure you get the idea :)
The idea is adapted from Rob Laplaca's original canvas workaround http://roblaplaca.com/blog/2010/05/05/ipad-safari-image-limit-workaround/
<!DOCTYPE html>
<html>
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>iPad maximum number of images test</title>
    <script type="text/javascript">
        var precache = [
            new Image(),
            new Image(),
            new Image(),
            new Image()
        ];

        function setImage(precache, item, waiting) {
            precache.onload = function () {
                item.img.style.backgroundImage = 'url(' + item.url + ')';
                if (waiting.length > 0) {
                    setImage(precache, waiting.shift(), waiting);
                }
            };
            precache.src = item.url;
        }

        window.onload = function () {
            var total = 50,
                url = 'http://www.roblaplaca.com/examples/ipadImageLoading/1500.jpg',
                queue = [],
                versionUrl,
                imageSize = 0.5,
                mb,
                img;

            for (var i = 0; i < total; i++) {
                mb = document.createElement('div');
                mb.innerHTML = ((i + 1) * imageSize) + 'mb';
                mb.style.fontSize = '2em';
                mb.style.fontWeight = 'bold';

                img = new Image();
                img.width = 1000;
                img.height = 730;
                img.style.width = '1000px';
                img.style.height = '730px';
                img.style.display = 'block';

                document.body.appendChild(mb);
                document.body.appendChild(img);

                queue.push({
                    img: img,
                    url: url + '?ver=' + (i + +new Date())
                });
            }

            for (var p = 0; p < precache.length; p++) {
                if (queue.length > 0) {
                    setImage(precache[p], queue.shift(), queue);
                }
            }
        };
    </script>
</head>
<body>
    <p>Loading (roughly half MB) images with the <strong>img tag</strong></p>
</body>
</html>
So far I've had luck using <div> tags instead of <img> tags and setting the image as the div's background-image.
All in all, it's crazy. If the user is making an affirmative request for more image content, then there's no reason why Safari shouldn't allow you to load it.
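A minimal sketch of the div-instead-of-img approach, with a placeholder size and path:
var holder = document.createElement('div');
holder.style.width = '1000px';
holder.style.height = '730px';
holder.style.backgroundImage = 'url(photo1.jpg)'; // placeholder path
document.body.appendChild(holder);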
I've had luck starting with the suggestion of Steve Simitzis, and Andrew.
My project:
PhoneGap-based app with 6 main sections, and about 45 subsections which have a jquery cycle gallery of between 2 and 7 images, each 640 x 440 (215+ images altogether). At first I was using ajax to load page fragments, but I've since switched to a one-page site, with all sections hidden until needed.
Initially, after going through about 20 galleries, I was getting memory warning 1, then 2, then the crash.
After making all the images into divs with the image applied as a background, I could get through more galleries (about 35) in the app before a crash, but after going to previously visited galleries, it would eventually fail.
The solution that seems to be working for me, is to store the background image URL in the div's title attribute, and setting all of the background images to be a blank gif. With 215+ images, I wanted to keep the url someplace in the html for sake of ease and quick reference.
When a subnavigation button is pressed, I rewrite the CSS background image to the correct source, which is contained in the div's title attribute, for ONLY the gallery that is showing. This saves me from having to do any fancy JavaScript to store the correct source image.
var newUrl = $(this).attr('title');
$(this).css('background-image', 'url('+newUrl+')');
When a new subnavigation button is pressed, I rewrite the background image of the last gallery divs to be blank gifs. So, aside from interface gfx, I only have 2-7 images 'active' at all times. With anything else I add that contains images, I just use this "ondemand" technique to swap the title with the background-image.
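Roughly, the "blank out the previous gallery" step looks something like this; the lastGallery reference and the blank.gif path are illustrative:
// lastGallery is assumed to point at the gallery that was previously visible
$(lastGallery).find('div').each(function() {
    // The real URL stays safe in the title attribute for later reuse.
    $(this).css('background-image', 'url(blank.gif)');
});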
Now it seems I can use the app indefinitely with no crashes. Don't know if this will help anyone else, and it may not be the most elegant solution, but it provided a fix for me.
On a rails app, I was lazy loading hundreds of mid-size photos (infinite scroll) and inevitably hit the 10Mb limit on the iphone. I tried loading the graphics into a canvas (new Image, src=, then Image.onload) but still hit the same limit. I also tried replacing the img src and removing it (when it went out of viewable area) but still no cigar. In the end, switching out all the img tags w/ div's w/ the photo as background did the trick.
$.ajax({
    url: "/listings/" + id + "/big",
    async: true,
    cache: true,
    success: function(data, textStatus, XMLHttpRequest) {
        // detect iOS
        if (navigator.userAgent.match(/iPhone/i) || navigator.userAgent.match(/iPod/i) || navigator.userAgent.match(/iPad/i)) {
            // load html into data
            data = $(data);
            // replace img w/ div w/ css bg
            data.find(".images img").each(function() {
                var src = $(this).attr("src").replace(/\s/g, "%20");
                var div = $("<div>");
                div.css({width: "432px", height: "288px", background: "transparent url(" + src + ") no-repeat"});
                $(this).parent().append(div);
                $(this).remove();
            });
            // remove graphic w/ dynamic dimensions
            data.find(".logo").remove();
        }
        // append element to the page
        page.append(data);
    }
});
I can now load well over 40MB of photos on one page without hitting the wall. I did encounter an odd issue, though, with some of the CSS background graphics failing to show up. A quick JS timer fixed that: set each div's CSS background property every 3 seconds.
setInterval(function() {
    $(".big_box .images div.img").each(function() {
        $(this).css({background: $(this).css("background")});
    });
}, 3000);
You can see this in action at http://fotodeck.com. Check it out on your iphone/ipad.
I was unable to find a solution for this. Here are a couple of methods I tried, and all of them failed:
Simply changed the background of a DIV using div.style.backgroundImage = "url("+base64+")"
Changed the .src of an image using img.src = base64
Removed the old and added the new image using removeChild( document.getElementById("img") ); document.body.appendChild( newImg )
The same as above but with a random height on the new image
Removing and adding the image as a HTML5 canvas object. Also doesn't work, since a new Image(); has to be created, see *
On launch, created a new Image() object, let's call it container. Displayed the image as <canvas>, every time the image changed, I would change container's .src and redraw the canvas using ctx.drawImage( container, 0,0 ).
The same as the previous, but without actually redrawing the canvas. Simply changing the Image() object's src uses up memory.
A strange thing I noticed: The bug occurs even if the image isn't displayed! For example, when doing this:
var newImg = new Image( 1024, 750 );
newImg.src = newString; // A long base64 string
every 5 seconds - and nothing else: no loading or displaying of the image, of course wrapped up in an object - still exhausts the memory after some time!
I encountered an out of memory with Javascript on the iPad when we were trying to refresh an image very often, like every couple of seconds. It was a bug to refresh that often, but Safari crashed out to the home screen. Once I got the refresh timing under control, the web app functioned fine. It seemed as if the Javascript engine couldn't keep up with garbage collection quickly enough to discard all the old images.
There are issues with memory, and the way to solve this problem is fairly simple.
Put all your thumbnails in canvases. You will be creating a lot of new Image objects and drawing them into canvases, but if your thumbnails are very small you should be fine. For the container where you will be displaying the real-size image, create only one Image object, reuse that object, and make sure to also draw it into a canvas. So, every time a user clicks a thumbnail, you will update your main Image object.
Do not insert IMG tags in the page. Insert CANVAS tags instead, with the correct width and height for the thumbnails and the main display container. iPad will cry foul if you insert too many IMG tags, so avoid them and insert only canvases. You can then find the canvas object in the page and get its context.
So every time the user clicks a thumbnail, you get the src of the main image (the real-size image) and draw it to the main canvas, reusing the main Image object and wiring up its events - clearing the events at the beginning each time.
mainDisplayImage.onload = null;
mainDisplayImage.onerror = null;
...
mainDisplayImage.onload = function() { ... Draw it to main canvas }
mainDisplayImage.onerror = function() { ... Draw the error.gif to main canvas }
mainDisplayImage.src = imgsrc_string_url;
I have created 200 thumbnails, and each is about 15KB. The real images are about 1MB each.
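To make those schematic handlers concrete, here is a rough sketch of a thumbnail-click handler along these lines; the canvas id, the error image handling and the scaling are illustrative, not lifted from my actual code:
var mainCanvas = document.getElementById('main-canvas');
var mainCtx = mainCanvas.getContext('2d');
var mainDisplayImage = new Image(); // created once and reused for every photo
var errorImage = new Image();
errorImage.src = 'error.gif'; // assumed to be loaded by the time it is needed

function showFullSize(imgsrc_string_url) {
    // Clear the previous handlers before wiring up new ones.
    mainDisplayImage.onload = null;
    mainDisplayImage.onerror = null;

    mainDisplayImage.onload = function() {
        mainCtx.clearRect(0, 0, mainCanvas.width, mainCanvas.height);
        mainCtx.drawImage(mainDisplayImage, 0, 0, mainCanvas.width, mainCanvas.height);
    };
    mainDisplayImage.onerror = function() {
        mainCtx.clearRect(0, 0, mainCanvas.width, mainCanvas.height);
        mainCtx.drawImage(errorImage, 0, 0, mainCanvas.width, mainCanvas.height);
    };
    mainDisplayImage.src = imgsrc_string_url;
}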
I also had similar problems while rendering large lists of images on iPhones.
In my case displaying even 50 images in the list was enough to either crash the browser or occasionally the entire operating system. For some reason any images rendered onto the page weren't garbage collected, even when pooling and recycling just a few onscreen DOM elements or using the images as background-image property. Even displaying the images directly as Data-URIs is enough to count towards the limit.
The solution ended up being rather simple - using position: absolute on the list items allows them to be garbage collected fast enough to not run into the memory limit. This still involved having only about 20-30 images in the DOM at any moment; creating and removing the items' DOM nodes by scroll position finally did the trick.
It seems to be particularly dependent on having -webkit-transform: scale3d() applied to any ancestor of the images in the DOM. Laying out a very tall DOM in normal flow and rendering it on the GPU seems to trigger a memory leak in the WebKit renderer, I guess?
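For what it's worth, here is a rough sketch of that scroll-position windowing; the item height, buffer size and data source are illustrative assumptions rather than my actual values:
var ITEM_HEIGHT = 300;  // pixel height of one absolutely-positioned item
var BUFFER = 10;        // extra items rendered above/below the viewport
var container = document.getElementById('list'); // position: relative, given the full list height
var urls = window.photoUrls || [];                // assumed array of image URLs
var rendered = {};      // index -> DOM node currently in the document

container.style.height = (urls.length * ITEM_HEIGHT) + 'px';

function updateWindow() {
    var first = Math.max(0, Math.floor(window.scrollY / ITEM_HEIGHT) - BUFFER);
    var last = Math.min(urls.length - 1,
        Math.ceil((window.scrollY + window.innerHeight) / ITEM_HEIGHT) + BUFFER);

    // Remove nodes that scrolled out of the window so they can be collected.
    Object.keys(rendered).forEach(function (key) {
        var i = Number(key);
        if (i < first || i > last) {
            container.removeChild(rendered[i]);
            delete rendered[i];
        }
    });

    // Create nodes for indexes that scrolled into the window.
    for (var i = first; i <= last; i++) {
        if (!rendered[i]) {
            var item = document.createElement('div');
            item.style.position = 'absolute';
            item.style.top = (i * ITEM_HEIGHT) + 'px';
            item.style.width = '100%';
            item.style.height = ITEM_HEIGHT + 'px';
            item.style.backgroundImage = 'url(' + urls[i] + ')';
            item.style.backgroundSize = 'cover';
            container.appendChild(item);
            rendered[i] = item;
        }
    }
}

window.addEventListener('scroll', updateWindow);
updateWindow();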
I'm running in a similar issue in Chrome too, developing an extension that loads images in the same page (the popup, actually) replacing old images with new ones.
The memory used by the old images (removed from the DOM) is never freed, consuming all the PC memory in a short time.
Have tried various tricks with CSS, without success.
Using hardware with less memory than a PC, like the iPad, this problem arises earlier, naturally.
I filed a bug with jQuery, as jQuery tries to handle memory leaks... so I'd consider this a bug. Hopefully the team can come up with some concise and clever way of handling this problem in Mobile Safari soon.
http://dev.jquery.com/ticket/6944#preview