Performance when loading many images in HTML/JavaScript

OK, so after some searching and not finding the real answer I was looking for, I came up with the following question for this situation:
I have a trading website that loads about 2300 PNG images of 37x50 pixels twice: once in a left column and once in a right column. The images, and all the information that comes with them, are inserted into the document with jQuery on the onLoad event. However, loading 2300 images takes just TOO much time (even when the HTML came straight from the server) and even hangs new versions of Chrome! The quick solution was to remove the images and show them in a dynamic tooltip instead. That works great, but it got me angry website users, and it is indeed ugly.
So... I thought of some possible solutions but I have no idea what is good/bad practice here:
Make all images JPEG and reduce quality.
With the above or not: add all images to 1 very large image, load it, and draw 4600 canvases based on locations in an array like 'imageArray["someimageID"] = { x: 0, y: 40 }'
Convert all images to base64, add them to an array 'imageArray["someimageID"] = "base64"' and draw 4600 canvases.
And on top of that, I should mention that each of those 2300 images comes in a small, medium and large version (of which only the small ones, 37x40, are shown all together on a page).
Hope to get some nice insights on how to correctly solve such a problem!
Greets

If your images are static (not generated for every request) I think you should use CSS sprites. (similar to your own suggestion of lots of canvases).
Basically you create a div for each image you want to show (or some other container element) and set a background on it that takes a small portion of the big image that contains all images.
Example css:
.icon1
{
    width: 50px;
    height: 50px;
    background: url(spritesheet.png) -50px 0;
}
in this example the -50px and 0 position the spritesheet so that your 50x50 icon, which sits 50px from the sheet's left edge, shows through the container. (Note the negative offset and the px unit; the rule goes on the container div rather than an img.)
Update: Here http://css-tricks.com/css-sprites/ is an explanation that goes a bit further than my simple example.
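To connect this with the question: the offset array the asker describes maps directly onto sprite offsets. A minimal sketch (jQuery, since the asker already uses it), assuming a hypothetical spriteMap and #left-column container, plus an .icon rule like the one above that sets width, height and the spritesheet as background:
// Hypothetical map: where each 37x50 icon sits inside spritesheet.png
var spriteMap = {
    "someimageID":  { x: 0,  y: 40 },
    "otherimageID": { x: 37, y: 40 }
};

$.each(spriteMap, function (id, pos) {
    $("<div/>", { "class": "icon", title: id })
        .css("background-position", (-pos.x) + "px " + (-pos.y) + "px")
        .appendTo("#left-column");
});
One div per icon this way costs no extra HTTP requests; every icon is cut from the single spritesheet download.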

First off, consider whether or not you actually need this many images, and loaded on the page all at once. Assuming you do...
Make all images JPEG and reduce quality.
Use the right format for what you're doing. JPEG is for photos. My guess is that, since you have 37x50 pixel images, you're not showing photos. PNG is likely smaller from a file-size perspective in this case. It doesn't matter a whole lot, though, because the speed issue you're having is probably 80% browser time, 20% network time.
With the above or not: add all images to 1 very large image, load it, and draw 4600 canvases based on locations in an array like 'imageArray["someimageID"] = { x: 0, y: 40 }'
Try it and see. I don't think this is going to help you. A canvas will have more overhead than a simple image.
Convert all images to base64, add them to an array 'imageArray["someimageID"] = "base64"' and draw 4600 canvases.
Don't do this. You're adding 33% overhead to the file size, and again the load problem is mostly in your browser.
What you can do
Really question again whether or not you need this many images in the first place.
Spread the image requests across several hostnames: image1.example.com, image2.example.com, image3.example.com, etc. Browsers cap the number of simultaneous connections per host, so this allows more network requests in parallel (see the sketch after this list).
Use your developer tools to verify where the problem actually is. Again, I suspect it's mostly client-side. Once you know the real problem, you can solve it directly.
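As a sketch of the several-hostnames idea (the subdomains image1 to image3.example.com and the /icons/ path are made up), hash each image ID to a stable host so the browser cache still works:
function shardedUrl(imageId) {
    var shards = 3, hash = 0, i;
    for (i = 0; i < imageId.length; i++) {
        hash = (hash * 31 + imageId.charCodeAt(i)) % shards;
    }
    // The same ID always lands on the same host, so cached copies stay valid.
    return "http://image" + (hash + 1) + ".example.com/icons/" + imageId + ".png";
}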

I would advise, if you can, creating a very low resolution sprite of images that can be placed to make it look like everything has loaded, then replacing it with the proper images. Without better code/samples, knowing what your images contain, or whether they are dynamic, I can't give you a real answer with a solution, but at least this can lead you in the correct direction.
If your images are static, this will work fine; if they are dynamic, there is not much else that can be done. I think some examples of the webpage you are creating would be great.
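A rough sketch of the low-res-first idea under those assumptions: every img starts out showing a shared low-resolution placeholder and carries its real URL in a made-up data-full attribute, which gets swapped in once the full image has arrived:
$("img[data-full]").each(function () {
    var el = this,
        full = new Image();           // fetch the proper image off-screen
    full.onload = function () {
        el.src = full.src;            // replace the placeholder when ready
    };
    full.src = $(el).attr("data-full");
});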

The reason you're having problems is simply a massive amount of HTTP requests - something you should always be trying to minimize.
Like others are saying here, you're going to want to use a spritesheet technique if possible (it sounds like it is). A spritesheet will condense all of your images into one, removing 2299 of your HTTP requests.

Related

Speed up web page by compressing large images

I have a blog slideshow on my web page which accesses images from a given url.
The problem is, people add images with massive resolution (3000px*6000px), which noticeably slows down the animation of the slideshow.
These high resolution images are necessary, but not for this particular purpose, since the images live inside a div of size (300*600)
Is there any way CSS (or some other way) can convert the image to a smaller specified resolution (say 300px * 600px) ahead of time, rather than just scaling it down for display?
This way the animation won't involve high res image frames and so it won't be as laggy.
The only alternative I can think of is that every time an image is uploaded to the database, the backend creates a secondary compressed image for this purpose.
However, this seems like a lot of effort.
Since you said in the comments that bandwidth and download time are not the issue and it's acceptable to download the full res image, scale it down, then add it to the page, please consider the following solution which does exactly that.
Download the original image via AJAX, then use ctx.drawImage to draw the image to an HTML canvas with much smaller size. For instance, you can take a 3000px*6000px image and draw it scaled onto a 300px*600px canvas. Then free the original image using JavaScript so that it no longer takes up memory in the browser.
After that, you can use the canvas to do your animations and there should not be nearly as much lag as using the large, original image (since the compositor will need to move much fewer pixels).
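A minimal sketch of that approach (the URL, the 300x600 target size, and the #slideshow container are placeholders; an Image object stands in for the AJAX download for brevity):
var img = new Image();
img.onload = function () {
    var canvas = document.createElement("canvas");
    canvas.width = 300;
    canvas.height = 600;
    // Draw the full-res source scaled down to 300x600 in one call.
    canvas.getContext("2d").drawImage(img, 0, 0, canvas.width, canvas.height);
    document.getElementById("slideshow").appendChild(canvas);
    img = null; // release the reference so the big image can be collected
};
img.src = "https://example.com/full-res-photo.jpg";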
Edit: According to your later comment, your users are uploading to an external image hosting service, so this solution will prevent them from having to upload a thumbnail version in addition to their full-res version.
If you choose Imgur.com like you are considering in the comments: They let you modify the image size a bit in the URL. So for instance if you have an image at https://i.imgur.com/9ZC02Os.jpg, you can use https://i.imgur.com/9ZC02Oss.jpg for the small version, https://i.imgur.com/9ZC02Osm.jpg for the medium version, and https://i.imgur.com/9ZC02Osl.jpg for the large scaled version (note the s, m, and l at the end of the URLs). That way you can probably avoid drawing to a canvas completely.
This was a site I found a few years ago which may be of interest: http://sneak.co.nz/projects/img-resizing/
You could store a small version on your site for use in the slideshow. A good way to do this may be to check if a small image is available and if not create it the first time it is called and save it somewhere.
This code will resize an image on the fly but I think you would still have some lag while the image is resized.
$photo = "sunflower.jpg";
// Shell out to ImageMagick for a 200x200 thumbnail, written to stdout
$cmd = "convert " . escapeshellarg($photo) .
       " -thumbnail 200x200 -quality 100 JPG:-";
header("Content-type: image/jpeg");
passthru($cmd, $retval);

Client-side image compression

I've made an image gallery, but the browser lags as it renders images. Six images are rendered at one time, but because these images are full-size jpegs taken straight from people's phones or cameras they are often large and it causes a lot of lag.
I'd like to save the images to the server in their full size to reduce any loss of quality, however this is obviously not ideal when rendering small previews (only about 330px wide).
What I therefore would like to do (unless there's a much better way of approaching this; perhaps something server-side with PHP?) is reduce the image sizes to a few hundred KB rather than a few thousand on the client just before they are actually loaded onto the page.
I hope that makes sense, and I hope I'm not being really stupid and missing something really obvious, which is what it feels like. Help is always appreciated.
I advise you to use Echo.js to avoid all these image loading issues:
Lazy-loading works by only loading the assets needed when the elements 'would' be in view, which it'll get from the server for you upon request, which is automated by simply changing the image src attribute. This is also an asynchronous process which also benefits us.
Here is a DEMO using Echo.js which loads each image only when it shows up in the viewport!
echo.init({
    offset: 10,
    throttle: 550
});
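For context, echo.js drives this from the markup: the real image URL sits in a data-echo attribute while src points at a tiny placeholder, and echo.init() swaps them in as elements scroll into view (file names here are placeholders):
<img src="blank.gif" data-echo="img/photo-large.jpg" alt="">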

Using a large image (file size) without hindering load time?

My demo is here.
Basically, I have a HUGE image (19160px × 512px to be exact, just under 2MB) whose background x-position I transition using JavaScript to make it appear as if a transformation were happening.
I cannot compress the image much more without ruining its quality dramatically. Is there another way I can achieve this with the same level of cross-browser support, without relying on plugins like Flash, but have it load faster?
Have you considered making this a video?
It might improve loading time somewhat.
Also, another idea: have you tried using only the first and last image, putting the last one on top of the first, giving it opacity:0 and fading it in using JavaScript (e.g. jQuery)?
The effect won't be 100% identical to what you have now, but it might look good enough to please the client, and it would reduce loading time to a bare minimum.
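A sketch of that fade idea with jQuery, assuming hypothetical #first and #last images absolutely positioned on top of each other, with #last starting transparent:
$("#last").css("opacity", 0);
$("#play").on("click", function () {
    // Fade the final state in over the first one across two seconds.
    $("#last").animate({ opacity: 1 }, 2000);
});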
If both ideas won't work for you, I think the first 10-12 frames could be compressed more effectively as GIF images. (It's an estimate, I haven't tried.) You would have to split the image into multiple divs to do that and change the method you use to switch the images, and you would have more requests, but it could be worth it.
If it is a JPEG, you can always use progressive encoding. It will become clearer as it is downloaded.
There is also an interlaced "Progressive JPEG" format, in which data is compressed in multiple passes of progressively higher detail. This is ideal for large images that will be displayed while downloading over a slow connection, allowing a reasonable preview after receiving only a portion of the data. -Wikipedia
Slice it like Google Maps.
If you want to change that many pixels on the screen at once, you'll have to get them to the client somehow. You could chunk it into multiple images and use something other than background-x, but then you expose yourself to other potential network interruptions along the way.
The only alternative I can think of to precomputed images like this one is to do the computation on the client - start with the full-colour image and manipulate it using the client's CPU. Your options here involve canvas or CSS3 or a plugin.
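To illustrate the canvas option: draw the full-colour frame once, then push pixels on the client, for example desaturating it in place (the #stage canvas and image URL are made up; the image must be same-origin or getImageData will throw):
var ctx = document.getElementById("stage").getContext("2d"),
    src = new Image();
src.onload = function () {
    ctx.drawImage(src, 0, 0);
    var frame = ctx.getImageData(0, 0, src.width, src.height),
        d = frame.data;
    for (var i = 0; i < d.length; i += 4) {
        // Luma-weighted grey for each RGBA pixel
        var grey = 0.3 * d[i] + 0.59 * d[i + 1] + 0.11 * d[i + 2];
        d[i] = d[i + 1] = d[i + 2] = grey;
    }
    ctx.putImageData(frame, 0, 0);
};
src.src = "frame-full-colour.png";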
I'm not a big fan of Flash but in this case it seems like the right tool for the job (unless you need it work on the iPhone). If you don't have the Flash authoring tool you can use the free Flex compiler.
See http://www.insideria.com/2008/03/image-manipulation-in-flex.html
Make it into an animated gif? Break it up into individual parts to remove all the area that is obscured by content.

I want to load multiple images very fast on a website, what's the best method?

UPDATE: This question is outdated, please disregard
So... my idea is to load a full manga/comic at once, with a progress bar included, and make a sort of stream, like:
My page loads the basic (HTML+CSS+JS) (of course)
Once that's done, I start loading the images (the URLs are stored in a JS var) from my server, one at a time (or some faster way) so I can make a sort of progress bar.
ALTERNATIVE: Is there a way to load a compressed file with all the images and uncompress it in the browser?
ALTERNATIVE: I was also thinking of saving them as strings and then decoding them; they are mostly .jpg
The images don't have to show right away, I just need the callback when they are done.
XHTML and HTML5 are acceptable
What is the fastest way to load a series of images for my website?
EDIT
Since @Oded's comment: the question is really about the best technique for loading images so the user doesn't have to wait every time they turn the 'page', targeting an experience more like reading comics in real life.
EDIT2
As some people helped me realize, I'm looking for a pre-loader on steroids
EDIT3
No CSS techniques will do
If you split large images into smaller parts, they'll load faster on modern browsers, because the parts can be downloaded over parallel connections.
ALTERNATIVE: Is there a way to load a compressed file with all the images and uncompress it in the browser?
Image formats are already compressed. You would gain nothing by stitching and trying to further compress them.
You can just stick the images together and use background-position to display different parts of them: this is called ‘spriting’. But spriting's mostly useful for smaller images, to cut down the number of HTTP requests to the server and somewhat reduce latency; for larger images like manga pages the benefit is not so large, possibly outweighed by the need to fetch one giant image all at once even if the user is only going to read the first few pages.
ALTERNATIVE: I was also thinking of saving them as strings and then decoding them
What would that achieve? Transferring as string would, in most cases, be considerably slower than raw binary. Then to get them from JavaScript strings into images you'd have to use data: URLs, which don't work in IE6-IE7, and are limited to how much data you can put in them. Again, this is meant primarily for small images.
I think all you really want is a bog-standard image preloader.
You could preload the images in javascript using:
var x = new Image();
x.src = "someurl";
This would work like the one you described as "saving the image in strings".
Spriting
Just have a look how facebook does it: http://b.static.ak.fbcdn.net/rsrc.php/z3JQK/hash/11cngjg0.png
One image loads FASTER than a series of small images. To display an icon you simply create a div with fixed dimensions and move the background inside it. Your div works as a viewport onto the big image: you use background-position to move to the appropriate part of the image, and everything else is hidden.
Different domains
Something you probably didn't know - Internet Explorer has a limit of connections per server. You can read about it here: http://support.microsoft.com/?scid=kb;en-us;183110&x=17&y=11 (here are exact numbers).
What it means: if the user is on IE7, they will be able to load ONLY 4 (or 2) files at the same time from your server, regardless of their internet connection speed.
To speed things up, you could create a few subdomains: server1.mydomain.com, server2.mydomain.com, server3.mydomain.com, etc., and then users can download many files a lot quicker, because you use different hosts to serve different files.
Once that's done, I start loading the images (the URLs are stored in a JS var) from my server, one at a time (or some faster way) so I can make a sort of progress bar.
Your browser already downloads the HTML first, that's how it knows to load any JS/images you reference. You are trying to invent something that already exists.
Just make sure your manga is made up of lots of images of a known size, which you specify in your img tags. Most browsers have some sort of progress bar to show that it's loading resources for you. You're not going to make loading large images faster unless you improve either the speed at which your server serves them, or your user's internet connection, or you compress them to make your image files smaller (likely at the cost of image quality).
Note: JPG and PNG are already compressed.
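To make the "known size" point concrete: declare the dimensions on each img so the browser can lay the page out before a single image byte arrives (the file name and sizes are placeholders):
<img src="pages/page-01.jpg" width="800" height="1200" alt="Page 1">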
You can try using a "CSS sprites" technique. Basically the idea is that you use your favorite image editing program to stitch all your images into a single image. This is faster to send because you lose the per-file overhead of encoding and sending each image. On the client side you use CSS to select only the portion of the total image that is used in any one place.
http://www.alistapart.com/articles/sprites/
http://www.fiftyfoureleven.com/weblog/web-development/css/css-sprites-images-optimization
AND/OR
You can use lazy loading to only load images when they come into view.
http://www.appelsiini.net/projects/lazyload
Image preloaders have been around for ages. You really do not need to load them all at once; you can do it on demand (when the person loads the next page, you can fetch the image that comes after it).
My page loads the basic (HTML+CSS+JS) (of course)
Once that's done, I start loading the images (the URLs are stored in a JS var) from my server, one at a time (or some faster way) so I can make a sort of progress bar.
The images don't have to show right away, I just need the callback when they are done.
If you want to load 10 images as fast as possible, place 10 <img> tags on the page, one for each image. Use Javascript to hide all the but the currently viewed image; add next/back links that use JS to hide the current image and show the next one. Many browser already have some form of progress bar, and by doing things with regular old HTML, it will function correctly.
You're trying to re-invent all this functionality with Javascript for no good reason. You're not going to do it better than the browser.
All that said, this is probably a bad idea. You might dump 15MB of comic pages into the browser window only to have the user leave after reading the first page. Rather than trying to pre-load all images, you should use JS to always keep the next page (or two) pre-loaded, not the entire thing.
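A sketch of that keep-the-next-page-warm approach, assuming a single #page img, #next/#back links, and a placeholder pages array:
var pages = ["page-01.jpg", "page-02.jpg", "page-03.jpg"],
    current = 0;

function show(n) {
    current = Math.max(0, Math.min(n, pages.length - 1));
    document.getElementById("page").src = pages[current];
    if (current + 1 < pages.length) {
        new Image().src = pages[current + 1]; // warm the cache for the next page
    }
}

document.getElementById("next").onclick = function () { show(current + 1); };
document.getElementById("back").onclick = function () { show(current - 1); };
show(0);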
Here's something you can try, which by happenstance I just coded up:
(function() {
    var imgs = [ "image1.png", "image2.png", ... /* all your image names */ ],
        index = 0,
        img;
    function loader() {
        if (index >= imgs.length) return;
        (img = new Image()).onload = loader;
        setTimeout(function() { img.src = "/path/to/images/" + imgs[index++]; }, 1);
    }
    loader();
})();
Plop all your image names (or the ones you want to preload) into the array, and make sure this script starts up when your page(s) start loading. It'll work its way through the list of images, loading them, and then moving on to the next one when each image finishes. (The setTimeout call is to make sure that the "onload" handler doesn't get called while you're still inside a handler.)
You'd probably want to do this for lots of the "nuts and bolts" images for your whole site - in other words, each page would try to load images for everything. Once they're in the cache, of course, this won't take a significant amount of time. Alternatively, you could run this script only on a couple pages, like "login" screens and the main "home" page. Of course, if you've got a site like Flickr, then you probably wouldn't want to preload all your images :-)

better quality thumbnails from larger image files

I'm showing images from other websites as thumbnails. To do this I display them in a smaller img tag so the browser does the downscaling.
The problem is that the quality of these images (which I have no control of) is diminished.
Also they look much better in FF and Safari than in IE.
Is there a way to make these images look better without caching them on my server (e.g. a JavaScript library that does the resize with better quality)? Any idea is highly appreciated.
IE's default image resizing algorithm is not that nice - it can be changed by tweaking a registry entry, but of course that is outside of your control.
However, apparently it can also be triggered to do a better image resize through CSS:
img { -ms-interpolation-mode: bicubic; }
source: http://edmondcho.com/blog/2009/03/17/internet-explorer-image-resizing-tip/
A quick Google search shows that in IE7 you can fix the image quality problem:
http://devthought.com/tumble/2009/03/tip-high-quality-css-thumbnails-in-ie7/
The only way to have control is to do the resizing yourself. Various browsers will use different algorithms, some with unsharp masking, some without. The filters used after resizing control most of this. Specific CSS tagging can control this to some extent.
Javascript can't really handle this, but using Flash or similar would allow this. You would have better control of the image. However, you would lose the "imageness" as far as HTML.
One thing I didn't see mentioned by the others - you aren't really resizing the image, you are just displaying it in a smaller space. Let's say you are pulling down an extremely large image file (5MB) and displaying it at 1 x 1 - it's still 5MB!
Writing a caching solution for these images wouldn't be very difficult at all - and will save you the legal ramifications and embarrassment. If I saw your site in my log files and realized you were pulling down my images, you would be Goatse'd - hard.
If you are working with a source image and simply re-sizing on the client, there isn't going to be a good way to do this.
Now, aside from the potential legal ramifications of using other sites images you could look at a simple caching process, and do a quick re-size on the image, and keep the aspect ratio, so that the display is good. This also helps reduce the bandwidth that you are using from the other sites.
