Speed up web page by compressing large images - javascript

I have a blog slideshow on my web page which accesses images from a given url.
The problem is, people add images with massive resolution (3000px*6000px), which noticeably slows down the animation of the slideshow.
These high-resolution images are necessary, but not for this particular purpose, since the images live inside a div of size (300*600).
Is there any way CSS (or some other method) can convert the image to a smaller specified resolution (say 300px * 600px), rather than just scaling the full-resolution image down for display?
This way the animation won't involve high res image frames and so it won't be as laggy.
The only alternative I can think of is that every time an image is uploaded to the database, the backend creates a secondary compressed image for this purpose.
However, this seems like a lot of effort.

Since you said in the comments that bandwidth and download time are not the issue and it's acceptable to download the full res image, scale it down, then add it to the page, please consider the following solution which does exactly that.
Download the original image via AJAX, then use ctx.drawImage to draw the image onto an HTML canvas of a much smaller size. For instance, you can take a 3000px*6000px image and draw it scaled onto a 300px*600px canvas. Then drop all references to the original image in JavaScript so that it no longer takes up memory in the browser.
After that, you can use the canvas to do your animations and there should not be nearly as much lag as using the large, original image (since the compositor will need to move much fewer pixels).
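A minimal sketch of that approach (it uses an Image object rather than a raw AJAX call; the URL, target size and container ID are placeholders):

// Load the full-resolution image, draw it scaled onto a small canvas,
// then drop the reference so the large decoded bitmap can be freed.
function loadScaledImage(url, targetWidth, targetHeight, container) {
  var img = new Image();
  img.onload = function () {
    var canvas = document.createElement('canvas');
    canvas.width = targetWidth;
    canvas.height = targetHeight;
    var ctx = canvas.getContext('2d');
    // e.g. a 3000x6000 source drawn into a 300x600 canvas in one call
    ctx.drawImage(img, 0, 0, targetWidth, targetHeight);
    container.appendChild(canvas);
    img = null; // no references left, so the original image can be garbage-collected
  };
  img.src = url;
}

loadScaledImage('https://example.com/huge-photo.jpg', 300, 600, document.getElementById('slideshow'));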
Edit: According to your later comment, your users are uploading to an external image hosting service, so this solution will prevent them from having to upload a thumbnail version in addition to their full-res version.
If you choose Imgur.com like you are considering in the comments: They let you modify the image size a bit in the URL. So for instance if you have an image at https://i.imgur.com/9ZC02Os.jpg, you can use https://i.imgur.com/9ZC02Oss.jpg for the small version, https://i.imgur.com/9ZC02Osm.jpg for the medium version, and https://i.imgur.com/9ZC02Osl.jpg for the large scaled version (note the s, m, and l at the end of the URLs). That way you can probably avoid drawing to a canvas completely.
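For illustration, a tiny helper (the function name is made up here) that builds those suffixed URLs:

// Insert Imgur's size suffix ('s' = small, 'm' = medium, 'l' = large)
// just before the file extension of a direct image URL.
function imgurSized(url, suffix) {
  return url.replace(/(\.[a-z]+)$/i, suffix + '$1');
}

imgurSized('https://i.imgur.com/9ZC02Os.jpg', 's'); // -> https://i.imgur.com/9ZC02Oss.jpg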

This was a site I found a few years ago which may be of interest: http://sneak.co.nz/projects/img-resizing/
You could store a small version on your site for use in the slideshow. A good way to do this may be to check whether a small version already exists and, if not, create it the first time it is requested and save it for later use.
This code will resize an image on the fly but I think you would still have some lag while the image is resized.
$photo="sunflower.jpg";
$cmd = "convert $photo -thumbnail 200x200 ".
" -quality 100 JPG:-";
header("Content-type: image/jpeg");
passthru($cmd, $retval);

Related

Rendering Multiple Images in React and HTML

I have a problem rendering images with large file sizes. When I upload a lot of images with large file sizes and display them, the page becomes very laggy.
My question is:
Is the ideal way for the backend to provide a URL to a much smaller version of the image file?
Or can the frontend do it using the Canvas API?
Could you provide an explanation please? Thank you
If you have a bunch of thumbnails to display, the source images for those thumbnails should definitely not be the original, full-size images. If those images are large, it will take a long time for the client to download them, no matter how you render them on the client.
For each image shown to the client, you should have two versions on the server:
Thumbnail version - low resolution, small file size, easy to download and render many at once
Full-size version - downloaded when the user wants to zoom in on one of them
It could be that the full-size version should not necessarily be the original image. For example, if the original image is 20MB (yes, they can exist), you wouldn't want to burden clients with that. So you'd want to perform resizes and optimizations on the server for both the thumbnail version and the "full" version - enough such that there isn't much of a delay between when the client tries to zoom in and the full-size image fully loads.
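As a rough sketch of that idea in React (the thumbUrl/fullUrl props and the component shape are assumptions for illustration, not a prescribed API):

import { useState } from 'react';

// Show the small thumbnail by default and only request the full-size
// image once the user clicks to zoom in.
function ZoomableImage({ thumbUrl, fullUrl, alt }) {
  const [zoomed, setZoomed] = useState(false);
  return (
    <img
      src={zoomed ? fullUrl : thumbUrl}
      alt={alt}
      loading="lazy"
      onClick={() => setZoomed(true)}
    />
  );
}

// usage: <ZoomableImage thumbUrl="/thumbs/1.jpg" fullUrl="/full/1.jpg" alt="item 1" />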
My recommendation is that you convert the images to a more performant format like .webp or .jpeg and give the element explicit width and height attributes.
React also supports lazy loading via React.lazy() (for components), and images themselves can use the native loading="lazy" attribute, which can significantly boost performance.
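For example (the file name and dimensions are placeholders); the explicit width/height prevent layout shift while the image loads, and loading="lazy" defers offscreen images:

<img src="photo-300.webp" width="300" height="200" alt="" loading="lazy" />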
Handling large images is too much work for a frontend client. You should request these images at a smaller size so you can just show them as they are.
Imagine someone trying to use your application on an old computer, or even an old smartphone; you shouldn't rely on the client's processing power to handle all this heavy work.
I hope my answer helps you!
JPEGs are what you should be using; look at functionPreload() as well.

Reasonable limit on SVG for website background, or ideas as to why this is not loading otherwise?

I'm making the assumption that the following is the result of the image's size:
I have two 10MB SVGs (yes, they're huge), given to me by a graphic designer, that are supposed to act as background images for a client's landing page. It is a rather complex graphic that I don't think is suitable for SVG. I have tried to call them in a variety of ways without success, and I have several (very small) SVG images that render without issue, called into the DOM via:
<img id="logoImg" src="SvgLogo.svg" alt="OurLogo">
I imagine having a 10MB image is problematic for other reasons (though I haven't gzip'd); however, does anyone have any ideas if the size is the root of my issue?
A 10MB SVG is definitely a problem.
If you don't need transparency, .gif is a good, well-compressed format.
If you need transparency you can go for .png, but it will be slightly bigger than a GIF.
Also, when you export, make sure the image is exported at the size it will actually be shown at on the website; don't rely on width and height in the image tag to scale it down.
If you go with PNG, you can use a program like pngcrush to shrink the file.

Many images loading performance HTML/Javascript

OK, so after some searching and not finding the real answer I was looking for, I came up with the following question for this situation:
I have a trading website that loads about 2300 PNG images of 37x50 twice, once in a left column and once in a right column. The images and all the information that comes with them are inserted using jQuery on the document's onLoad event. However, loading 2300 images (even when the HTML comes straight from the server) takes just TOO much time and even hangs new versions of Chrome! So for now the quick solution was just to remove the images and show them in a dynamic tooltip. That works great, but it produced angry website users and it is indeed ugly.
So... I thought of some possible solutions but I have no idea what is good/bad practice here:
Make all images JPEG and reduce quality.
With the above or not: add all images to one very large image, load it and draw 4600 canvases based on locations in an array like 'imageArray["someimageID"] = { x: 0, y: 40 }'.
Convert all images to base64, add them to an array 'imageArray["someimageID"] = "base64"' and draw 4600 canvases.
On top of that, each of those 2300 images also exists in a small, medium and large version (of which only the small ones, 37x40, are shown all together on a page).
Hope to get some nice insights on how to correctly solve such a problem!
Greets
If your images are static (not generated for every request) I think you should use CSS sprites (similar to your own suggestion of lots of canvases).
Basically you create a div for each image you want to show (or some other container element) and set a background on it that takes a small portion of the big image that contains all images.
Example css:
div.icon1
{
    width: 50px;
    height: 50px;
    /* negative offsets shift the sprite sheet so the wanted icon shows through */
    background: url(spritesheet.png) -50px 0;
}
in this example the -50px and 0 indicate the offset of your 50x50 icon within the spritesheet.
Update: Here http://css-tricks.com/css-sprites/ is an explanation that goes a bit further than my simple example.
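To connect this to the lookup-table idea from the question, a sketch like the following could build the elements (the IDs, offsets and spritesheet name are invented for the example):

// Offsets of each 37x50 icon inside spritesheet.png
var spriteOffsets = {
  someimageID: { x: 0, y: 0 },
  otherImageID: { x: 37, y: 0 }
};

function makeIcon(id) {
  var offset = spriteOffsets[id];
  var el = document.createElement('div');
  el.style.width = '37px';
  el.style.height = '50px';
  // negative background position shifts the sheet so the wanted icon shows through
  el.style.background = 'url(spritesheet.png) -' + offset.x + 'px -' + offset.y + 'px';
  return el;
}

document.getElementById('leftColumn').appendChild(makeIcon('someimageID'));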
First off, consider whether or not you actually need this many images, and loaded on the page all at once. Assuming you do...
Make all images JPEG and reduce quality.
Use the right format for what you're doing. JPEG is for photos. My guess is that since you have 37x50 pixel images, you're not showing photos. PNG is likely smaller from a file-size perspective in this case. It doesn't matter a whole lot though, because the speed issue you're having is probably 80% browser time, 20% network time.
With the above or not: add all images to one very large image, load it and draw 4600 canvases based on locations in an array like 'imageArray["someimageID"] = { x: 0, y: 40 }'
Try it and see. I don't think this is going to help you. A canvas will have more overhead than a simple image.
Convert all images to base64, add them to an array 'imageArray["someimageID"] = "base64"' and draw 4600 canvases.
Don't do this. You're adding 33% overhead to the file size, and again the load problem is mostly in your browser.
What you can do
Really question again whether or not you need this many images in the first place.
Spread the image requests across several hostnames (image1.example.com, image2.example.com, image3.example.com, etc.). This will allow the browser to run more network requests in parallel.
Use your developer tools to verify where the problem actually is. Again, I suspect it's mostly client-side. Once you know the real problem, you can solve it directly.
I would advise, if you can, creating a very low-resolution sprite of the images that can be placed to make it look like everything has loaded, then replacing it with the proper images. Without better code samples and details (what your images contain, whether they are dynamic) I am unable to give you a real answer with a solution, but at least this can lead you in the correct direction.
If your images are static this will work fine; if they are dynamic there is not much else that can be done. I think some examples of the webpage you are creating would be a great help.
The reason you're having problems is simply the massive number of HTTP requests - something you should always be trying to minimize.
Like others are saying here, you're going to want to use a spritesheet technique if possible (it sounds like it is). A spritesheet will condense all of your images into one, removing 2299 of your HTTP requests.

Are there any good Javascript/Jquery thumbnail script equivalents to TimThimb (PHP)?

For those unaware of TimThumb, it will take any image, of any size or dimension and create a thumbnail on the fly to any desired size. The beauty of it is that it really works on any dimension you feed it through a combination of either resizing the image, cropping or zoom cropping the image.
I've been searching for JavaScript equivalents, but they either require the user to actually mask out the thumbs manually (I'm looking for a script that does it to images automatically) or they can't handle images with a different aspect ratio.
Thanks for any leads on this!
It is impossible to do this with client-side JavaScript alone. PHP has the GD and ImageMagick libraries, which create the new image (the actual thumbnail); JavaScript alone can't do this because, as a client-side script, it can't create files on the server.
So the answer is: there isn't one.
As #papirtiger pointed out you can still do it with server-side javascript (such as node.js).
Please see this link
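If you do go the server-side JavaScript route, a minimal sketch using the sharp package (my example choice, not something the answer mandates) could look like this:

// npm install sharp
const sharp = require('sharp');

// Create a 200x200 thumbnail, cropping to fill the box (similar to TimThumb's zoom crop)
sharp('original.jpg')
  .resize(200, 200, { fit: 'cover' })
  .toFile('thumb.jpg')
  .then(() => console.log('thumbnail written'))
  .catch(err => console.error(err));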
It depends.
You can use CSS or Javascript for simple image scaling.
There are tons of plugins available for this.
I doubt that there is one that does the guesstimation exactly the same way as TimThumb.
If you are going to resize a large number of images on the page it will really hurt performance.
Another alternative is to define several fixed-size "layouts" (960, 320, etc.) and have the server generate thumbs for each.
You can then use JavaScript to load the appropriate size.
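As a rough illustration of that last point (the URL pattern and element ID are made up):

// Pick the pre-generated thumb that matches the current layout width
var layout = window.innerWidth >= 960 ? 960 : 320;
document.getElementById('mainImage').src = '/thumbs/' + layout + '/image1.jpg';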
If you really need to rescale the file:
Use an external web service to resize the image.
Most of them take a url and return a resized image:
example.com/resize?image="http://example.com/image1.jpg"&height="...
If you have TimThumb running on your server you can set up a simple API to allow you to call your own service.
Otherwise, see Image resizing web service for a few alternatives.
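For illustration, calling such a service from JavaScript is just a matter of building the URL (the endpoint below is hypothetical); remember to URL-encode the nested image URL:

// encodeURIComponent keeps the inner URL's own query string from breaking the outer one
var service = 'https://resize.example.com/resize';
var thumbSrc = service + '?image=' + encodeURIComponent('http://example.com/image1.jpg') + '&height=200';
document.getElementById('thumb').src = thumbSrc;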

Bicubic-ly Resize Images with Javascript

Is there a way to resize the actual image using JavaScript? I'm not talking about modifying the DOM to get the browser to resize the image. I actually want to resize the image's pixel data and then display that.
Basically my problem is: Firefox completely fails at downsizing images with delicate features because it only has nearest-neighbor and bilinear. Every other browser -- even IE -- has bicubic support. There's talk of this being included in the near future, but that talk has been going on for over a year.
I don't mind downloading the full sized images because I want them downloaded anyway. When the user hovers over the small version of an image, the large version immediately appears elsewhere on the page. If I did server-side resizing I'd have to download BOTH copies of the images which would result in even more traffic. If there's no other workaround then this is what I'm going to have to do... I just don't want to.
It is possible. You get an image on the same domain, write it to a canvas, then manipulate the pixel data from there (complicated, but possible I'm sure), and then either use that or output it as PNG/GIF/JPG... BUT... I don't think you will find it a better way to preserve delicate features than CSS.
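As a sketch of that canvas approach (my own example, assuming same-origin images), one common trick is to downscale in halves, so each bilinear pass averages neighbouring pixels and the result gets much closer to bicubic quality than a single draw:

// Repeatedly halve the image with drawImage until the target size is reached,
// then do a final draw onto a canvas of exactly the target size.
function downscale(img, targetWidth, targetHeight) {
  var work = document.createElement('canvas');
  var ctx = work.getContext('2d');
  var w = img.naturalWidth, h = img.naturalHeight;
  work.width = w;
  work.height = h;
  ctx.drawImage(img, 0, 0);
  while (w / 2 >= targetWidth && h / 2 >= targetHeight) {
    w = Math.floor(w / 2);
    h = Math.floor(h / 2);
    // draw the previous result onto itself at half size
    ctx.drawImage(work, 0, 0, w * 2, h * 2, 0, 0, w, h);
  }
  var out = document.createElement('canvas');
  out.width = targetWidth;
  out.height = targetHeight;
  out.getContext('2d').drawImage(work, 0, 0, w, h, 0, 0, targetWidth, targetHeight);
  return out; // display this canvas, or call out.toDataURL() to get a PNG
}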
