JS: check if hidden image is still in browser's cache? - javascript

Our web page contains a 360-degree product viewer based on 36 product images. Only one image is shown at a time; all others are hidden. All images are served from an AWS S3 bucket with no cache directive (and that is how it should be). On first start the JS plugin shows a preloader for the 36 images, and everything works perfectly after loading.
The problem comes when the tab (tested only in Chrome) remains open for a long time (several hours) while the user works in other tabs. The hidden images are evicted from the cache, the JS script reloads all of them, and the 360 drag looks odd (not smooth). Once the browser has loaded all the missing images it works smoothly again, and after a few hours of inactivity the whole cycle repeats.
This behaviour is what we expect, but we want some way to check whether the hidden images are no longer cached, so that we can invoke the preloader.
I searched the web and Stack Overflow for the answer, but the other cache-related questions don't answer mine; they cover things like "check if an image is cached after reopening the browser or clearing the cache". My question is: how do I check whether a hidden image is still in the cache?
Example code appreciated. Thanks!
PS - I know how to enable cache headers for images delivered from S3, and that is not an option.
PPS - Creating 1px image holders is not an option either.

As you mentioned in your comments - you don't want to mess with your preloader code and don't want to start your preloader each time.
The answer to your question is "no, there is no 100% cross-browser way to tell whether an image is in the cache". However, you can guess by measuring the time taken between when a load starts and when it ends.
So you can add a quasi-preloader script to the page that tries to preload all of your hidden images and, if the average time taken per image exceeds some threshold (say 100 ms or so), starts your main preloader. That way you avoid unnecessary page blocking / data loss when the main preloader starts.
As long as all your images are in the cache, the quasi-preloader won't take much in the way of resources to check them all.
A very rough example of what it could look like:
window.needsPreloadingFlag = false;

window.onfocus = function () {
    if (!window.needsPreloadingFlag) {
        window.needsPreloadingFlag = quasiPreloader.checkHiddenImages();
    }
};

yourApp.onUserActionThatProbablyNeedsPreloading = function () {
    // bind this to the events that require your hidden images
    if (window.needsPreloadingFlag) {
        yourApp.preloadImages();
    }
    // ...
};
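A sketch of what `quasiPreloader.checkHiddenImages` could do under that timing heuristic. The function names, the URL array, and the 100 ms threshold are illustrative assumptions, not a tested implementation; note that the check is asynchronous, so the flag above would be set inside a `.then()` callback:

```javascript
// Pure decision helper: does the average load time suggest the cache is cold?
function needsPreload(avgMs, thresholdMs) {
    return avgMs > thresholdMs;
}

// Times a single image load; a cached image typically resolves in a few ms.
function timeImageLoad(url) {
    return new Promise(function (resolve, reject) {
        var start = Date.now();
        var img = new Image();
        img.onload = function () { resolve(Date.now() - start); };
        img.onerror = reject;
        img.src = url;
    });
}

// Loads every hidden image and reports whether the main preloader is needed.
function checkHiddenImages(urls, thresholdMs) {
    return Promise.all(urls.map(timeImageLoad)).then(function (times) {
        var avg = times.reduce(function (a, b) { return a + b; }, 0) / times.length;
        return needsPreload(avg, thresholdMs || 100);
    });
}
```

Usage would then be along the lines of `checkHiddenImages(hiddenImageUrls).then(function (flag) { window.needsPreloadingFlag = flag; });`.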

Related

Trigger load of elements without scrolling

This is going to sound pretty generic but I don't know how else to ask this question.
I am using Selenium WebDriver (with Python) to automate the download of some images. After a search I can access a list of links to those images. The full list of links is of known length, but it only gets loaded after successive scroll-downs.
For example, after launching the search, I may know there are 210 results, but only 20 of them get loaded. Scrolling down multiple times will load the other ones. Workflow example:
Insert search details, click search button
Read total number of elements
Scroll down until the number of displayed elements is the same as the total
Download stuff
Close download page, go back to point 3.
The scroll-down is really painful because the image download happens on a different page, and closing that page reloads the original search results without all the scroll-downs, meaning I need to scroll all over again.
To speed this process up, I tried using PhantomJS and upping the page's vertical resolution. However, this does NOT load all the elements and the process fails, probably because the vertical resolution is enough to show the initial 20 elements without ever triggering the scroll function that loads the others.
So I'm guessing there is some function that gets triggered when scrolling down. But I can't find it in the page source.
What I know is that it loads a <div id="loader"> that I see every time new items are getting fetched (and every time the page hangs because of connection problems).
My first question is whether calling that function directly, without scrolling (i.e. d.execute_script("some_magic_function")), would save time compared to d.execute_script("window.scrollTo(0, 1000);"). Perhaps not, in which case I should just keep the actual scrolling behaviour I'm using now.
But if there could be an advantage, my second question is how I can find the function that triggers the loading of new elements (and of the <div id="loader">). I've tried searching the page source for onscroll or just scroll, but got nowhere.
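Whichever function does the loading, the scroll-until-complete loop in the workflow above can be expressed generically. A sketch (not the site's actual API; `countFn` and `scrollFn` stand in for the Selenium calls):

```javascript
// Repeatedly scrolls until countFn() reports the expected total number of
// loaded elements, giving up after maxTries attempts.
async function scrollUntilLoaded(countFn, scrollFn, total, maxTries) {
    for (let i = 0; i < maxTries; i++) {
        if (await countFn() >= total) {
            return true;
        }
        await scrollFn();
    }
    return (await countFn()) >= total;
}
```

From Selenium, `countFn` could wrap `d.execute_script("return document.querySelectorAll('.result').length;")` and `scrollFn` could wrap `d.execute_script("window.scrollTo(0, document.body.scrollHeight);")` followed by a short wait (`.result` is a hypothetical selector; substitute the real one).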

Gif hides before fully played

I have a full-screen .gif animation that starts when the user accesses the home page and then fades out, revealing the page's content. The thing is, depending on the computer and its internet connection, there is a delay, and sometimes the animation ends up being hidden before it has been fully viewed.
I am using the code below to hide the div that holds the animation, based on its duration (around 10 s). I don't know if it is possible, but I would like to hide it after it has been fully played/loaded (not sure which), rather than after a fixed amount of time.
$(".animation").delay(9500).fadeOut(400);
Try placing the code in a separate JS file (or update the current file and test it), and then, rather than using $(document).ready, use $(window).load.
The window load event fires after the page is fully loaded, including all the frames, objects, images, etc.

html, make page appear ready (no tab spinning wheel) while just waiting for images

I have a page that loads images from various sources. Occasionally these images fail to load; perhaps the link has gone dead or whatever. That's fine.
What bothers me is that the browser might take 6 seconds or even longer (I've seen 20 seconds) before it decides that the image has failed to load. During this time the spinning loading wheel in the Chrome tab keeps going and going, making it seem like my page isn't ready.
I've switched my JavaScript loading from onload to $(document).ready(), so at least my page isn't inactive while it waits for the images to load. But it might appear as though it is.
Is there some way to make my page appear "ready" (no spinning wheel) when all it's doing is waiting for images? Maybe another way to load images? I currently use the img element with src. Or a way to make it give up sooner? Does it really need 6 seconds to decide that an image link is dead?
Not sure if this has a solution. It's a problem that I have seen on a lot of websites, not just mine, but it drives me nuts. I'll often click the stop-loading-x just to make it stop! I'd at least like for my own website to not be like that.
According to my tests, the loading indicator in Chrome does not show for image elements whose loading was triggered by JavaScript. Thus, if you don't mind the images not loading when JavaScript is disabled, you can send the img with the src unset and set it when the page loads, storing the URL in data-src. The upside is that you can then control when the image loads, if you want to (though you may want to use a plugin for that, as Roullie's answer suggests).
<img width="100" height="100" class="async-img" data-src="http://www.example.com/some.png">
...
$(document).ready(function () {
    $(".async-img").each(function () {
        this.src = $(this).data("src");
    });
});
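For the "give up sooner" part of the question: once loading is script-controlled, you can race the load against a timer and abandon slow images yourself. A sketch (the 4000 ms cutoff is an arbitrary assumption):

```javascript
// Rejects with "timeout" if the wrapped promise doesn't settle within ms.
function withTimeout(promise, ms) {
    var timer = new Promise(function (resolve, reject) {
        setTimeout(function () { reject(new Error("timeout")); }, ms);
    });
    return Promise.race([promise, timer]);
}

// Loads one image via script, resolving on load and rejecting on error.
function loadImage(url) {
    return new Promise(function (resolve, reject) {
        var img = new Image();
        img.onload = function () { resolve(img); };
        img.onerror = function () { reject(new Error("error")); };
        img.src = url;
    });
}

// Browser usage (sketch): give up on a dead link after 4 seconds instead of
// waiting for the browser's own, much longer timeout:
// withTimeout(loadImage(url), 4000).catch(function () { /* hide placeholder */ });
```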
Maybe this can help: lazyload.js

Technique for prioritising image download in web browser

I have a web page that contains many thumbnail images (about 100). When you click on one of the thumbnails, a modal popup is created, which is actually a new web page inside an iframe. This new web page contains 1 large image.
The problem occurs when the user opens the popup before all of the 100+ thumbnails have finished downloading on the parent page. The user must now wait a long time before they can see the large image in the popup, because the browser doesn't know to prioritise this new image above the thumbnails it is already trying to retrieve.
Any thoughts on a solution to this problem?
When you load that page, the browser queues up 100 requests for those thumbnails. There's no way I know of to remove items from the request queue. Depending on the browser, it may request up to 6 concurrently (referring to this thread), but they'll still be queued ahead of your modal dialog's large image. What you can do (from that same thread) is host the modal dialog images on a separate subdomain so the browser places them into a separate queue, as if they were on entirely different sites. That new queue would be allowed to run concurrently with your thumbnail requests.
You can use BASE64 data URIs for all the small images.
Your page may become larger, but in some setups the whole page loads faster.
The other option is to load the large image from a different subdomain, as the "queue" is per hostname.
Interesting question. I've never come across such a situation. A workaround that comes to mind would be to load the thumbnail images only when the user is viewing them.
If you are using jQuery, you could try using this plugin:
Lazy Load Plugin for jQuery
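Without the plugin, a minimal version of the same idea might look like this (a sketch, not the plugin's actual code; the `img[data-src]` markup convention is an assumption):

```javascript
// Pure helper: is a bounding rect at least partly inside the viewport?
function isInViewport(rect, viewportHeight) {
    return rect.top < viewportHeight && rect.bottom > 0;
}

// Browser glue: promote data-src to src for thumbnails scrolled into view.
function lazyLoadVisible() {
    var images = document.querySelectorAll("img[data-src]");
    for (var i = 0; i < images.length; i++) {
        var img = images[i];
        if (isInViewport(img.getBoundingClientRect(), window.innerHeight)) {
            img.src = img.getAttribute("data-src");
            img.removeAttribute("data-src");
        }
    }
}
// window.addEventListener("scroll", lazyLoadVisible);
```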
One way to resolve this is to combine your small thumbnails into one large tiled image (a sprite), reducing the number of image requests on the page.

Smoothly preload largeish background images

I have a site which uses largeish (60-100k) background images which vary from page to page.
When the user loads the page for the first time, the page content loads first and the background image appears a short time later. This is, I understand, intended behaviour in browsers, but it makes the page load look quite "bumpy" on slower connections.
I had thought of hiding the page with a loading "mask" which gets removed by JS when the background image has loaded...but this is still quite an ugly approach.
How could I make it so the page content and the background image appear to the user at the same time?
The best solution here would be to try and find a way to get that image smaller. There are some good compression tools out there. I recommend looking at ImageMagick, some JPEG-specific tools (http://jpegclub.org/) or PNG-specific tools (http://www.aboutonlinetips.com/optimize-and-compress-png-files/).
But to do what you're specifically asking - hide everything on the page until it's ready and then have it load in - you could use jQuery and do something like this:
$(function () {
    var bgimage = new Image();
    bgimage.src = "{your giant image's URL goes here}";
    $(bgimage).load(function () {
        $("body").css("background-image", "url(" + $(this).attr("src") + ")").fadeIn();
    });
});
What this does is wait until the page's elements are loaded and then create a new Image object, pointing its source at your larger image file. When that finishes loading, we change the background to use the newly loaded image, which should display instantly because the browser has cached it.
I have fadeIn() there in case you want to hide all of the content on the page until it's ready. This means you should make the body hidden to begin with.
For some reason fadeIn() works better than show() or simply removing a "hidden" class via removeClass(), if you take that approach. With the latter two approaches the body tag seems to resize its height to fit the content of the page, which can result in the background image not being displayed in its entirety.
Honestly though, I don't really recommend this approach :p
At least, not if you're going to hide all the content on the page until it's ready.
This might be a good approach for displaying the background image only when it's ready, avoiding the partially loaded image being displayed...
A slower load is just the tradeoff for using large images.
A better way would probably be to use jquery and fade the background image in once it has loaded. Also you could try preloading the next image before the user clicks the next page to make it even smoother.
If you delay the content from showing until the image has loaded, it's just going to irritate your users. They are (probably) there primarily for the information, so don't ever touch anything that delays that process.
The exception is some arty-farty website that people who don't know about websites visit to click on things, where they don't care about anything apart from it looking pretty.
You could use data URIs to mitigate this issue in modern browsers and fall back to your current technique for IE 6/7.
