I have a site using some ajax here: http://deezteez.com/
If you sort by "Newest" (top-right drop-down box) you will notice that the new images (of products that were just added recently) take about 30 seconds to actually load in, even though the page is done loading. Older images don't do this, even if I start with a clear cache.
Anyone have an idea of what would be causing this?
Chrome's console seems to show that your server is simply slow. The graph below shows how your images load in. The light-colored bar is when the image is requested; the dark-colored bar is the image actually being downloaded.
And you can see they all get requested at the same time. But then it takes a while for the server to respond to those requests. Once the server responds, things seem to download quickly, but that response seems quite lagged.
I have no idea what is going on behind the scenes on your server, but here are some suggestions:
Drastically lower the product count per page, so that far fewer images are requested at once.
Use CDN services to speed up static asset delivery and even provide geographically local image download servers.
If you have image data being generated on the fly or pulled from the database on each request, DO NOT DO THAT. Or if you really must, use server-side caching to prevent doing it over and over again (see the sketch below).
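For that last point, a minimal sketch of server-side caching in Node.js (every name here is illustrative; the same idea applies to PHP or any other stack):

    var http = require('http');

    var cache = {};            // in-memory cache: url -> { body, time }
    var TTL = 60 * 1000;       // regenerate each image at most once a minute

    function renderProductImage(url) {
        // Stand-in for the expensive on-the-fly generation or database pull.
        return Buffer.from('...image bytes for ' + url + '...');
    }

    http.createServer(function (req, res) {
        var hit = cache[req.url];
        if (!hit || Date.now() - hit.time > TTL) {
            hit = cache[req.url] = { body: renderProductImage(req.url), time: Date.now() };
        }
        res.writeHead(200, { 'Content-Type': 'image/jpeg' });
        res.end(hit.body);
    }).listen(8080);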
Most of my shared hosting accounts have changed cPanel theme from X3 to paper_lantern. However, I find the paper_lantern theme takes far longer to load - often 15 to 20 seconds before the page is fully displayed. During this time, the GENERAL INFORMATION side panel shows a loading spinner beside the THEME item, so I suspect that is somehow involved in the delay.
In DevTools, the network tab has a bunch of entries like the image below which, when added up, account for almost the entire wait time.
Does anyone know of a way, perhaps using GreaseMonkey/TamperMonkey or AdBlock(?), to abort whatever process is happening here and just get on with displaying the cPanel panels/icons?
(Browser in use is Google Chrome)
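No answer was posted here, but as a heavily hedged sketch: a TamperMonkey userscript could stub out the request that feeds the spinner, assuming you can identify its URL in the DevTools network tab. The @match pattern and the 'get_theme_info' fragment below are made up, and whether cPanel degrades gracefully when that request never completes is untested:

    // ==UserScript==
    // @name         Skip slow paper_lantern theme check
    // @match        *://*/frontend/paper_lantern/*
    // @run-at       document-start
    // ==/UserScript==
    (function () {
        var origOpen = XMLHttpRequest.prototype.open;
        var origSend = XMLHttpRequest.prototype.send;
        XMLHttpRequest.prototype.open = function (method, url) {
            // 'get_theme_info' is a hypothetical fragment -- substitute the
            // real URL you see hanging in the network tab.
            this._blocked = typeof url === 'string' && url.indexOf('get_theme_info') !== -1;
            return origOpen.apply(this, arguments);
        };
        XMLHttpRequest.prototype.send = function () {
            if (this._blocked) return; // drop the request entirely
            return origSend.apply(this, arguments);
        };
    })();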
Our web page contains a 360-degree product viewer based on 36 product images. Only one image is shown at a time; all others are hidden. All images are served from an AWS S3 bucket with no cache directive (and that is how it should be). On first start the JS plugin shows a preloader for the 36 images, and everything works perfectly after loading.
The problem comes when the tab (tested only in Chrome) stays open for a long time (several hours) while the user works in other tabs. The hidden images are evicted from the cache, the JS script reloads all of them again, and the 360 drag looks odd (not smooth). Once the browser has loaded all the absent images it works smoothly again, and after a few hours of inactivity it all repeats.
This behaviour is what we expect, but we want some way to check whether the hidden images are no longer cached, so we can invoke the preloader.
I searched the web and Stack Overflow for the answer, but the other cache-related questions don't answer mine; they're things like "check if image is cached after reopening browser or cache". My question is: how do I check if a hidden image is still in the cache?
Example code appreciated. Thanks!
PS - I know how to enable cache headers for images delivered from S3, and that is not an option.
PPS - Creating 1px image holders is not an option either.
As you mentioned in your comments, you don't want to mess with your preloader code and don't want to start your preloader every time.
The answer to your question is "no, there is no 100% cross-browser way to tell if an image is in the cache". However, you can guess by measuring the time between when a load starts and when it ends.
So you can add a quasi-preloader script to the page that tries to preload all of your hidden images and, if the average time taken per image exceeds some threshold (say 100ms or so), starts your main preloader. That way you avoid unnecessary page blocking / data loss when the main preloader starts.
As long as all your images are in the cache, the quasi-preloader won't take many resources to check them all.
A very "dirty" example of what it should look like:
window.needsPreloadingFlag = false;

window.onfocus = function () {
    // When the user comes back to the tab, re-check the hidden images.
    // The check is asynchronous (image loads are), so it reports via callback.
    if (!window.needsPreloadingFlag) {
        quasiPreloader.checkHiddenImages(function (evicted) {
            window.needsPreloadingFlag = evicted;
        });
    }
};

yourApp.onUserActionThatProbablyNeedsPreloading = function () {
    // Bind this to events that require your hidden images.
    if (window.needsPreloadingFlag) {
        yourApp.preloadImages();
    }
    // ...
};
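And a sketch of what the timing check itself might look like, assuming your 36 image URLs are available in an array (the 100ms threshold is the guess from above; tune it to your setup):

    var quasiPreloader = {
        imageUrls: [],   // fill with your 36 S3 image URLs
        checkHiddenImages: function (done) {
            var urls = this.imageUrls;
            var pending = urls.length;
            var start = Date.now();
            if (!pending) return done(false);
            urls.forEach(function (url) {
                var img = new Image();
                img.onload = img.onerror = function () {
                    if (--pending === 0) {
                        // Cached images "load" almost instantly; evicted ones
                        // have to be re-fetched from S3.
                        var avgMs = (Date.now() - start) / urls.length;
                        done(avgMs > 100);
                    }
                };
                img.src = url;
            });
        }
    };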
I have a web page that contains many thumbnail images (about 100). When you click on one of the thumbnails, a modal popup is created, which is actually a new web page inside an iframe. This new web page contains 1 large image.
The problem occurs when the user opens the popup before all of the 100+ thumbnails have finished downloading on the parent page. The user must now wait a long time before they can see the large image in the popup, because the browser doesn't know to prioritise this new image above the thumbnails it is already trying to retrieve.
Any thoughts on a solution to this problem?
When you load that page, the browser queues up 100 requests for those thumbnails. There's no way I know of to remove items from the request queue. Depending on the browser, it may request up to 6 concurrently (referring to this thread), but they'll still be queued ahead of your modal dialog's large image. What you can do (from that same thread) is host the modal dialog images on a separate subdomain so the browser places them into a separate queue, as if they were on entirely different sites. That new queue would be allowed to run concurrently with your thumbnail requests.
You can use Base64 data URIs for all the small images.
Your page will become larger, but in some setups the whole page loads faster.
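If you want to try that, a small Node.js sketch of producing a data URI on the server (file names are illustrative):

    var fs = require('fs');

    function toDataUri(path) {
        var b64 = fs.readFileSync(path).toString('base64');
        return 'data:image/jpeg;base64,' + b64;
    }

    // Emit the result into your markup, e.g. <img src="data:image/jpeg;base64,...">
    console.log(toDataUri('thumbs/product-1.jpg'));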
The other option is to load the large image from another subdomain, as the queue is per hostname.
Interesting question. I've never come across such a situation. A workaround that comes to mind would be to load the thumbnail images only when the user is viewing them.
If you are using jQuery, you could try using this plugin:
Lazy Load Plugin for jQuery
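If memory serves, basic usage looks roughly like this (the markup convention and the threshold option are from the plugin's docs; double-check against the version you install):

    $(function () {
        // Plugin convention: a placeholder src plus the real URL in
        // data-original, e.g.
        // <img class="lazy" src="grey.gif" data-original="thumb-1.jpg">
        $('img.lazy').lazyload({
            threshold: 200 // start loading 200px before an image scrolls into view
        });
    });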
One way to resolve this is to combine your small thumbnails into one large tiled image, reducing the number of images on the page.
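A hedged sketch of how displaying one tile from such a combined image could work (tile size, grid width, and file name are all made up):

    // Show thumbnail n from a single sprite sheet of 100x100 tiles, 10 per row.
    function showThumb(el, n) {
        var col = n % 10;
        var row = Math.floor(n / 10);
        el.style.width = '100px';
        el.style.height = '100px';
        el.style.backgroundImage = 'url(thumbs-sprite.jpg)';
        el.style.backgroundPosition = (-col * 100) + 'px ' + (-row * 100) + 'px';
    }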
I have been studying JavaScript and have found so many things it can do, and I feel comfortable using the language, but I'm getting worried about the right-click savers out there. Is there a way to prevent people from ever saving the images from my website and putting them on their desktop?
The scenario I'm worried about:
Some girl posts her images
Some person visits the site
That person takes the images
Stores them on his/her desktop
Makes fun of the girl
No, there isn't any way to do this that isn't easily circumvented.
You can put an overlay onto the image, but that won't stop people with a dev console in their browser.
Another way is to load images from a script and only allow them to be shown when they are on a certain page (using PHP or any other server-side implementation), as in the rough sketch below.
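A rough sketch of that idea (Node.js here instead of PHP; the Referer check is trivially spoofable, so this only raises the bar, and the host name is made up):

    var http = require('http');
    var fs = require('fs');

    http.createServer(function (req, res) {
        var referer = req.headers.referer || '';
        // Only serve the image to requests that claim to come from our own page.
        if (req.url === '/protected.jpg' && referer.indexOf('http://example.com/gallery') === 0) {
            res.writeHead(200, { 'Content-Type': 'image/jpeg' });
            fs.createReadStream('protected.jpg').pipe(res);
        } else {
            res.writeHead(403);
            res.end('Forbidden');
        }
    }).listen(8080);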
No. If someone has visited your web page and can see your image, the browser has already downloaded the image and saved it to the local cache, whether or not the user knows how to get to it.
Also, they can always turn off JavaScript in their browser.
You can make it hard to download the image but it's IMPOSSIBLE to prevent image theft!
Using a grid of small images and showing just a part of the whole image when the user zooms in is how most photography sites make it hard to steal an image. With a grid of images, drag-and-drop or Save As won't save the whole image.
But it's still possible to steal the image by collecting all the parts and stitching them together in an image-editing tool.
I need a way, or a tool, to measure the actual perceived rendering time - the time it takes the browser to render the entire page for the user. Any suggestions?
The reason I ask is that Firebug and YSlow only report the DOMContentLoaded and onLoad times.
For instance, my application reports 547ms (onLoad: 621ms) for the contents. But the actual content is rendered in around 3 seconds. I know this because I counted 1, 2, 3 slowly from the moment I hit Enter in the browser's URL field to the moment the content appeared in front of my eyes. So I know that neither 547ms nor 621ms represents the actual time it takes for the page to load.
Not sure if this helps, but my application:
renders JSON data on the server side and saves it in a JavaScript variable, along with the rest of the page's HTML, before the server returns the entire HTML to the browser
loads jQuery 1.5 and jQuery Templates
has jQuery code that grabs the JSON data from the variable defined in step 1
uses jQuery Templates to render the page
Technically, no Ajax is involved here, and the images on the page are all cached - I don't see Firebug downloading any of them.
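For reference, steps 1-4 in miniature (all names are illustrative; "jQuery Templates" is the old jquery.tmpl plugin):

    // Step 1: the server emitted the data inline, e.g.
    //   var pageData = { items: [{ name: 'Widget' }, { name: 'Gadget' }] };
    // along with an inline template in the HTML:
    //   <script id="itemTmpl" type="text/x-jquery-tmpl"><li>${name}</li></script>
    $(function () {
        // Steps 3-4: grab the embedded JSON and render it with jquery.tmpl.
        $('#itemTmpl').tmpl(pageData.items).appendTo('#itemList');
    });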
[Edit]
What I'm trying to figure out is this: between the Firebug-reported onLoad time (621ms in my case) and the moment the page is completely loaded to my eyes (at least 3 seconds), what happened to the roughly 2.4s in between? What took place? Is the browser doing something? Is something blocking? The network? What is it?
Google Chrome has excellent auditing built in. Your results will be skewed because it's one of the fastest browsers right now, but it will give you exact measurements of how long it takes for Chrome to render. =)
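If you also want numbers from inside the page itself, the Navigation Timing API (supported by Chrome and other modern browsers) exposes the load phases programmatically. A minimal sketch; note it stops at the load event, so anything your jQuery templates render afterwards still has to be timed by hand (e.g. Date.now() before and after the render call):

    window.addEventListener('load', function () {
        // Wait a tick so loadEventEnd is populated.
        setTimeout(function () {
            var t = performance.timing;
            console.log('network:        ' + (t.responseEnd - t.navigationStart) + 'ms');
            console.log('DOM processing: ' + (t.domComplete - t.responseEnd) + 'ms');
            console.log('load handlers:  ' + (t.loadEventEnd - t.loadEventStart) + 'ms');
        }, 0);
    });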