javascript lazy loading the progressive enhancement way? - javascript

I'm building a website for a gallery owner that has a lot of images per webpage.
Therefore I want to lazy load the images on the webpage, making the initial load
less heavy. However, I would like to implement this in a "progressive enhancement" way.
I've found a lot of lazy loading methods, but they all require fiddling with the html code
in such a way that the webpage would be useless with javascript turned off (e.g. the src attribute of the img tags remains unset until the image is lazy loaded).
To implement a lazy loading method progressively, I think one would need the following:
prevent the browser from fetching the images, even though they are on the page,
but only do this when javascript is on (so on non-javascript browsers, the images still
load as normal). This should be done without altering the html.
save the src attribute in a data-src attribute
sequentially load the images when scrolling down
Of these three steps the first one seems the hardest one. Even this stackoverflow discussion did not provide an answer that doesn't ruin progressive enhancement.
Has anyone got any ideas?

Since no one has come up with an answer, I'll post what I found to be a reasonable solution.
This problem boils down to the following: while we want to prevent the browser from downloading the images when javascript is turned on, we must be sure the images are downloaded
when javascript is turned off or not available.
It is hard to consistently use javascript to stop loading images on a page when they are
in the "normal" format:
<img src="path/to/image.jpg"></img>
To stop the images from downloading we'd have to remove their src attributes, but in order
to do this, the DOM should be loaded already. With the optimisations a lot of browsers have nowadays it is hard to guarantee that the images aren't downloading already.
On top of that, we certainly want to avoid interrupting images that are already downloading,
because this would simply be a waste.
Therefore, I choose to use the following solution:
<img data-src="path/to/image.jpg" class="lazy">
<noscript>
  <img src="path/to/image.jpg">
</noscript>
Notice how the images outside of the noscript tag have no src but a data-src attribute instead. This can be used by a lazyloading script to load the images one by one for instance.
Only when javascript is not available will the images inside the noscript block
be visible, so there's no need to load the .lazy images (and no way to do so, since
javascript is unavailable).
We do need to hide the images though:
<noscript>
  <style>
    .lazy {
      display: none;
    }
  </style>
</noscript>
Like the img tags inside the noscript block, this style block will only be visible to the browser when javascript is unavailable.
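As an illustration only (not part of the original answer), here is a minimal sketch of the kind of lazy-loading script that could consume the data-src attributes above, copying them into src once an image nears the viewport:
// Illustrative sketch: copy data-src into src for .lazy images that are
// close to the viewport, and re-check whenever the user scrolls.
function loadVisibleLazyImages() {
  var images = document.querySelectorAll('img.lazy[data-src]');
  for (var i = 0; i < images.length; i++) {
    var img = images[i];
    if (img.getBoundingClientRect().top < window.innerHeight + 200) {
      img.src = img.getAttribute('data-src');
      img.removeAttribute('data-src'); // don't trigger the same download twice
    }
  }
}
window.addEventListener('scroll', loadVisibleLazyImages);
window.addEventListener('load', loadVisibleLazyImages);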
On a related note: I thought I could reduce the html size by not putting a src or data-src attribute on the lazy images at all. This would be nice because it eliminates
the redundant url from the page, saving us some bandwidth.
I thought I could pluck the src attribute out of the noscript block using javascript anyway. However, this is impossible:
javascript has no access to the contents of a noscript block. The above scheme is therefore
the most efficient I could come up with.

Not specifying a src attribute is invalid HTML, which is unfortunately how most lazy image loaders work.
I am working on a lazyloader that uses valid html markup, github link:
https://github.com/tvler/lazy-progressive-enhancement
A lazyloaded image would be declared by wrapping it in a noscript element:
<noscript><img alt="hello!" src="..."></noscript>
and the final outputted html would be
<img alt="hello!" src="...">.
You can view the whole project on github, which deals with batch loading, event hooking & more, but here's the basic functionality at the scope of a single noscript image:
// The content of a noscript element is plain text when javascript is enabled
var noscript = document.querySelector('noscript'), img;
// Parse that text into real elements by assigning it as innerHTML of a temporary div
(img = document.createElement('div')).innerHTML = noscript.textContent;
// Swap the noscript element for the image it contained
noscript.parentElement.replaceChild(img.firstChild, noscript);

Related

Lazy load using javascript without data-src?

I have an HTML page where JavaScript can be added into the header, but nothing else in the HTML can be edited. The JavaScript can be added inline or be an external file - it doesn't really matter.
I've come across a lot of JavaScript libraries that allow you to lazy load images, videos, iframes, etc. However, they all require that you use data-src instead of the normal src for images. Since I can't edit the HTML, that doesn't work for me.
And obviously - using native lazy loading won't work for me in this case since the HTML can't be changed.
Is there any way to lazy load with pure JavaScript and not have to change anything in the HTML?

What is the best method for lazy load?

I need to eliminate image render-blocking and I use this little script for it:
<img data-src>
$('img').each(function() {
  $(this).attr('src', $(this).data('src'));
});
It is good for all browser or it is better to use this plugin https://plugins.jquery.com/lazyload/ ?
You could add this CSS rule to avoid the broken image icon showing up :
img[src=''],
img:not([src]) {
  opacity: 0;
}
This way, images with no src won't show.
Your code should work in all browsers, but you may want to use some of that Lazy Load plugin's features. For example, it is able to load images only when they're really needed (that is, lazily), not while they're outside of the user-visible area. Your code will try to download all images at the same moment, even if none are needed yet.
Consider adding support for robots or users that do not have JavaScript enabled:
<noscript>
  <img alt="…" src="…"/>
</noscript>
<img alt="…" data-src="…"/>
Actually, browsers by default already load images somewhat lazily. An image at the top of a web page does not suspend rendering of the remaining parts. As image data is fetched from the server, browsers paint the reserved space for images in parallel with rendering the other elements.
Your code generates an image and loads it without attaching it to the DOM. That is not lazy loading; it may be pre-loading in some contexts.
Lazy loading means not beginning to download images from the server until their reserved space is visible in the browser's viewport. It is used in cases where your page is longer than the viewport and there are images positioned in a lower portion of the page. Until you scroll to that position, you don't load the images.
So if you want the benefits of lazy loading, you should choose the plugin option.
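For illustration only, here is a rough sketch of that viewport-based behaviour written with the IntersectionObserver API; this is not the plugin's implementation, just one way to express the same idea in plain javascript:
// Observe every image that has a data-src attribute; when its reserved space
// enters the viewport, copy data-src into src and stop observing it.
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.getAttribute('data-src');
      observer.unobserve(img);
    }
  });
});
document.querySelectorAll('img[data-src]').forEach(function (img) {
  observer.observe(img);
});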

How to force browsers prefetch images in first 6 parallel connections

I've found a lot of techniques to preload images both with CSS and JS, but none of them was able to REALLY preload images the way I need, or more specifically in the order I need.
Simply put, the browser will preload all images and stuff in one block, but the order in which each image will be downloaded is entirely the browser's own decision; mostly (and quite reasonably) the top-most elements in the document will be downloaded first.
Unfortunately this is quite true in my tests with <img> elements only; for other elements with something like background images it's not really like that, even if the images are used as background for the <body> element for example.
The only solution that has worked the way I needed this far was to place some <img> elements right after the <body> tag and set them with style="display:none". This works, but it is a rather ugly and terribly rough way to achieve this, imo.
I'm also a bit concerned for SEO, since bots will find right at the start of the document some hidden images just for this purpose (I'm mostly preloading images for preloaders effects like "loading.." with a small logo image).
I was quite charmed by a super brilliant solution I saw that preloads images with a pseudo element on the body, like body:before, and then uses multiple background images. While this technique does indeed preload, it won't affect the loading order priority... very sad, because it would otherwise be so perfect! :(
Isn't there really any other way to force to preload stuff with top priority over the rest of the assets without cluttering the top of the document with hidden images?
UPDATE
As I explain in the comments below, I'm not concerned with having the images loaded in a particular order; I want them to be downloaded BEFORE most of the other assets and items in the "download" chart, so that the images will render almost instantly in the browser along with the normal CSS layout rendering when the page is accessed.
UPDATE 2
More info: the term "preload" is indeed misleading; a closer term for what I'm looking for might be "prefetch", but basically it goes like this:
Most browsers download 6 requests in parallel at a time, holding up the rest of the downloads. If what you "really" need is in this top 6, you are lucky; otherwise it is in the next 6, or maybe the batch after that, and so on.
What I'm trying to do is find a proper way to tell "hey download this first please" and in particular "this image".
As pointed out by @farkas in the answers below, rel="subresource" is indeed a good start, but as pointed out here, "[..] it's actually downloaded as a lower priority than stylesheets/scripts and fonts but at an equal or higher priority than images", so it will indeed be loaded before many other things, but there is still no proper way to break into those 6 gold top spots.
As you can see, the browser downloaded the styles first, THEN 1 of the images I needed loaded ASAP, then the scripts, leaving out the other 2 images I needed downloaded with top priority.
Note though that I've placed all my scripts not in the head but at the bottom, before the closing </body>, and also (as stated at the top of my question) I've placed my images RIGHT AFTER the opening <body> tag, yet only dark-pattern.jpg was downloaded first; the other 2 were postponed:
...
</head>
<body>
<img src="dark-pattern.jpg" style="display:none">
<img src="preloader.jpg" style="display:none">
<img src="xs-style.png" style="display:none">
.. rest of code
...
I'd like to know a proper way to say, "please save a spot for my pictures; just download the styles, the scripts can come later".
Is it possible to achieve this?
I've found some more detail on this matter here too, but nothing on my specific request apart from rel="subresource":
http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/
http://sgdev-blog.blogspot.it/2014/01/maximum-concurrent-connection-to-same.html
http://andydavies.me/blog/2013/10/22/how-the-browser-pre-loader-makes-pages-load-faster/
PS. If this whole thing has a specific technical name, please let me know so I can finally give the beast a name, thanks :P
Loading images using CSS or JS will always be slower than using HTML. This is because CSS and JS are loaded and parsed after the HTML. In addition, the browser optimizes HTML image loading by using a speculative preload parser.
So if your images are already in the HTML, adding a preload script for those images won't reorder your img downloads.
Therefore you have basically two options:
Load images as soon as possible by adding them at the top (either with your hidden img technique or using link rel="subresource", which is Chrome only)
Delay loading all other non-crucial assets using a lazyloader
Try this:
// pop() takes entries from the end of this list, so the last image listed is fetched first
var all = ['img1.jpg', 'img2.jpg'];
function onAllImagesLoaded() { alert('loaded!'); }
// Fetch the images one at a time, chaining the next request from the previous onload
function preload() {
  var i = new Image();
  var src = all.pop();
  if (!src) {
    onAllImagesLoaded();
    return;
  }
  i.src = src;
  i.onload = preload;
}
preload();
There are some issues with cached images in IE; sometimes it may omit the onload call for cached resources.
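As a sketch of a possible workaround for that IE caveat (an assumption on my part, not part of the original answer), you can attach onload before setting src and fall back to the image's complete flag for responses that come straight from the cache:
function preload() {
  var src = all.pop();
  if (!src) {
    onAllImagesLoaded();
    return;
  }
  var i = new Image();
  i.onload = preload; // attach the handler before setting src
  i.src = src;
  if (i.complete) {   // already in the cache, onload may never fire
    i.onload = null;  // avoid advancing the chain twice
    preload();
  }
}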
Edit
You can also use a nice plugin designed to track the loading of all images (but it doesn't preserve image order; all images are loaded simultaneously):
http://imagesloaded.desandro.com/

Dynamically and asynchronously loading CSS (by setting the "href" attribute in Javascript)

On our site we load stylesheets dynamically based on whether the display is retina or not. Right now, we are using document.write for each <link href="stylesheet.css"> we insert in the page, with different css files if the display is retina.
However, this hurts performance because it causes the css files to load synchronously, as the browser has no way of parsing the javascript to load the next file before the previous one has finished. I believe we can reduce page load time if we take advantage of modern browsers' capability to look ahead and fetch resources asynchronously - in other words, if we load the CSS files in parallel instead.
My current solution is to create a <link id="link-tag-id" href=""> tag for every stylesheet to be loaded, immediately followed by a script which determines the retina status, then fills in the quotations with the appropriate file, along the lines of:
document.getElementById("link-tag-id").setAttribute("href", "retina-stylesheet.css")
This seems to work fine, and when I examine the network waterfalls in Chrome developer tools, as well as on WebPageTest.org (running Chrome, Firefox, and IE), the stylesheets indeed load in parallel. However, it seems a little hacky. I was wondering if there are any dangers to creating a <link> tag with an empty href attribute, and if so, what are they?
On a broader note, are there any other recommendations on how to load CSS dynamically and asynchronously?
Thanks for your help!
EDIT: I just discovered this works too:
document.getElementById("link-tag-id").href = "retina-stylesheet.css"
You could use media queries inside your stylesheet to determine if the display is a retina display, then load in the required CSS.
http://css-tricks.com/snippets/css/retina-display-media-query/
http://mobile.smashingmagazine.com/2010/07/19/how-to-use-css3-media-queries-to-create-a-mobile-version-of-your-website/

Flashing between page loads

On a website, I'm experiencing a "flash" of white that occurs between page loads. It looks bad because I'm using a background image, and when the page loads, the background image flashes before it comes onto the screen (take a look for yourself). This issue occurs in Chrome and IE but not in Firefox.
The site has a way of preloading stuff. Every element on the page is in a div wrapper #website which is initially set to display:none, and every image is in a div wrapper #website-images which is also hidden. The site (using a jquery plugin) then checks whether all the images in #website-images have finished loading. Once they have, a cookie is set to remember that this user has already loaded the images, so the preloading process is skipped when they go to another page or reload the current one, and a call to $("#website").show() is made to display the webpage.
So what could be causing this flickering between the page loads? Is it my way of preloading images? I've added different doctypes, and changed meta information but NOTHING has worked. I'm really lost here, does anyone have any ideas or insights?
This is happening because the DOMContentLoaded event fires some milliseconds before the page actually renders.
In a nutshell, this means you have to optimise your website's speed. This doesn't mean to make it download faster, but it means to download in the correct order, in a non-blocking way.
Step one: Your markup
1)
It seems there is a lot you can do to optimise your markup. Firstly, the order of stylesheets and JavaScripts can be optimised. To ensure CSS files are downloaded asynchronously, you always have to include external CSS before external JavaScript files. style.css is downloaded after some/all of your JavaScript calls.
There is 1 script block found in the head between an external CSS file and another resource. To allow parallel downloading, move the inline script before the external CSS file, or after the next resource.
2)
Your main JavaScript file is inline within your markup. Not only does this block the page download until the script has finished downloading, but having it before your content is probably causing (or adding to) the white flash.
Loading your script asynchronously in the head is my preferred method. You will then have to trigger your script when the DOM has finished loading, or you can achieve the same result by placing the script at the bottom of the body tag.
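A minimal sketch of that second option (an illustration only; main.js is a hypothetical file name, not one from the question). The markup would be <script async src="main.js"></script> in the head, and inside main.js the initialisation is guarded so it runs whether or not the DOM has already finished parsing:
function init() {
  // initialise the page here
}
if (document.readyState === 'loading') {
  // DOM is still being parsed, wait for it
  document.addEventListener('DOMContentLoaded', init);
} else {
  // DOM is already available (the async script arrived late)
  init();
}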
Step two: Harness the browser's capabilities
1) Looking at the http headers, there are 28 items being served as separate HTTP calls, that are not being cached on the browser (including the html pages, jpg images, stylesheets and JavaScript files).
These items are explicitly non-cacheable, and this can be easily fixed by editing your webserver's configuration.
2) Enable gzip compression. Most web browsers (yes, even IE) support gzip decompression, and most (if not all) web servers support compressing with gzip. You could even go overkill and look into SPDY, which is an alternative, lighter HTTP protocol (supported in Chrome and Firefox).
Step three: Content serving
There are around 30 individual items being served from your domain. Firstly, consider how you could reduce this number of requests. 30 HTTP requests per page view is a lot. You can combat this using the following methods:
1) Parallelise downloads across multiple hostnames. Browsers currently limit the number of concurrent connections to a single domain. Serving your images from a separate domain (for example, img.bigtim.ca) can allow them to be served in parallel with other content.
2) Combine multiple items into one. Many items that are downloaded are purely style content, such as the logo, menu elements, etc. These can be combined into a single image (downloaded only once), and split using CSS. This is called CSS spriting. Stack Overflow does this: look here.
3) If you cannot reduce the amount of items needing downloading, you could reduce the load on your server (and in turn, the client's browser) by serving static content from a cookieless domain. Stack Overflow does this with all their static content such as images, stylesheets and scripts.
Step four: Optimise your own code
There's only so much that HTTP and browser technology can do to help your website's speed. This last step is down to you.
1) Is there any reason you choose to host jQuery yourself? jQuery's download page shows multiple CDNs you can point to for speedy, cached script downloading.
2) There are currently over 20 unused CSS rules within your stylesheets (that's 36% of your entire CSS file). Have a re-think of what is really needed.
3) The main chunk of JavaScript (at the top of your body tag) seems to be a hack to attempt to speed things up, but is probably not helping anything.
A cookie is being set to specify whether or not the page has faded in yet. Not only are you using JavaScript to perform a transition which can happily be performed by CSS, but more than half of the script is used to define the functionality for reading and writing the cookie.
Seeing things like this: $("body").css ("background-image", "url('images/background.png')"); and $("#website").show (); usually gets me ranting about "separation of concerns", but this answer is long enough now so hopefully you can see that it is bad practice to mix style and functionality in the same code.
Addendum: Looking at the code, there is no need for jquery at all to perform what you are doing. But then again, there is no need to perform what you are doing, so you could probably do better without any JavaScript at all.
Move your javascript to the end of the html, just before the closing body tag. Sometimes it helps.
I know this is an old thread, but here is a hack I tried that works.
The idea is not to display anything until the CSS has loaded completely.
In the html file:
<body style="display:none">
in your CSS, the last line:
body{display:block !important}
CSS is render-blocking.
Divide your CSS into 2 parts -
Critical CSS
Non-Critical CSS
Make Critical CSS load with the page. It should come embedded within the head tag.
Make Non-critical CSS lazy load via ajax.
This will result in a serious performance optimization of your webpage, leading to less white-screen time.
Also, you can consider loading your Javascript in an async/defer way.
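A minimal sketch of the ajax-style lazy load for the non-critical CSS (an illustration only; non-critical.css is a hypothetical file name): fetch the stylesheet after the page has loaded and inject it into a style element so it never blocks the initial render.
window.addEventListener('load', function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', 'non-critical.css'); // hypothetical file name
  xhr.onload = function () {
    var style = document.createElement('style');
    style.textContent = xhr.responseText;
    document.head.appendChild(style);
  };
  xhr.send();
});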
