Reflow large numbers of images (600+) smoothly - javascript

I'm working on a webapp that populates a page with a large number of images.
It is similar in layout to an image search, in that the user chooses some settings at the top of the page and the page is instantly filled with images that correspond to her choice.
Sometimes up to 600+ images are generated in one callback. They are img elements with data URIs that contain embedded SVG.
My question is about performance. The JavaScript itself completes very quickly, but once the 600 or so images have been added to the DOM, the browser can take up to FIFTEEN SECONDS to complete all the reflows. The browser freezes for this time, menus become unresponsive, and no change is shown until the reflows are complete. (Chromium version 66)
Is there a way for me to remove this bottleneck and make all the images reflow quicker? E.g. a CSS approach that makes the layout trivial? If I made all the images the same size, would that help?
If it can't be sped up, is there a way to make reflows happen without blocking user interaction with the page?
Thanks for your help!

SOLUTION: The reason the browser was freezing during all the reflows was that my code was executing synchronously, and was therefore tying up the browser.
Making the code asynchronous by doing a setTimeout(function, 0) for each image, where the function adds the image to the DOM, solved the problem.
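For reference, a minimal sketch of that batching approach (the container element and the imageSources array are illustrative placeholders, not the original code):

    var container = document.getElementById('image-container'); // hypothetical id

    imageSources.forEach(function (src) {
        // Queue each insertion as its own task, so the browser can paint
        // and handle input between reflows instead of freezing.
        setTimeout(function () {
            var img = document.createElement('img');
            img.src = src; // e.g. a data URI containing embedded SVG
            container.appendChild(img);
        }, 0);
    });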

Related

How to tell the browser to do the JavaScript-manipulated webpage scaling before any content starts to show up?

I am building a website for myself and I care about performance, speed, user experience and a lot more. When I started I didn't know much, so I made an initial design in Adobe XD, and using the webexport plugin I got an HTML file with 2k+ lines of code that scales to fit any screen and proportionally resizes everything. I think I made a standard 1080p artboard, but somehow it ended up way too big, 3554px by 2216px. So the webpage is now that big, with scaling running on it. I don't think that's too big for a browser to handle.
Things were fine before the Chrome 96/97 update. Since then, Chrome and any browser running on Chromium just can't scale my webpage fast enough to fit the screen. I see big divs getting smaller, and it takes around 500 ms to settle on a decent setup. A lot of variables affect the performance: if the machine and network are slow, the scaling takes a couple of seconds, which is far too long.
Is there any way to hint the browser to do the scaling first and only then make the page visible? Older versions of Chrome did this right. I can make the page visible after 600 ms using setTimeout, but that isn't really what I want.
CSS
will-change: transform;
Tried this, but it has bad side effects like blurry text on my content. I don't know why the page pushes the browser to its limit sometimes; maybe I can optimize it better, but these browsers should also be more capable (Safari is the worst - it can't even render the webpage).
This is the home page
https://elomymelo.com/
You should see the big page fit inside the screen; it'll probably take around 200 ms, which is pretty fast.
Proportional resizing when the window is resized also works.
This is a heavy article page
https://elomymelo.com/oneplus%20nord%20review.html
I tried to hide the scaling effect on this page, but I'm not satisfied, because browsers still mess it up sometimes. I need the JavaScript scaling done much faster, so that visitors won't see that weird transition.
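One way to approximate what the question asks for - revealing the page only after the scaling has actually run, rather than after a fixed 600 ms - is sketched below; scalePage is a placeholder for the site's own exported scaling routine, not a real function:

    // Keep the document invisible until the scaling code has executed.
    document.documentElement.style.visibility = 'hidden';

    window.addEventListener('DOMContentLoaded', function () {
        scalePage(); // placeholder for the scale-to-fit logic
        // Reveal on the next frame, after the new layout has been computed.
        requestAnimationFrame(function () {
            document.documentElement.style.visibility = '';
        });
    });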

Chrome - background-size: cover;-based gallery causes performance issues

I'm building an app using Vue 2. There's a page with a simple image gallery containing about 20 images, plus one large page background image.
The page's background image is contained in a div element that is position: fixed;, has 100% width and height, and uses background-size: cover; to display the image.
All of the ~20 gallery items likewise use div elements with background-size: cover;, displayed in a 3-column grid, and the images are set through a dynamically generated background-image CSS property from a Vue computed property. The image paths never change, so they aren't being recomputed constantly AFAIK.
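For context, a minimal sketch of that kind of binding (Vue 2; the names and paths are illustrative, not the asker's actual code):

    new Vue({
        el: '#gallery', // hypothetical mount point
        data: {
            images: ['/img/photo1.jpg', '/img/photo2.jpg'] // static paths
        },
        computed: {
            // Recomputed only when `images` changes, so the URLs stay stable.
            itemStyles: function () {
                return this.images.map(function (src) {
                    return {
                        backgroundImage: 'url(' + src + ')',
                        backgroundSize: 'cover'
                    };
                });
            }
        }
    });
    // Template: <div v-for="style in itemStyles" :style="style"></div>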
The performance of this page in Chrome is abysmal, loading takes forever (those are some high resolution images, though, 4K in width) and once the images are loaded I can somewhat interact with the page but everything is extremely laggy and sometimes the page stops responding completely.
On the other hand, in Firefox and Edge everything is basically buttery smooth, both during the loading of the images and during the scrolling/rendering. The interaction with the app is never blocked.
To fix this, I remember trying to replace the div elements with regular img tags and loading the images through those. I've also tried caching the images and using static image URLs for testing purposes - the same thing happens every time: other browsers handle it well, Chrome chokes on it.
Here's a screenshot of the performance summary over ~30 seconds, from the moment I click the page URL to the moment that basically everything is loaded, while the browser is still struggling to render anything and process any interaction with the page. Obviously the painting phase is an issue here.
Any advice? Thanks!
This question is hard to answer as it is a very specific problem.
You can analyze what is causing this behaviour in Chrome by using the developer tools:
Open the developer tools in Chrome by pressing Ctrl+Shift+I.
Go to the "Performance" tab.
Hit the record button and perform the action that has low performance.
Hit stop and you get a breakdown of what happened in the background.
If you don't find the issue with the above suggestions, you can post a detailed screenshot of the performance breakdown here; I might be able to help you with it.

Image loading seems to slow down JavaScript execution

I am working on a straightforward web app, written purely in JavaScript.
One of the core functionalities is loading and viewing images.
When a lot of big images are loaded, the script execution often slows down or even halts until some of them are done loading; this is especially noticeable with large .gifs (HTML5 video is not as bad, for some reason).
The images are loaded by setting the background-image CSS attribute of divs with jQuery's .css(); there is no blocking event or sleep/wait until the images are loaded.
Weirdly, on OSX, scrolling (with the MacBook trackpad) temporarily relieves the halt/slowdown, even in fullscreen (OSX browsers leave wiggle room for the trackpad), which makes me think it's a rendering or resource-allocation problem of some sort. It feels like the browser sees no need to redraw and is only forced to do so by the scrolling.
I'd like to force it to redraw constantly, 60 FPS.
The issue is with loading a lot of big images and showing them in the application.
You can try the image lazy-loading concept, where images are only loaded/fetched as they scroll into view.
Maybe this link would be helpful. This plugin will take care of loading images when the user scrolls, and it's very easy to use.
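A dependency-free sketch of the same lazy-loading idea, assuming each image carries its real URL in a data-src attribute:

    var observer = new IntersectionObserver(function (entries) {
        entries.forEach(function (entry) {
            if (entry.isIntersecting) {
                var img = entry.target;
                img.src = img.dataset.src; // swap in the real URL on first view
                observer.unobserve(img);   // each image only needs this once
            }
        });
    });

    document.querySelectorAll('img[data-src]').forEach(function (img) {
        observer.observe(img);
    });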
You could also try the concept of Web Workers - multithreading in JavaScript.
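As a sketch of that idea - fetching and decoding an image off the main thread - assuming a browser that supports createImageBitmap in workers:

    // worker.js - runs in a separate thread
    self.onmessage = function (e) {
        fetch(e.data.url)
            .then(function (res) { return res.blob(); })
            .then(function (blob) { return createImageBitmap(blob); })
            .then(function (bitmap) {
                // ImageBitmap is transferable: handed over without copying.
                self.postMessage(bitmap, [bitmap]);
            });
    };

    // main.js - draw the decoded bitmap when it arrives
    var worker = new Worker('worker.js');
    worker.onmessage = function (e) {
        canvasContext.drawImage(e.data, 0, 0); // canvasContext is assumed
    };
    worker.postMessage({ url: 'big-image.jpg' });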
I would like to add: don't forget image optimization.

Is it possible to reduce the memory allocated by Google Chrome for my web page?

I am trying to make a website with a minimalist feel, so I put a fullscreen image as the background-image of body. I have a transition: background-position 1s rule set on body, with an easing function, to create a smooth scrolling effect when going to other pages in the same HTML file (there is no actual scrollbar, just navigation elements). The thing I noticed was that once I started scrolling, the memory reserved by the page went from a modest 77 MB to over 500 MB! I tested this in Firefox, where it doesn't seem to happen (either because pages don't have separate processes or because memory allocation works differently, I imagine). Why does this happen in Google Chrome and not in other browsers? And how can I reduce the enormous amount of memory reserved by my page?
To give some information on what I am using:
Browser: Google Chrome
RAM: 8 GB
The page uses JavaScript with the following plugins:
jQuery
Bootstrap
Background image dimensions: 1440 x 540
A few possible causes of the problem:
The image is too big to be rendered with a transition and an easing function.
I should not use background-image for this, but create a new <img/> as a background.
I somehow only checked it with developer tools open in Chrome, increasing the memory allocated.
It's not the image causing the problem, but the web fonts I scroll simultaneously with the background image, also using transition and an easing function.
And I want to add that maybe this is not even a problem after all; it's just that I have never seen a page go over 300 MB of allocated memory.
Not sure if this should be posted as a comment instead, however:
Nothing you can do about that; Chrome trades RAM for performance.
For instance, each Chrome tab runs in its own process for that purpose.
I have seen cases where a single YouTube tab took over 1.5 GB of memory and virtual memory.
Even a blank tab takes 45 MB of memory for me.
So there is no issue with your code at all.

GET request made after window.onload is very slow, blocks page scrolling - performance analysis

I have a widget that is inserted on numerous Web pages. It's composed of some JavaScript that loads an HTML document from my server (as JSONP) which is then inserted into a dynamically created <iframe> on the page where the widget is deployed.
I use Clicky for analytics/tracking to measure the number of pageviews that my widget's host page receives. Recently, I needed to go a bit further, to track the number of actual views of the widget itself. The purpose of this data is to more accurately interpret the performance of the widget at generating clickthroughs - that is, if a visitor doesn't scroll down the host page far enough to see my widget in the first place, there's no way that I could have inspired a clickthrough.
To achieve this tracking, I wrote a function that subscribes to the browser's "onscroll" event. Basically, each time it's called, it compares the distance between the top of the host page document and the top of the widget against the distance the viewport has scrolled down from the top of the host page, plus the height of the viewport and half the height of the widget. When the latter exceeds the former, the widget can be assumed to be halfway visible in the browser viewport.
When the function determines that this has happened (the widget needs to remain in the viewport for 2 seconds or more to count), it logs an "action" to Clicky, i.e., informs the analytics software that this has happened. This is done by calling a predefined function that loads an "image" from Clicky's server - basically a way to use a cross-domain GET request to communicate some tracking data.
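A simplified sketch of that visibility check, using getBoundingClientRect rather than the manual offset math (the element id and the logWidgetView call are illustrative):

    var viewTimer = null;

    window.addEventListener('scroll', function () {
        var rect = document.getElementById('widget').getBoundingClientRect();
        // Halfway visible: the widget's vertical midpoint is in the viewport.
        var midpoint = rect.top + rect.height / 2;
        var halfwayVisible = midpoint >= 0 && midpoint <= window.innerHeight;

        if (halfwayVisible && viewTimer === null) {
            // Only log a view after 2 continuous seconds in the viewport.
            viewTimer = setTimeout(logWidgetView, 2000);
        } else if (!halfwayVisible && viewTimer !== null) {
            clearTimeout(viewTimer);
            viewTimer = null;
        }
    });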
The problem is that this request takes time - on average, a little over a second - to complete, and during that time, the browser window can't be scrolled. This is a showstopper for me. A slight delay - ideally well under a half second - is acceptable, but nothing approaching a second will work.
I've done my best to analyze the data that various performance tools generate (Firebug's Net Panel, Google Page Speed), but I'm at a loss to explain what is happening.
I would be extremely grateful to anyone who can provide some insight into what is happening, or better yet, share possible solutions to reduce or eliminate the blocked browser scrolling. The time to fulfill the request is unimportant to me, but the amount of time that the scrollbar is "stuck" is critical. For example, is there a way to make this request to Clicky without interrupting the browser's scrollbar functionality?
As a proof-of-concept of my code, I had created a prototype, viewable here:
http://troy.onespot.com/static/3128/prototype.html
When you scroll the page down until the middle of the gray box enters the viewport for 2 seconds or more, an indicator that a "widget view" has been logged will appear on the top-right of the screen.
(I've only tested this code in Firefox 3.0 and more recent versions - in fact, aside from possibly Safari, it's unlikely to work elsewhere, as it doesn't account for cross-browser differences in dimension properties.)
Also, here is a screenshot of Google's Page Speed tool's output during this logging:
http://img.skitch.com/20100121-t6bt1wauaar2drg1xdmwk9g4sb.png
To generate this, I scrolled/jiggled the page constantly as I eased the gray box into the viewport. The function fired by the "onscroll" event can be seen working repeatedly as a broken black line across the top of the output. As you can see, as soon as the Clicky logging happens (the large gap in the broken black line), approximately 1.2 seconds elapse where scrolling is not possible. I have no idea what is happening during the empty span in the latter half of that period, nor do I really understand why the entire period prevents scrolling.
Firebug's Net Panel shows a shorter period of elapsed time (though it still feels like a second or more, subjectively):
http://img.skitch.com/20100121-pwf1ifngffsnqm8qekmm8wp9mt.png
In this case, the vast majority of the time (544 ms) is spent in the "Blocking" stage, which makes no sense to me; my understanding was that this stage is only encountered when a request sits in a queue because the maximum number of requests per hostname is already in flight.
Any ideas, suggestions, or other insight would be very much appreciated. Thanks!
Set the timer to 1, not 0, in the clicky_custom config object. There is a bug in their code: if the timer is 0, it waits 500 ms.
I found this out using Firebug's profiler - their init function was taking all the time. The code was something like timer = config.timer || 500; config.timer evaluates to false when it is 0, so 500 was returned.
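The failure mode described is the classic falsy-default pattern; a minimal reconstruction (not Clicky's actual source):

    var config = { timer: 0 };

    // `||` falls back to the default for ANY falsy value, including 0.
    var timer = config.timer || 500; // 0 is falsy, so timer becomes 500

    // Setting the option to 1 sidesteps the fallback:
    config.timer = 1;
    timer = config.timer || 500;     // now 1, as intended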
