Recently I tested my site for mobile friendliness, mobile speed, and desktop speed. I was shocked by my desktop and mobile speed results, which were very poor: 48/100 and 40/100 respectively, with the error,
Eliminate render-blocking JavaScript and CSS in above-the-fold content
I then removed the unwanted content being loaded on my page and also added a defer attribute to my script tag. With these changes the error went away for desktop, and my score increased to 82/100 for desktop and 68/100 for mobile.
That's fine so far, but the problem is that the same error still remains for my mobile speed:
Eliminate render-blocking JavaScript and CSS in above-the-fold content
I don't know why the error still remains for mobile speed when it was fixed for desktop. Can anyone please help me with a suggestion? Thanks in advance.
When you put CSS files in your <head> section, the browser can't render the rest of the page until it has fetched those CSS files from your server.
Google already says in their article that if you have small CSS files, you should put them inline in the head section, so the browser doesn't need to make another HTTP GET request to your server and can use the CSS directly from your <style>.yourcsscode{}</style> tag. And if your CSS file is big, use JavaScript to load it asynchronously (an example is already included in the above link).
For JavaScript, use the async attribute like this: <script async src="my.js"></script>, and if your JavaScript is small, inline it so you can save another HTTP GET request.
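For example, a minimal sketch of both ideas (the selector and file name are just placeholders):

<head>
  <!-- small critical CSS inlined, so no extra request is needed -->
  <style>
    .header { background: #fff; }
  </style>
  <!-- JavaScript loaded with async so it does not block rendering -->
  <script async src="my.js"></script>
</head>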
The whole point here is that your browser has to wait before loading other resources until it has received the CSS and JavaScript files from the server. Rahul provided many points which don't solve this particular problem, but they are useful if you want to optimize your page speed further.
Don't include your JS files inside the <head> tag; instead include them at the end of the body tag.
I recommend you compress and minify all your JS into one file and all your CSS into one file to reduce HTTP requests, and add expiry headers using .htaccess.
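For example, a rough .htaccess sketch using mod_expires (the module needs to be enabled on your server, and the lifetimes are just examples):

<IfModule mod_expires.c>
    ExpiresActive On
    # serve CSS and JS with a far-future expiry so returning visitors hit the cache
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>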
This might help you: https://varvy.com/pagespeed/render-blocking-css.html
Additionally, I suggest you refer to this link: https://developer.yahoo.com/performance/rules.html
Make Fewer HTTP Requests
Use a Content Delivery Network (CDN)
Add Expires or Cache-Control Header
Gzip Components
Put Stylesheets at Top
Put Scripts at Bottom
Avoid CSS Expressions
Make JavaScript and CSS External
Reduce DNS Lookups
Minify JavaScript and CSS
Avoid Redirects
Remove Duplicate Scripts
Configure ETags
Make Ajax Cacheable
Flush Buffer Early
Use GET for Ajax Requests
Postload Components
Preload Components
Reduce the Number of DOM Elements
Split Components Across Domains
Minimize Number of iframes
Avoid 404s
Reduce Cookie Size
Use Cookie-Free Domains for Components
etc
I uploaded my project to my server, but no one can see the changes I made locally (I have index.html plus other JS and PHP files). I had the same problem with another project with index.php, but solved it by adding <?php time();?> at the end of the script src. Is there any similar solution for JavaScript?
This is what I did:
<script src="assets/js/funciones.js?<?php time();?>"></script>
The problem is that you're changing static files, but not their filenames.
By default Apache/Nginx/etc. serve static content with headers that say "cache this for a very long time", because it's static content; why would you not?
Tacking random trash onto the URL like you're doing with your JS is a kludge that permanently breaks all caching and ensures that users will repeatedly download the exact same static file every time they request a page. You can make the trash less random to break the cache less frequently, but it's still an inefficient kludge. [Albeit a popular one, to my immense annoyance.]
Ideally for resource bundles like JS and CSS, you make a new resource bundle file every time you change it, eg: somefile-v1234.js or somefile-20211007.js and update the reference in your HTML source. This has the side-benefit of ensuring that the versions of your resource bundles always match.
The same goes for any other static file: images, CSV, etc.
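For example, the HTML reference would simply point at the new bundle name after each change (the name below is only an illustration):

<!-- regenerate the file name whenever the bundle changes -->
<script src="assets/js/funciones-20211007.js"></script>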
The trouble you're having now is that you've updated some HTML pages and the only way to break the cache is to have the user perform an action, like hitting CTRL+F5 to force a refresh.
There are a couple ways around this:
Change the Apache/Nginx/etc settings to set shorter expiries for static file cache headers. You may be able to target specific files like index.html, but YMMV.
Serve the content with PHP. Anything served via a PHP script will not have any cache headers set by default, as the assumption is that the output of a script is dynamic. You can also issue the caching headers yourself in PHP to control what gets cached for how long.
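A minimal PHP sketch of that second option, assuming a hypothetical wrapper around a static file (the lifetime values are illustrative):

<?php
// tell the browser how long it may cache this response
header('Cache-Control: public, max-age=300');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT');
// serve the actual content
readfile(__DIR__ . '/index.html');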
Lastly, you cannot solve this problem retroactively. If a user has a cached version of the HTML page that has not yet reached its expiration, the user MUST take action to break that cache. There's nothing that can be done server side because the valid cache tells the client that it doesn't have to ask the server.
Once you get to the point of your application being popular enough to warrant putting a CDN in front of it this problem gets much worse as now there's a cache in the middle that the user doesn't have control of, and it's potentially an expensive problem because some CDN providers charge a fee for forcing CDN cache invalidations.
Here is an image of my recent webpage test:
Is there any way to start as many http requests as early as possible? For example, the google font file requests start very late. Similarly, I want to move the jQuery request to as early as the script.min.js request which is hosted on the domain. Basically, I am looking for any way to make these requests more efficient.
First of all, make sure the requests are made straight from the HTML whenever possible, and not dynamically via JavaScript. Put your JS and CSS requests into <head> if possible. That way, the browser's preload scanner will request those files as soon as possible.
Be careful though about where the CSS is placed - see https://csswizardry.com/2018/11/css-and-network-performance/ for more.
If you can't put everything into <head> of the HTML, you're looking for <link rel="preload">, which was created exactly for that purpose; it tells the browser to download some resources, but not execute them yet.
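A minimal sketch (the file paths are placeholders; the as attribute must match the resource type, and fonts need crossorigin):

<head>
  <!-- fetch these early, without executing or applying them yet -->
  <link rel="preload" href="/js/script.min.js" as="script">
  <link rel="preload" href="/fonts/myfont.woff2" as="font" type="font/woff2" crossorigin>
  <!-- the normal references below still decide when they are actually used -->
  <script src="/js/script.min.js" defer></script>
</head>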
Some literature:
https://developer.mozilla.org/en-US/docs/Web/HTML/Preloading_content
https://developers.google.com/web/updates/2016/03/link-rel-preload
https://medium.com/reloading/preload-prefetch-and-priorities-in-chrome-776165961bbf
https://caniuse.com/#feat=link-rel-preload
Note though that as of late 2018 it's disabled in Firefox, with no ETA for when it will ship.
Also, be careful when using preload, as preloaded requests get the highest priority, which sometimes might mean that other critical but non-preloaded resources wrongly get deprioritized.
In some cases (depending on browser, order of entries in HTML etc.), preload can lead to double fetches (particularly for webfonts). Make sure to check for that.
Here's the scenario, not sure what I'm missing.
Page A.htm makes an ajax request for page B.htm, and inserts the response into the page.
Page B.htm contains links to several other JS files, many of which contain a document.ready() function to initialize them.
This works fine when A.htm and B.htm are on the same server but not when they are on different servers.
What I think I'm seeing here is that when pages A and B are on different servers (cross-domain AJAX), the external resources are being returned asynchronously, or at least out of order, so scripts that expect jQuery UI to already be loaded are executing when it is not.
Appreciate any pointers or advice. Apologies for the poor explanation.
You are injecting HTML + script tags via jQuery. In this case *:
HTML content except scripts are injected in the document
Then all scripts are executed one by one
If a script is external then it is downloaded and executed asynchronously
Therefore an external or inline script that depends on jQuery UI might execute before jQuery UI.
One possible solution is to change the way your pages work:
Get rid of external scripts in pageb.html but keep inline scripts
Load the required scripts in pagea.html
Load pageb.html
Another solution is to roll your own jQuery function that will:
Strip all <script src> elements from HTML
Download and execute those scripts in order
Inject the remaining HTML
* The exact behavior is not documented. I had to look into the source code to infer the details.
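A rough sketch of that second approach, assuming jQuery is available in page A; the function name and details are only for illustration:

function loadFragment(url, $target) {
    $.get(url, function (html) {
        // parse the response without executing anything yet
        var container = document.implementation.createHTMLDocument('').body;
        container.innerHTML = html;
        // collect the external script URLs, then strip those tags from the fragment
        // (note: relative URLs will resolve against page A here)
        var urls = $(container).find('script[src]').map(function () {
            return $(this).attr('src');
        }).get();
        $(container).find('script[src]').remove();
        // download and execute the external scripts strictly in order
        var chain = $.Deferred().resolve();
        $.each(urls, function (_, src) {
            chain = chain.then(function () {
                return $.ajax({ url: src, dataType: 'script', cache: true });
            });
        });
        // finally inject the remaining HTML (inline scripts run at this point)
        chain.then(function () {
            $target.html(container.innerHTML);
        });
    });
}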
You are correct in your impression that the issue is a difference in how the requests are handled cross-domain.
Here is a link to get you on the right track: How to make synchronous JSONP crossdomain call
However, you will have to actually re-architect your solution somewhat to check whether the resource has been loaded before moving on. There are many solutions (see the link).
You can set a timer interval and check for something in the DOM, or another reasonable solution (despite its lack of efficiency) is to create a "proxy" server-side file (e.g. PHP) on your server and have that file do the cross-domain request, then spit out the result.
Note that since jQuery UI is a rather large file, it's conceivable that the cross-domain request finishes first and executes immediately, even though jQuery UI is not loaded yet. In any case, you're going to have to start thinking about having your app react rather than follow a sequence.
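As a minimal illustration of the timer-interval idea mentioned above (the names are hypothetical):

var timer = setInterval(function () {
    // wait until jQuery UI has actually arrived before running dependent code
    if (window.jQuery && jQuery.ui) {
        clearInterval(timer);
        initWidgets();   // your jQuery UI dependent initialization
    }
}, 50);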
I want to implement caching of the JavaScript files (Dojo Toolkit) which are not going to change. Currently my home page takes about 15-17 seconds to load, and on refresh it takes 5-6 seconds. Is there a way to reuse the cached files when the page is loaded in a new browser session? I do not want the browser to make requests to the server when the application home page loads in a new browser session. Also, is there an option to set the expiry to a certain number of days? I tried with a META tag and it's not helping much; either I'm doing something wrong or I'm not implementing it correctly.
I have implemented the Dojo compression toolkit and see a slight improvement in performance, but nothing significant.
Usually your browser should do that already anyway. Please check that caching is really turned on, and not only for the session.
However, with a custom Dojo build, your app profile defines layers, and the build puts all your code together and bundles it with dojo.js (the files are still available independently). The result is just one HTTP request for all of the code (a larger file, but requested just once). The speed gained from the reduced number of HTTP requests is much more than a cache could ever provide.
For details refer to the tutorial: http://dojotoolkit.org/documentation/tutorials/1.8/build/
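A rough sketch of such a build profile, where app/main stands in for your own top-level module (see the tutorial for the full set of options):

var profile = {
    basePath: "./",
    releaseDir: "release",
    layers: {
        "dojo/dojo": {
            // bundle the loader, dojo base and your own modules into a single dojo.js
            include: [ "dojo/main", "app/main" ],
            boot: true,
            customBase: true
        }
    }
};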
The caching is done by the browser, whose behaviour is influenced by the Cache-Control HTTP header (see http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html). Normally, the browser would ask whether a newer version of a resource is available, so you get one short request for each resource anyway.
From my experience, even with very aggressive caching, where the browser is instructed not to ask the server for a newer version of a resource for a given period of time, checking the browser cache for such an immense number of resources is a costly process.
The real solution is custom builds. You have written something about "dojo compression", so I assume you're acquainted with Dojo build profiles. They are rather poorly documented, but once you're successful with them, you should have some big file(s) with layer(s), in the following format:
require({cache:{
    "name/of/dojo/resource": function(){ /* ... here comes the content of the JS file ... */ },
    ...
}});
It's a multi-definition file that inlines the definitions of all modules within a single layer. So loading such a file will load many modules in a single request. But you must load the layer.
In order to get the layers to run, I had to add an extra require to each of my entry JS files (those that are referenced in the header of the HTML file):
require(["dojo/domReady!"], function(){
// load the layers
require(['dojo/dojo-qm',/*'qmrzsv/qm'*/], function(){
// here is your normal dojo code, all modules will be loaded from the memory cache
require(["dojo", "dojo/on", "dojo/dom-attr", "dojo/dom-class"...],
function(dojo,on,domAttr,domClass...){
....
})
})
})
It has significantly improved the performance. The bottleneck was loading a large number of little JavaScript modules, not parsing them. Loading and parsing all modules at once is much cheaper than loading hundreds of them on demand.
I wonder if anyone has found a way to send the head tag mid-render so CSS and JavaScript are loaded before the page render has finished. Our page takes about 523 ms to render, and resources aren't loaded until the page is received. I've done a lot of PHP, where it is possible to flush the buffer before the end of the script. I've tried adding a Response.Flush() at the end of the master page's Page_Load, but the page layout is horribly broken afterward. I've seen a lot of people using an UpdatePanel to send the content via AJAX afterward, but I don't quite know what impact that would have on SEO.
If I don't find a solution, I guess I'd have to go the reverse-proxy route and find a way to invalidate the proxy cache when the page content changes.
Do not place the Flush in code-behind, but in your HTML page, like this:
</head>
<%Response.Flush();%>
<body >
This can cause something like a flickering effect on the page, so you can try to move the flush a little lower in the page.
Also see the Yahoo tips page, under "Flush the Buffer Early":
http://developer.yahoo.com/performance/rules.html
Caching static content
Additionally, you can add client-side caching for static content like CSS and JavaScript. This page has all the options for all IIS versions:
http://www.iis.net/ConfigReference/system.webServer/staticContent/clientCache
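For example, a minimal web.config sketch (the max-age value is just an example; see the link above for the full set of attributes):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- cache static files (CSS, JS, images) on the client for 7 days -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>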
Follow up
One more thing I suggest you do, after seeing your pages, is to place all CSS in one file and all JavaScript in one file. Also use a minifier to reduce their size.
I use this minifier, http://www.asp.net/ajaxlibrary/Download.ashx, with very good results and real-time minification.
Consider using a content delivery network (CDN) to host your images, CSS and JS files. Browsers have either a four or eight connection limit per domain, so once you use those up the browser has to wait for connections to be freed up.
By hosting some files on the CDN you get another set of connections to use concurrently, allowing everything to load faster.
Also consider enabling GZIP on your server if you haven't already. This compresses files on the fly, resulting in smaller transfers.
You could use jQuery to execute your JS as soon as the DOM is ready:
$(document).ready(function(){
    // Your code here
});
Or you could just use a standalone ready function, a $(document).ready equivalent without jQuery.
You could do a fade-in or show once the document has loaded; just set the body to display:none; initially.
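A small sketch of that idea, assuming jQuery is available (the fade duration is arbitrary):

<style>
  /* hide the page until everything is ready */
  body { display: none; }
</style>
<script>
  $(document).ready(function () {
      $('body').fadeIn(200);   // reveal the page once the DOM is ready
  });
</script>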