ExtJS: is it possible to compress ext-all.js, or load only part of it? - javascript

I have a website that uses the ExtJS library. Specifically, I only need the grid, ajax, and tree components.
The project is used nationally, and to avoid problems caused by low bandwidth in some regions I have to make it as light as possible.
When I profile the site with the developer tools in Chrome, it is far too heavy, especially the load of ext-all.js: it takes 3.9 minutes to download (at 512 kbps), even when I remove my own images and CSS from the website.
Is there a way to compress it? Or to just load the tree, grid and ajax components?
I did some googling and found this:
<script type="text/javascript" src="js/ext-all.js?compression=gzip"></script>
but it didn't help much.
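For what it's worth, a ?compression=gzip query string does nothing by itself; gzip has to be enabled on the server so that the file is sent with a Content-Encoding: gzip response header. As a hedged illustration only, if the site were served by Node/Express, the standard compression middleware would look roughly like this (the package names are real, but the file layout and port are assumptions, not part of the original setup):

// server.js - minimal sketch, assuming Node with the express and compression packages installed
var express = require('express');
var compression = require('compression');
var app = express();
app.use(compression());               // gzip-compress responses, including js/ext-all.js
app.use(express.static('public'));    // 'public' is a placeholder for the web root
app.listen(8080);

On Apache or IIS the equivalent is mod_deflate or dynamic compression, again configured on the server rather than in the script tag.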

This is the page that shows you how to build custom versions of Ext JS: http://www.sencha.com/learn/Tutorial:Building_Ext_From_Source
They had a link to an online builder that would customize the download, but it has been taken down. The page still points to good resources like JSBuilder, the tool they use to generate ext-all.js and the other packages in the distribution. Just open ext.jsb to see how it works.
You'll need to figure out the dependencies on your own, though. Good luck!
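For orientation only, a trimmed-down package definition in the JSON-based .jsb2 format used by later Ext releases looks roughly like the sketch below. The exact schema, source file names, and paths vary by Ext version, and each widget pulls in many more dependencies than shown, so treat every entry here as a placeholder rather than a working build file:

{
  "projectName": "Ext Custom",
  "deployDir": "ext-custom",
  "pkgs": [{
    "name": "Grid, tree and ajax only",
    "file": "ext-custom.js",
    "fileIncludes": [
      { "text": "Connection.js", "path": "src/data/" },
      { "text": "GridPanel.js",  "path": "src/widgets/grid/" },
      { "text": "TreePanel.js",  "path": "src/widgets/tree/" }
    ]
  }]
}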

I'd estimate that at 512 kbps, the ExtJS load should take around 30 seconds to 1 minute.
If you're looking at a 4-minute load time, the time is probably being spent somewhere other than the download of the library. Are you sure it's the size of the download, or even ExtJS, that is the problem? Could it be that your web server is under heavy load, or that you're dealing with a latency issue?
As far as reducing the size of the library goes, there isn't much more you can do. The library is provided in minified form, and stripping it further is not recommended. Zipping it up means it has to be unzipped at the other end once downloaded, and it doesn't buy you that much load time for a library that is already minified.

Related

Split very large javascript file

I'm working on a web project that uses WebGL content generated with Unity. When the required JS files are loaded, the browser freezes for around 30 seconds. The main JS file is 35 MB unzipped, so this seems to be the cause.
I want to avoid this freeze if possible, but I couldn't manage it using Web Workers since the script needs access to the UI. My other possible solution is to split the JS file into smaller ones, but I don't know how to do that. Do you have any suggestions?
If you add async to your script tag, like this: <script async src="app.min.js"></script>, it will no longer block rendering. Caching the script in the browser or delivering it from a CDN can also help reduce the download time.
35 MB is, however, way too much for a website. Are you sure there isn't a lot of unused material, such as libraries, in it?
We recently wrote an article with web performance best practices, with explanations of the critical rendering path and other frontend concerns, here.
35 MB just for the JS file seems ridiculous. It could be that the entire build (textures, media, etc.) is of that size. Have a look here on how to reduce the build size.
Though 35 MB is way too much for a JS file, you can start with the following pointers:
Create utilities and reuse code. This can be done at any level, be it a generic component (HTML-generating code) or validation logic: if it can be configured using arguments, make it a function and reuse it.
If you have hard-coded JSON in your JS, move it to .json files and load them only when they are required.
Split files based on the sections in your view. In SPAs there are cases when a section is not visible; for such cases, don't load those files. Spread your code base from one file into many.
If you have a lot of event listeners, move them to a different file. You can have section_event.js, section_data.json, section_utils.js and section_index.js. If a lot of data parsing is involved, you can even have section_parser.js.
The basic idea is to split the code into multiple files and make it more reusable. You can also look into script-loader libraries to reduce your load.
Also, load a resource only when it is required. SPAs have stages; load the relevant files when they are needed, turning one big download into a partial, on-demand approach (see the sketch below). Also look into webpack, grunt or gulp to minify your JS.
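As a rough illustration of the on-demand idea above, the data for a section can be fetched only when that section is opened instead of being shipped inside the main bundle. All names below (section_data.json, renderSection) are hypothetical placeholders:

// Hypothetical sketch: fetch a section's data only when the user opens it.
function openSection() {
  fetch('data/section_data.json')                          // placeholder path to the extracted JSON
    .then(function (response) { return response.json(); })
    .then(function (data) { renderSection(data); })        // renderSection is a placeholder renderer
    .catch(function (err) { console.error('Failed to load section data', err); });
}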

How much time does calling CDN jquery add?

I'm wondering if anyone has any insight into how much time per page load is added by calling out to:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
Specifically, I want to know how much time is added the second time someone visits the same page when it includes a call to the Google-hosted jQuery, as opposed to:
Not loading jQuery at all
Loading jQuery from the server where the HTML page is hosted
Loading jQuery from a locally stored file (if the HTML page is itself loaded from a locally stored page, such as in a Chrome extension).
So if you read between the lines of my question, what I really want to know is whether calling out to the CDN-hosted jQuery is faster or slower than loading the file locally.
I've always heard that the CDN-hosted jQuery is fast because it is cached. My question is really aimed at understanding how this caching works.
Edit in response to downvotes:
I am interested in this answer regardless of whether it has any noticeable or "practical" significance. I am trying to develop a better mental model of how caching works in this context, along with how the browser loads and parses locally hosted JavaScript.
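One concrete way to build that mental model is to measure it: the Resource Timing API reports how long the jQuery request actually took on a given visit, and a transferSize of 0 is usually a sign the file came from the browser cache (cross-origin resources without a Timing-Allow-Origin header also report 0, so treat it as a rough signal). A minimal sketch:

// Rough sketch: log what the CDN-hosted jQuery actually cost on this page load.
window.addEventListener('load', function () {
  var url = 'https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js';
  var entry = performance.getEntriesByName(url)[0];
  if (entry) {
    console.log('duration (ms):', entry.duration, 'transferSize (bytes):', entry.transferSize);
  }
});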
The second time someone visits your page the file will be cached, so in your scenarios it would be roughly
A couple ms
Same
Same
The nice thing about CDNs is that if you use a popular one, the user will likely already have the file cached, meaning that their first page load will also be faster. CDNs are also likely to have a server closer to the user.
TL;DR
Use the CDN.
That really all just depends on how fast the server hosting your HTML is. Yes, CDNs are pretty fast. But as you indicated, once the browser has cached the resource (the first time a user visits your page), it will load it from the cache anyway. In addition, jQuery is small; so small, in fact, that most hosting platforms will yield similar results.
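If you do go with the CDN, a common defensive pattern (popularised by HTML5 Boilerplate) is to fall back to a copy on your own server when the CDN is unreachable; the local path below is an assumed placeholder:

<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
<script>
  // If the CDN failed, write a script tag pointing at a local copy (placeholder path).
  window.jQuery || document.write('<script src="js/jquery-3.1.0.min.js"><\/script>');
</script>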
Considering that the file is heavily cached by CDNs, delivery will be consistently fast.
This is the download speed taken from a metro area:
time_namelookup: 0.005
time_connect: 0.042
time_appconnect: 0.203
time_pretransfer: 0.203
time_redirect: 0.000
time_starttransfer: 0.216
----------
time_total: 0.248 seconds
size_download: 86351 Bytes
speed_download: 347767.000 B/sec

Angular 2 page size

I am just getting started with Angular 2, and it actually makes me think about the page size of even a Hello World.
Please look at the scripts that were actually needed: it already comes to 1.75 MB.
Of course, with minification that would be reduced by roughly 30-35%.
Yet it would still be above 1 MB just for this junk Hello World-type application. Adding Bootstrap CSS / jQuery / jQuery UI at the minimum would take it even further, plus images depending on the type of web application.
The question is: 1.75 MB of script without writing a single line of code pertaining to the application?
Is this the new web standard, with the average page size above 4-5 MB?
There are several strategies that will reduce the total size of your site.
Enable gzip compression for your assets. Most text files (like JS files) compress very well due to lots of strings that get repeated.
Use the minified versions of libraries, as you identified.
Use CDN references for 3rd-party libraries where possible. That way, the user may already have the file in their cache and won't need to refetch it. Some CDNs also support HTTP/2, meaning that more files can be requested in parallel.
Take advantage of Ahead-Of-Time compilation (AOT, a.k.a. offline compilation) in Angular 2 RC 5, and swap to the version of Angular 2 without the compiler. That saves about half of the size of the Angular 2 library file.
Use HTTP/2 yourself for your assets, and refer to each JS file individually rather than bundling them. That way, if they haven't changed, the user won't need to download them again on a reload, and the first time all the files can be fetched in parallel.
Use conditional comments or server-side processing to remove the shims and other JS files that are only relevant to certain browsers like IE. That way, other browsers don't download those useless scripts.
Use something like Rollup or another tool that can do "tree shaking" to remove unused code (see the sketch after this list).
There are probably other ways to save even more, but this is a good starting point.
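As a sketch of the tree-shaking point, a minimal Rollup configuration from that era of Angular 2 tooling would look something like the following. The entry path assumes an AOT-compiled build, and the plugin list is illustrative rather than a definitive setup:

// rollup.config.js - illustrative only; paths and plugin options are assumptions
import nodeResolve from 'rollup-plugin-node-resolve';
import uglify from 'rollup-plugin-uglify';

export default {
  entry: 'dist/main.js',           // assumed AOT-compiled entry point
  dest: 'dist/build.min.js',       // single tree-shaken, minified bundle
  format: 'iife',
  plugins: [
    nodeResolve({ module: true }), // resolve ES-module builds from node_modules
    uglify()                       // minify the result
  ]
};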
Angular 2 is going to get smaller and smaller now; see the ng-conf talks about this. It's mainly the result of tree-shaking minification (loading only the code that is really used).
An Angular 2 seed project whose production build loads at around 300 KB is already available for use.

Multiple files on CDN vs. one file locally

My website uses about 10 third-party JavaScript libraries, such as jQuery, jQuery UI, prefixfree and a few jQuery plugins, plus my own JavaScript code. Currently I pull the external libraries from CDNs like Google CDN and Cloudflare. I was wondering which is the better approach:
Pulling the external libraries from CDNs (like I do today).
Combining all the files to a single js and a single css file and storing them locally.
Any opinions are welcome as long as they are explained.
Thanks :)
The value of a CDN lies in the likelihood that the user has already visited another site requesting that same file from that CDN, and it becomes increasingly valuable with the size of the file. The likelihood of a cache hit increases with the ubiquity of the file being requested and the popularity of the CDN.
With this in mind, pulling a relatively large and popular file from a popular CDN makes absolute sense. jQuery, and, to a lesser degree, jQuery UI, fit this bill.
Meanwhile, concatenating files makes sense for smaller files which are not likely to change much — your commonly used plugins will fit this bill, but your core application-specific code probably doesn't: it might change from week to week, and if you're concatenating it with all your other files, you'd have to force the user to download everything all over again.
The HTML5 boilerplate does a pretty good job of providing a generic solution for this:
Modernizr is loaded locally in the head: it's very small and differs quite a lot from instance to instance, so it doesn't make sense to source it from a CDN, and it won't hurt the user much to load it from your server. It goes in the head because the CSS may be making use of it, so you want its effects to be known before the body renders. Everything else goes at the bottom, to stop your heavier scripts from blocking rendering while they load and execute.
jQuery comes from the CDN, since almost everyone uses it and it's quite heavy. The user will probably already have it cached before they visit your site, in which case it loads from cache instantly.
All your smaller 3rd-party dependencies and code snippets that aren't likely to change much get concatenated into a plugins.js file loaded from your own server. This gets cached with a distant expiry header the first time the user visits and is loaded from cache on subsequent visits.
Your core code goes in main.js, with a closer expiry header to account for the fact that your application logic may change from week to week or month to month. That way, when you've fixed a bug or introduced new functionality and the user visits a fortnight from now, the new code gets loaded fresh while everything above is brought in from cache.
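Put together, the skeleton described above looks roughly like this; the file names and the jQuery version are placeholders:

<head>
  <link rel="stylesheet" href="css/main.css">
  <script src="js/modernizr.min.js"></script>            <!-- small, instance-specific: local, in the head -->
</head>
<body>
  <!-- page content ... -->
  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
  <script src="js/plugins.js"></script>                  <!-- concatenated 3rd-party bits, distant expiry -->
  <script src="js/main.js"></script>                     <!-- your own code, shorter expiry -->
</body>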
For your other major libraries, you should look at them individually and ask yourself whether they should follow jQuery's lead, be loaded individually from your own server, or get concatenated. An example of how you might come to those decisions:
Angular is incredibly popular, and very large. Get it from the CDN.
Twitter Bootstrap is on a similar level of popularity, but you're using a relatively slim selection of its components, and if the user doesn't already have it cached, it might not be worth making them download the full thing. Having said that, the way it fits into the rest of your code is fairly intrinsic, and you're not likely to change it without rebuilding the whole site, so you may want to host it locally but keep its files separate from your main plugins.js. That way you can always update plugins.js with Bootstrap extensions without forcing the user to download all of Bootstrap core.
But there's no imperative — your mileage may vary.

How can I optimize my web-page (which is quite large)?

I'm working on a web application.
The application runs fine, but the problem is that the first time I open it in the browser it shows a blank page, and I have to hit refresh three or four times before the page loads completely and correctly.
I think my application is too heavy to load; however, once it is loaded it's good to go.
I have about 5 JavaScript files, around 1.3 MB in size, and also some UI components.
Is there a possible way to control this, so that when I load the application it returns the entire application without having to hit refresh again and again?
Is there a way to optimize this page?
Please help.
Thank you in advance.
Hi again: is there a way to automatically reload the page if it didn't load the first time?
Check whether you can optimize your JavaScript code. Do you need all the functions defined in those 5 JavaScript files? If not, you can split them and load each part only on the pages that need that functionality.
Try to find out which part of the code is making it slow.
1.3 MB of javascript is too much. Try compressing your javascript.
http://jscompress.com/
After compression, try delay-loading the JavaScript files wherever possible (see the sketch after this answer):
http://www.websiteoptimization.com/speed/tweak/defer/
Run YSlow addon to gather more information about optimizations possible
http://developer.yahoo.com/yslow/
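Regarding the delay-loading pointer above, a minimal sketch of deferring a non-critical script looks like this; the file name is a placeholder:

<script defer src="js/non-critical.js"></script>          <!-- downloaded without blocking, runs after the document is parsed -->
<script>
  // Alternatively, inject the script only after the page has fully loaded.
  window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.src = 'js/non-critical.js';                          // placeholder path
    document.body.appendChild(s);
  });
</script>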
The easiest method is to run YSlow in Firefox.
You should also compress your JavaScript files using YUI Compressor.
Have you minified your JavaScript? This makes it more difficult for humans to read but can significantly reduce the file size. If one of those scripts is jQuery, you might consider referencing the copy hosted by Google rather than hosting it on your own server. Google's server is probably faster than yours, and a lot of users will already have Google's copy of jQuery cached.
If the site is image-heavy and uses PNGs, you might consider stripping some data from them to make them smaller with tools like pngcrush.
As mentioned by a few others, running the page through YSlow is very likely to help find issues that could cause slow performance.
