I'm using the Keenthemes Metronic admin template and having real trouble with one file. It's a JavaScript file and, no matter what I try, I cannot get it to cache. It's a static file with a size of 3.5 MB, so loading it on every page view is killing the scripts.
I have tried adding bits to my .htaccess and to the headers, but this changes nothing.
Has anyone else come across this?
Thanks
As I can see from the comment section, you have already tried minifying the file, but it is still very large. You can control the cache of your own browser, but not the caches of your users' browsers, so you cannot force the file to be cached. You will therefore need to read through the file thoroughly and divide it into separate files. You will end up with a core file, which is useful everywhere and always needs to be downloaded, plus some other files that are specific to particular features, such as one for register/login, another for separate features like choosing colors, and so on.
You will need to load your core JS file everywhere, but the feature-specific scripts are only needed in specific places. You will not need login features, for instance, if the user has already logged in, so you can include the JS files for separate features where they are needed and nowhere else.
Also, you might want to lazy-load your JS files: initially load only the core file and, once it has loaded successfully, load the other files separately. Those features will be unavailable right after the page loads, so the page will need to somehow handle or prevent attempts by users to use a feature before its script has loaded.
It would not hurt to minify all these separate files either. RequireJS can possibly help you handle the dependencies, but you can implement your own dependency handling as well.
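For example, here is a minimal sketch of the lazy-loading idea above (the file names are made up): the core file is loaded first, and the feature files are only requested once it has finished.

// Minimal helper: inject a script tag and fire a callback when it has loaded.
function loadScript(src, onLoad) {
  var script = document.createElement('script');
  script.src = src;
  if (onLoad) {
    script.onload = onLoad;
  }
  document.head.appendChild(script);
}

// Load the core file first, then pull in the feature-specific files.
loadScript('/js/core.js', function () {
  loadScript('/js/login.js');         // only needed on pages with a login form
  loadScript('/js/color-picker.js');  // only needed where colors are chosen
});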
I'm trying to figure out a way to automatically generate an ApplicationCache manifest file from all the HTML, CSS, JavaScript and image files used by our website.
We need this because we need to support offline usage of the website. More precisely, offline usage of an ArcGIS API for JavaScript webapp.
We are not using service workers instead of the ApplicationCache because supporting iOS is a critical requirement and service workers are not supported at all on iOS, on any browser.
The idea is that I'll manually call a function after the site is fully loaded that will dynamically create the text to be used for the new manifest. Then manually copy/paste it in the manifest file. So it's something I would only do when something in the site changes and the manifest file needs to be updated.
This tool, ManifestR, is very close: http://westciv.com/tools/manifestR/
but it has two issues:
1- It does not handle image file URLs found in CSS files properly. For instance, if it finds url(../images/myimage.png), it will add the relative link ../images/myimage.png directly to the manifest file instead of the non-relative link, e.g. www.mysite.com/images/myimage.png.
2- It does not list any of the scripts loaded through dojo.require (AMD modules).
I'm thinking of using similar code to fix these issues and compile the list of files. I already see how to fix #1, but can't figure out how to fix #2.
So, using JavaScript, how can I find the list of all script URLs used by the website, not just those loaded through script tags (found in the document.scripts collection), but those loaded using AMD modules as well?
Basically I want to compile the same list that Chrome is showing me for the website in the Sources pane.
I'm thinking that if this isn't available anywhere, maybe I could create a proxy function to dojo.require that keeps track of all files loaded through AMD.
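Roughly what I have in mind (untested; it assumes the legacy dojo.require loader is in use and that require.toUrl is available to turn module ids into URLs):

// Wrap dojo.require so every requested module id is recorded before loading.
var loadedModules = [];
var originalRequire = dojo.require;

dojo.require = function (moduleName) {
  if (loadedModules.indexOf(moduleName) === -1) {
    loadedModules.push(moduleName);
  }
  return originalRequire.apply(dojo, arguments);
};

// Later, when building the manifest, convert the recorded ids to URLs.
function listModuleUrls() {
  return loadedModules.map(function (name) {
    return require.toUrl(name.replace(/\./g, '/') + '.js');
  });
}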
But I wanted to ask here first; maybe I missed a tool or script that already does this? Or maybe my plan isn't good?
Thanks
I've never used ApplicationCache for an ArcGIS API for JavaScript app, but I would recommend that you first serve up a custom Dojo build of your application in order to bundle your code into one or more build layers. If you configure your Dojo build properly (no small feat) you should know the exact scripts that will be required.
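For reference, a build profile along these lines is what I mean (the property names follow the Dojo 1.7+ build system, but the paths and layer names here are only illustrative):

// app.profile.js -- illustrative Dojo build profile defining a single layer.
var profile = {
  basePath: './src',
  releaseDir: '../release',
  layers: {
    'app/main': {
      include: ['app/main'],   // the build pulls in everything app/main requires
      boot: true,              // include the loader so this layer can bootstrap itself
      customBase: true
    }
  }
};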
Also, I'd suspect that once you figure out how to get the list of scripts, you may have special considerations in order to get the Dojo AMD loader to be able to use the cached files. See: dojo and the offline application cache
Good luck.
Usually, the JavaScript of the main page is heavier than that of other pages. For example, we put a jQuery slideshow on the main page which is not used on other pages. Is it necessary to create different <head> sections for the main page and the individual pages, so that each includes only the JavaScript files actually in use?
Or will all JavaScript files read on the first page be cached for browsing the website, so that when loading an individual page the browser will not re-read the slideshow JavaScript?
Another form of this question is: if I put the slideshow on each individual page, will the browser load the slideshow JavaScript file each time, or will it read it from its cache (saved on the visitor's computer)?
Like florian h says, most browsers will cache the content (unless development tools are being used).
If you only use the slideshow JavaScript on one page, I would recommend putting it in a separate file. There is a downside to this: when loading a file, the HTTP request itself most often takes the longest time.
So if you, for example, have one JavaScript file of 1 MB and you need all of that JavaScript on most pages, that is better than using four smaller files of 250 KB each, because your browser would need to make four separate requests.
Of course, this may only be a couple of milliseconds of performance profit, so you might want to choose separate files anyway to improve maintainability.
Almost all browsers will cache the JavaScript files, so you shouldn't create different versions for sub pages.
But if you have very large JS files it's of course reasonable to only include those that you actually need.
All files are cached in the browser based on the path to the file.
If you include a JavaScript file on one page, the file will be cached and it won't be downloaded again when you surf other pages.
Unless you want it to ;)
Yes, JS files will be cached (unless the headers say otherwise).
But JS files must be processed and may include initialization logic that you do not need. Also, every script tag that loads an external JS file blocks other HTTP requests, meaning images, CSS files and so on stop loading until the JS file is loaded, whereas otherwise you would have several resources loading in parallel at the same time.
I would have different scripts for different pages.
For your case it might be an issue and it might not be. You should run a few tests for your case and see whether you have performance issues. If not, then the convenience of not having different scripts for different pages might win out.
I know that the best practice for including JavaScript is to have all code in a separate .js file and allow browsers to cache that file.
But when we begin to use many jQuery plugins, which have their own .js files, and our functions depend on them, wouldn't it be better to dynamically load only the JS function and the required .js files for the current page?
Wouldn't it be faster, on a page where I only need one function, to embed it dynamically in the HTML with a script tag instead of loading the whole JS file with all the plugins?
In other words, aren't there any cases in which there are better practices than keeping our whole javascript code in a separate .js?
It would seem at first glance that this would be a good idea, but in fact it would actually make matters worse. For example, if one page needs plugins 1, 2 and 3, then a file would be built server-side with those plugins in it. Now the browser goes to another page that needs plugins 2 and 4. This would cause another file to be built; this new file would be different from the first one, but it would also contain the code for plugin 2, so the same code ends up getting downloaded twice, bypassing the version that the browser already has.
You are best off leaving the caching to the browser, rather than trying to second-guess it. However, there are options to improve things.
Top of the list is using a CDN. If the plugins you are using are fairly popular ones, then the chances are that they are being hosted on a CDN. If you link to the CDN-hosted plugins, then for any visitor who hits your site for the first time but has already visited another site using the same plugins from the same CDN, the plugins will already be cached.
There are, of course, other things you can do to speed your JavaScript up. Best practice includes placing all your script include tags as close to the bottom of the document as possible, so as not to hold up page rendering. You should also look into lazy initialization: for anything that needs significant setup to work, attach a minimalist event handler that, when triggered, removes itself and sets up the real event handler.
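A minimal sketch of that lazy-initialization pattern (the element id and the setup/handler functions are hypothetical):

var button = document.getElementById('open-editor');

function realHandler(event) {
  // Work that assumes the expensive setup has already run.
}

function setupRichEditor() {
  // Expensive one-off setup goes here (plugin init, extra markup, etc.).
}

// Cheap placeholder: on first use it removes itself, performs the real setup,
// installs the real handler, and then handles the click that triggered it.
function placeholderHandler(event) {
  button.removeEventListener('click', placeholderHandler);
  setupRichEditor();
  button.addEventListener('click', realHandler);
  realHandler(event);
}

button.addEventListener('click', placeholderHandler);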
One problem with having separate js files is that will cause more HTTP requests.
Yahoo have a good best practices guide on speeding up your site: http://developer.yahoo.com/performance/rules.html
I believe Google's Closure Library has something for combining JavaScript files and dependencies, but I haven't looked too much into it yet, so don't quote me on it: http://code.google.com/closure/library/docs/calcdeps.html
Also there is a tool called jingo http://code.google.com/p/jingo/ but again, I haven't used it yet.
I keep separate files for each plug-in and page during development, but during production I merge-and-minify all my JavaScript files into a single JS file loaded uniformly throughout the site. My main layout file in my web framework (Sinatra) uses the deployment mode to automatically either generate script tags for all JS files (in order, based on a manifest file) or perform the minification and include a single querystring-timestamped script inclusion.
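The concatenation step itself is simple; as a rough sketch of the same idea in Node.js (not my actual Sinatra setup, with made-up file names and minification left out):

// Concatenate the files listed in the manifest, in order, into one bundle.
// (Minification would run on this combined output before writing it out.)
var fs = require('fs');

var manifest = ['js/jquery.js', 'js/plugins.js', 'js/app.js']; // illustrative paths
var bundle = manifest.map(function (file) {
  return fs.readFileSync(file, 'utf8');
}).join('\n;\n');

fs.writeFileSync('public/all.js', bundle);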
Every page is given a body tag with a unique id, e.g. <body id="contact">.
For those scripts that need to be specific to a particular page, I either modify the selectors to be prefixed by the body:
$('body#contact form#contact').submit(...);
or (more typically) I have the onload handlers for that page bail early:
jQuery(function($){
  if (!$('body#contact').length) return;
  // Do things specific to the contact page here.
});
Yes, including code (or even a plug-in) that may only be needed by one page of the site is inefficient if the user never visits that page. On the other hand, after the initial load the entire site's JS is ready to roll from the cache.
The network latency is the main problem. You can get a very responsive page if you reduce the HTTP calls to one.
That means all the JS and CSS are bundled into the HTML page. And if you can forget IE6/7, you can inline the images as data:image/png;base64 URIs.
When we release a new version of our web app, a shell script minifies and bundles everything into a single HTML page.
Then there is a second call for the data, and we render all the HTML client-side using a JS template library: PURE
Ensure the page is cached and gzipped. There is probably a size limit to consider; we try to stay under 400 KB unzipped, and load secondary resources later when needed.
You can also try a service like http://www.blaze.io. It automatically performs most front-end optimization tactics and also couples in a CDN.
They're currently in private beta, but it's worth submitting your website to.
I would recommend joining common bits of functionality into individual JavaScript module files and loading them only on the pages where they are used, using RequireJS / head.js or a similar dependency management tool.
An example: if you are using lightbox popups, contact forms, tracking, and image sliders in different parts of the website, separate these into four modules and load them only where needed. That way you optimize caching and make sure your site has no unnecessary flab.
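A rough sketch of what that looks like with RequireJS (module names and selectors are made up):

// lightbox.js -- one self-contained feature module.
define(['jquery'], function ($) {
  return {
    init: function () {
      $('.lightbox-trigger').click(function () {
        // open the popup
      });
    }
  };
});

// On the gallery page only, request just the modules that page needs.
require(['lightbox'], function (lightbox) {
  lightbox.init();
});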
As a general rule it's always best to have fewer files rather than more. It's also important to work on the timing of each JS file, as some are needed BEFORE the page completes loading and some AFTER (i.e., when the user clicks something).
See a lot more tips in the article: 25 Techniques for Javascript Performance Optimization.
Including a section on managing Javascript file dependencies.
Cheers, hope this is useful.
I am at the point where I have a bunch of javascript files and I'm not sure how to approach caching them all in one file. I have come across using:
javascript_include_tag ... :cache => true
but I have a number of JavaScript files that are particular to a specific page...does it make sense to include all of them in my layout even though some pages do not need much of that JavaScript? Some of my pages do not require any JavaScript at all; is the browser going to download this concatenated JS for every page?
Some people will dump all their JavaScript into one file, but I don't think that makes a lot of sense unless the routines are used in every page.
Think about how your scripts are used. Put ones that are used most often in the most pages in one file. Then, if there are scripts used occasionally, put them in separate files. Then use multiple <script> statements in your HTML file to pull in the ones you need.
If a user's browser is set normally, it will download the scripts once then reference them from its local cache. The first time they request the page it'll take a bit longer to retrieve everything because it has to populate the cache, but from then it'll be fast(er). The browser will use the cached version for all references to the script.
The :cache => true flag can help if you have a bajillion scripts, because they can be combined and compressed into a single download the first time, but I don't think it speeds up loading afterwards, when the browser is pulling them from its cache.
Caching multiple javascripts into one talks about it.
n include tags = n GET requests to the server. This does not perform well and the web page gets slower.
I would not mind minifying everything into one file. It's a one-time download anyway, and then it gets cached in the browser.
Each situation is different, so analyze yours using YSlow and see whether minifying into one file is going to help or not. Also look at https://github.com/thumblemonks/smurf for minifying your JS & CSS into two files.
A simple question that I'm not sure if it has a short answer!
Description
I have a number of JavaScript files to be loaded on a website; here are some notes about them:
They all come from the same domain (no cross-domain loading needed).
They are identical across the website.
There are several files: jQuery, five other plugins, plus my own application script that is based on them.
Their total size, compressed, is 224 KB (I combine all the files into one file, then compress them at once using YUI Compressor 2).
Problem
I've heard that 224 KB is not ideal for one file and that it should be split into several files of at most 44 KB each. I can't recall where I heard this and I'm not sure whether it's effective to split it into more files, but it's true that 224 KB takes a long time to load the first time, considering that the website is also loaded with images and CSS.
I've minimized the need for early loading of the JavaScript file and put it at the bottom; so far this is good progress, but I need to load it asynchronously with the HTML to gain time, and the decision to make is:
Yes or No?
Keep it in one big compressed file? Or split it into many compressed files and load them asynchronously (I'm aware of handling the dependency-related problems)?
It depends on what the site is and how important first load time is for it.
Regardless of that though, I'd probably load jQuery and stuff like that from a public CDN. One big benefit is that it might already be cached even for visitors who have never been to your site.
http://encosia.com/2008/12/10/3-reasons-why-you-should-let-google-host-jquery-for-you/
The Cappuccino team is a big proponent of one file -- they make a javascript framework. Apps made with their tool are expected to have some load time.
http://cappuccino.org/discuss/2009/11/11/just-one-file-with-cappuccino-0-8/
Another benefit of loading jQuery and related files from a public CDN would be the increase in concurrent requests by destination. I believe the client is restricted to two concurrent requests per domain, so by loading jQuery from Google, a plugin from jQuery's CDN, and your custom app code from your own domain, the browser can fetch these concurrently rather than waiting for the first two and then issuing a third request.
I guess this adds another performance improvement over one large file as well. Even if you just split that one file into two, it could be retrieved with two concurrent requests from the browser, potentially improving load time.
Here's what we did to make our web app fast.
The main JS and CSS files are compressed and put inline with the HTML markup.
The whitespace in the HTML is removed and the images are converted to data:image/png;base64 URIs by a shell script.
The size is ~400 KB, but it is cached and gzipped.
The mobile version of the web app is the same but at ~250 KB.
It means the whole app is ready to run, like an executable, in a single HTTP call.
Then a second HTTP call gets the data (JSON), and we use PURE to render it into HTML, using the existing markup in the page as templates.
The app is divided into modules; only the common modules are preloaded this way. The others are loaded when requested by the user.
There is no exact answer to this question. It pretty much depends on how and when you are making use of those files.
Typically, you only want to download on page load the JS files that are universally required by the web app. Module-specific or page-specific JS files shouldn't be bundled into the main JS download and would ideally be loaded on demand.
Also, this question is relevant only if you are concerned about the user experience for first-time visitors. The JS files will be cached anyway for every other visit.