I want to use a CDN to load Bootstrap and jQuery in an attempt to improve site performance. With performance in mind, which of the following is the better way of doing this:
1. Add a script tag directly to an HTML or layout file
<script src="//netdna.bootstrapcdn.com/bootstrap/3.0.2/js/bootstrap.min.js"></script>
2. Dynamically load the content into the middle of the asset pipeline as discussed by Daniel Kehoe here under 'Dynamic Loading'.
My assumption is that any file served from a link or repository outside our own code base introduces an availability concern.
Here the Bootstrap JS file will always depend on the speed of the netdna domain's servers. An outage or failure there will affect our performance as well as the reliability of our system. Such things will not happen often, but there is a chance.
From my experience, I would suggest the best approach is to keep the same file on our own server in compressed form to avoid such issues, and to update that file at regular intervals as new releases come out.
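If you do go with the CDN option from the question, a common hedge against the availability concern above is a local fallback: try the CDN first, and only pull the copy from your own server if it failed. A minimal sketch (the local path is an assumption):
<script src="//netdna.bootstrapcdn.com/bootstrap/3.0.2/js/bootstrap.min.js"></script>
<script>
  // If the CDN script failed to load, Bootstrap's plugins won't be attached
  // to jQuery, so write out a reference to the locally hosted copy instead.
  if (!window.jQuery || !jQuery.fn.modal) {
    document.write('<script src="/assets/js/bootstrap.min.js"><\/script>');
  }
</script>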
Reduce DNS Lookups
According to the Yahoo! Developer Network Blog, it takes about 20-120 milliseconds for DNS (Domain Name System) to resolve the IP address for a given hostname or domain name, and the browser cannot do anything until that process is complete.
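If a page does reference an external host such as the CDN above, one small mitigation (supported by most modern browsers) is a DNS prefetch hint, so the lookup starts before the script tag is reached; the hostname here is simply the one from the question:
<link rel="dns-prefetch" href="//netdna.bootstrapcdn.com">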
Merge Multiple Javascripts Into One
You can combine multiple JavaScript files, for example:
http://www.example.com/javascript/prototype.js
http://www.example.com/javascript/builder.js
http://www.example.com/javascript/effects.js
http://www.example.com/javascript/dragdrop.js
http://www.example.com/javascript/slider.js
Into a single file by changing the URL to:
http://www.example.com/javascript/prototype.js,builder.js,effects.js,dragdrop.js,slider.js
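The comma-separated URL above only works if the server (or a combining service) supports it. If yours doesn't, a tiny build step gives the same single-request result; here is a minimal Node sketch using the file names from the example (paths are assumptions):
// Concatenate the example scripts into one file so the page needs one request.
const fs = require('fs');
const files = ['prototype.js', 'builder.js', 'effects.js', 'dragdrop.js', 'slider.js'];
const combined = files
  .map(function (f) { return fs.readFileSync('javascript/' + f, 'utf8'); })
  .join(';\n'); // the semicolon guards against files that omit a trailing one
fs.writeFileSync('javascript/combined.js', combined);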
Compress Javascript / CSS
There are also some web services that allow you to manually compress your JavaScript and CSS files online. Here are a few we know of:
compressor.ebiene (Javascript, CSS)
javascriptcompressor.com (Javascript)
jscompress.com (Javascript)
CleanCSS (CSS)
CSS Optimizer (CSS)
Related
I've come across sites with CSS and JS filenames like this:
css_pbm0lsQQJ7A7WCCIMgxLho6mI_kBNgznNUWmTWcnfoE.css
What's causing this or why would you do it?
Edit: Some of each answer below could apply to this scenario, but given the sites I've found this on, serving/caching methods seems the most accurate.
Versioning: making sure that the correct version of static resources is being served.
If you have a high-traffic website and serve lots of users, you will have several layers of caching: CDN, caching headers on files, etc.
Sometimes it can be hard to invalidate caches while keeping the same filename. The server might send the correct headers, but the client might disregard them and still load the cached version. Serving a different filename prevents that and ensures you have the correct version of CSS/JS and other static resources.
As you can probably tell, no human came up with that name.
Typically it's the result of combining multiple CSS files into a single file. This is done for performance reasons (requesting one file is faster than requesting two).
The name is likely the result of a deterministic algorithm on the input (i.e. a hash), such that if you perform the combination again but haven't changed the CSS, the output will be given the same name.
When the content (CSS) changes, the name of the output file will change. This is useful because it prevents a browser from ever serving a stale cached copy of the old version.
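As a rough illustration of how such a name can be produced (a minimal Node sketch with assumed paths, not the exact scheme of any particular site):
// Name the combined stylesheet after a hash of its contents, so the name
// changes only when the CSS actually changes and old caches are bypassed.
const crypto = require('crypto');
const fs = require('fs');
const css = fs.readFileSync('build/combined.css', 'utf8');
const hash = crypto.createHash('sha1').update(css).digest('hex').slice(0, 16);
fs.writeFileSync('public/css_' + hash + '.css', css);
console.log('Reference the file in markup as /css_' + hash + '.css');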
It looks like the file was generated, server-side, for minification.
The website you're visiting might have had multiple CSS files (perhaps combined with @import statements) and JS files (jQuery, jQuery UI, jQuery plugins, some custom code, etc). Rather than have the developer manually minify and combine the files, the server might do it for them (ASP.NET 4.5 does this, for example). In this case it uses an arbitrary (random? GUID-based?) filename to ensure it doesn't conflict with anything.
It may be the technology used by the website.
For example, if you use GWT (Java compiled to JavaScript) or something else that preprocesses code and outputs JavaScript, you are likely to get weird filenames.
The solutions here worked fine, but they were quite labour-intensive. To anyone looking to perform similar enhancements on old ASP.NET solutions, I would highly recommend switching the project to MVC just to take advantage of the script and style bundling. .aspx files work as expected in MVC projects.
I'm about to start work on performing some performance enhancements for one of our products.
Our users connect to the network using radio which is extremely slow. The main bottlenecks in the application are the network and the database. I am going to be focusing on reducing the network footprint of the application.
I am going to start with a few "quick wins" before I get down to the nitty gritty of tearing apart UpdatePanels, removing unnecessary content and whatever else I can think of.
Right now I have a few things that I think I'm ready to implement. These include:
Minifying and combining CSS (using this)
Minifying and combining JS (same as above)
Removing excess whitespace from the HTML sent to the client (using this)
Edit: The asset minification and whitespace cleaning tools work quite well together.
However I have a few things that I'm not sure how I'll address.
Some Microsoft resources (WebResource.axd?d=blahblah and ScriptResource.axd?d=blahblah) are not minified. This and This, and a few others depending on the page. Microsoft.Ajax is fine, though. How can I manually minify these files if they aren't being minified automatically? Am I missing a setting somewhere?
Is it possible to combine the Microsoft resources into a single JS file with my JavaScript?
401 errors: in Fiddler I can see that my first hit to the website always gives a 401 error; it is immediately followed by the normal 200. Other resources will also randomly have a 401 on their first call. Is this some sort of IIS setting that needs to be configured to remove this unneeded call?
Javascript inside aspx files. Unfortunately we have a lot of js inside our aspx files as well as a lot of javascript that gets rendered using ScriptManager.RegisterStartupScript in our code behinds. How would I go about minifying javascript within <script> tags in the aspx markup?
Favicon: can this be disabled? If not, what's the next best thing?
Update
Mads Kristensen's combiner works great. However, I've found that some pages that include 14+ .axd references produce a 404.15 error (query string too long, an IE-only bug). My solution was to gzip and base64-encode the query string.
I've found that combining my JS includes with the .axd files is a fruitless task, as the .axd files are different for each page. Keeping my static JS files separate produces an extra request, but they remain cached on the client instead of being re-downloaded as part of the combined JS .axd file.
I enabled anonymous authentication. No more issues.
No progress.
I've found that putting favicon.ico at the root is necessary. I think this may be just because of the way my application has been designed though.
Merging Microsoft script resources: Check out my ContentGator project which I've used to intercept requests for the WebResource (and other scripts and css) files and merge them together. I haven't updated it in a couple years, so I can't speak to how well it'll work out of the box, you should at least be able to reuse some of the code. I don't think I remember adding minification, but you should be able to add it in pretty easily. I think it also has either hooks into RegisterStartupScript, or an alternative to it, where again you should be able to wire in minification.
Favicon, as far as I know, cannot be disabled, as it is requested by default by the browser. If you really don't want it, you could probably just put up a 1x1 pixel ico so you aren't serving a 404, and subsequent requests will result in a 304. It wouldn't hurt to use a CDN for this and all your other static resources as well.
Additionally, check out http://developer.yahoo.com/yslow/ for other more general web optimization tips.
Other things off the top of my head:
Use sprites for images when possible
Output Caching
1 and 2) Optimize .axd: http://madskristensen.net/post/Optimize-WebResourceaxd-and-ScriptResourceaxd.aspx (EDIT: dead link; see the Compress Script Resource .zip or the Google Cache of the article)
3) HTTP 401 Unauthorized: Your configured authentication mechanism is doing this. If you have Windows authentication enabled but are not using it...
4) Embedded JS: MS AJAX Minifier
http://www.codeproject.com/Articles/81317/Automatically-compress-embedded-JavaScript-resourc
http://stephenwalther.com/blog/archive/2009/10/16/using-the-new-microsoft-ajax-minifier.aspx
There's not much you can do for JS mixed in with your markup. You could make your own utility to parse it out of the ASPX(s) with RegEx and create a file that contains all of it per page, then minify that file and insert a single script reference. The regular expressions to capture everything within SCRIPT tags will end up being fairly complex because of corner cases like...
<script type="text/javascript">
document.write("<script>Dynamic, RegEx don't stop here -></script>");
</script>
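A naive version of such a utility might look like this (Node sketch; the file name is hypothetical). The example above is exactly the kind of input a simple pattern like this mishandles, so treat it as a starting point only:
// Pull inline <script> bodies out of a page so they can be minified separately.
const fs = require('fs');
const markup = fs.readFileSync('Default.aspx', 'utf8');
const blocks = [];
markup.replace(/<script\b[^>]*>([\s\S]*?)<\/script>/gi, function (match, body) {
  blocks.push(body);
  return match;
});
fs.writeFileSync('inline-scripts.js', blocks.join('\n;\n'));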
5) Favicon: you either have a LINK tag on your page(s) that reference it with REL="shortcut icon" or you have a "favicon.ico" file sitting at the root of your web site. If you don't have the LINK tags then the browser will check for the favicon.ico at the root of your website automatically.
You should also consider enabling compression in IIS.
IIS6 Compression
IIS7 Compression
From gtmetrix.com:
Avoid bad requests
Avoid CSS #import
Avoid CSS expressions (deprecated)
Avoid document.write
Combine external CSS
Combine external JavaScript
Combine images using CSS sprites
Defer loading of JavaScript
Defer parsing of JavaScript
Enable gzip compression
Enable Keep-Alive
Inline small CSS
Inline small JavaScript
Leverage browser caching
Leverage proxy caching (deprecated)
Make landing page redirects cacheable
Minify CSS
Minify HTML
Minify JavaScript
Minimize cookie size (deprecated)
Minimize DNS lookups
Minimize redirects
Minimize request size
Optimize images
Optimize the order of styles and scripts
Parallelize downloads across hostnames
Prefer asynchronous resources
Put CSS in the document head
Remove query strings from static resources
Remove unused CSS
Serve resources from a consistent URL
Serve scaled images
Serve static content from a cookieless domain
Specify a cache validator
Specify a character set early
Specify a Vary: Accept-Encoding header
Specify image dimensions
Use efficient CSS selectors
You can use the GTmetrix tool, YSlow, or Google's Page Speed to see how all of these impact it, but the GTmetrix tool is generally awesome and combines features for you, as well as doing some auto-generation that gives you improved versions of CSS files, etc.
http://wiki.asp.net/page.aspx/80/aspnet-optimization/
has a great set of resources on the various elements that you can / should tweak to make speedy web apps on ASP.NET! Njoy :)
I think that a website should be optimized for best performance regardless of the user's connection speed.
Website performance/speed affects user experience, which in turn affects the overall website goal/conversion, so creating fast, responsive websites and speeding up existing ones should be one of the primary goals of every web developer / front-end engineer.
Anyway, these are two great resources to start with, and they come from two giants:
http://developer.yahoo.com/performance/rules.html
http://code.google.com/speed/
Best
Have you enabled client-side caching for static resources such as site images and styles? It won't help with the first page view but would speed things up a lot on subsequent views.
The favicon cannot be disabled, but the request itself can be eliminated in modern browsers by using a data: URL. For example, this would give a page Slashdot's favicon without sending any request:
<link rel="shortcut icon" href="data:image/x-icon;base64,AAABAAEAEBAQAAEABAAoAQAAFgAAACgAAAAQAAAAIAAAAAEABAAAAAAAgAAAAAAAAAAAAAAAEAAAAAAAAAB4eE0AX18OAP///wBeXisAYWETANPTxACrq4cAgYEaAEhJEgBKSiYAkJF3AL29pgBiYjAAVFQQADQ0CgBCQg4AWe7u7u7u7pWe7u7u7u7u6e7u7u7u7u7u7/ZVr/+rz/7v+iIp8CJf/v//UibwIl////8CIj+mz//4iIUiuIiIj/iIjCIgiIiPjd3dsiXd3diN3d1CIq3d2I3d3dYiLd3Y0RERFGZsERHUERERERERRDd3d3d3d3dzXERERERERMUAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" type="image/x-icon" />
Keep in mind that enabling client-side caching for the favicon should save more bytes than embedding it in every page you send.
I know that best practice for including javascript is having all code in a separate .js file and allowing browsers to cache that file.
But when we begin to use many jQuery plugins which have their own .js files, and our functions depend on them, wouldn't it be better to dynamically load only the JS functions and the required .js files for the current page?
Wouldn't it be faster, on a page where I only need one function, to embed it dynamically in the HTML with a script tag instead of loading the whole JS file along with the plugins?
In other words, aren't there any cases in which there are better practices than keeping our whole javascript code in a separate .js?
It would seem at first glance that this would be a good idea, but in fact it would actually make matters worse. For example, if one page needs plugins 1, 2 and 3, then a file would be built server-side with those plugins in it. Now the browser goes to another page that needs plugins 2 and 4. This would cause another file to be built; this new file would be different from the first one, but it would also contain the code for plugin 2, so the same code ends up getting downloaded twice, bypassing the version that the browser already has.
You are best off leaving the caching to the browser, rather than trying to second-guess it. However, there are options to improve things.
Top of the list is using a CDN. If the plugins you are using are fairly popular ones, then the chances are that they are being hosted on a CDN. If you link to the CDN-hosted plugins, then for any visitor hitting your site for the first time who has already visited another site using the same plugins from the same CDN, the plugins will already be cached.
There are, of course, other things you can do to speed your JavaScript up. Best practice includes placing all your script include tags as close to the bottom of the document as possible, so as not to hold up page rendering. You should also look into lazy initialization. This involves, for anything that needs significant setup to work, attaching a minimalist event handler that, when triggered, removes itself and sets up the real event handler.
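To illustrate the lazy initialization idea (a minimal jQuery sketch; heavyEditor stands in for a hypothetical, expensive-to-set-up plugin):
// A cheap one-shot handler defers the real, expensive setup until the
// element is actually used for the first time.
$('#comments').one('focusin', function () {
  $(this).heavyEditor(); // hypothetical plugin; runs only on first focus
});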
One problem with having separate JS files is that it will cause more HTTP requests.
Yahoo have a good best practices guide on speeding up your site: http://developer.yahoo.com/performance/rules.html
I believe Google's Closure Library has something for combining JavaScript files and dependencies, but I haven't looked too much into it yet, so don't quote me on it: http://code.google.com/closure/library/docs/calcdeps.html
Also there is a tool called jingo http://code.google.com/p/jingo/ but again, I haven't used it yet.
I keep separate files for each plug-in and page during development, but during production I merge-and-minify all my JavaScript files into a single JS file loaded uniformly throughout the site. My main layout file in my web framework (Sinatra) uses the deployment mode to automatically either generate script tags for all JS files (in order, based on a manifest file) or perform the minification and include a single querystring-timestamped script inclusion.
Every page is given a body tag with a unique id, e.g. <body id="contact">.
For those scripts that need to be specific to a particular page, I either modify the selectors to be prefixed by the body:
$('body#contact form#contact').submit(...);
or (more typically) I have the onload handlers for that page bail early:
jQuery(function($){
if (!$('body#contact').length) return;
// Do things specific to the contact page here.
});
Yes, including code (or even a plug-in) that may only be needed by one page of the site is inefficient if the user never visits that page. On the other hand, after the initial load the entire site's JS is ready to roll from the cache.
Network latency is the main problem. You can get a very responsive page if you reduce the HTTP calls to one.
That means all the JS and CSS are bundled into the HTML page. And if you can forget IE6/7, you can inline the images as data:image/png;base64
When we release a new version of our web app, a shell script minifies and bundles everything into a single HTML page.
Then there is a second call for the data, and we render all the HTML client-side using a JS template library: PURE
Ensure the page is cached and gzipped. There is probably a size limit to consider. We try to stay under 400KB unzipped, and load secondary resources later when needed.
You can also try a service like http://www.blaze.io. It automatically performs most front-end optimization tactics and also couples in a CDN.
They're currently in private beta, but it's worth submitting your website to.
I would recommend you group common bits of functionality into individual JavaScript module files and load them only on the pages where they are used, using RequireJS / head.js or a similar dependency-management tool.
For example, if you are using lightbox popups, contact forms, tracking, and image sliders in different parts of the website, you would separate these into 4 modules and load them only where needed. That way you optimize caching and make sure your site has no unnecessary flab.
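A rough sketch of what that might look like with RequireJS (the module names and paths are made up for illustration):
// Point RequireJS at the folder holding the per-feature modules.
require.config({ baseUrl: '/js/modules' });

// A page that only needs the lightbox and the contact form pulls in just
// those two modules; the slider and tracking code are never downloaded here.
require(['lightbox', 'contact-form'], function (lightbox, contactForm) {
  lightbox.init();
  contactForm.init('#contact');
});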
As a general rule it's always best to have fewer files rather than more. It's also important to work on the timing of each JS file, as some are needed BEFORE the page completes loading and some AFTER (i.e., when the user clicks something).
See a lot more tips in the article: 25 Techniques for Javascript Performance Optimization.
Including a section on managing Javascript file dependencies.
Cheers, hope this is useful.
A simple question that I'm not sure has a short answer!
Description
I have a set of JavaScript files to be loaded on a website; here are some notes about them:
They all come from the same domain (no cross-domain loading needed).
They are identical across the website.
There are several files: jQuery, 5 other plugins, plus my own application script that is based on them.
Their total compressed size is 224KB (I combine all the files into one file, then compress them at once using YUI Compressor 2).
Problem
I've heard that 224KB is not ideal for a single file and that it should be split into several files with a maximum of 44KB each. I can't recall where I heard this, and I'm not sure if it's effective to split it into more files, but it's true that 224KB takes a long time to load the first time, considering that the website is loaded with images and CSS as well, of course.
I've minimized the need for early loading of the JavaScript file and put it at the bottom; so far this is good progress, but I need to load it asynchronously with the HTML to gain time (source), and the decision to make is:
Yes or No?
Keep it in one big compressed file, or split it into many compressed files and load them asynchronously (I'm aware of how to handle the dependency-related problems)?
It depends on what the site is and how important first load time is for it.
Regardless of that though, I'd probably load jQuery and things like that from a public CDN. One big benefit is that it might already be cached even if a visitor has never been to your site before.
http://encosia.com/2008/12/10/3-reasons-why-you-should-let-google-host-jquery-for-you/
The Cappuccino team is a big proponent of one file -- they make a javascript framework. Apps made with their tool are expected to have some load time.
http://cappuccino.org/discuss/2009/11/11/just-one-file-with-cappuccino-0-8/
Another benefit of loading jQuery and related libraries from a public CDN is the increased number of concurrent requests across hostnames. I believe the client is restricted to two concurrent requests per domain, so by loading jQuery from Google, a plugin from jQuery's CDN, and your custom app code from your own domain, the browser can execute these concurrently rather than waiting for the first two and then issuing a third request.
I guess this adds another performance improvement over one large file as well. Even if you just split that one file into two, it could be retrieved with two concurrent requests from the browser, potentially improving load time.
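Concretely, that split could be as simple as three script references on three hostnames, which browsers can download in parallel (the Google-hosted jQuery URL is a real one; the plugin URL is a placeholder):
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script src="//cdn.example.com/jquery.someplugin.min.js"></script>
<script src="/js/app.min.js"></script>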
Here's what we did to make our web app fast.
The main JS and CSS files are compressed and put inline with the HTML markup.
The white spaces of the HTML are removed and the images are converted to data:image/png by a shell script.
The size is ~400kb but cached and gzipped.
The mobile version of the web app is the same but at ~250kb.
It means the whole app is ready to run, like an executable, in a single http call.
Then a second HTTP call gets the data (JSON), and we use PURE to render it as HTML, using the existing markup in the page as templates.
The app is divided into modules; only the common modules are preloaded this way. The others are loaded when the user requests them.
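For illustration only (generic jQuery rather than PURE's actual API; the endpoint and field names are invented), the second call looks roughly like this:
// The cached page shell asks for JSON and fills in markup already on the page.
$.getJSON('/api/page-data', function (data) {
  $('#title').text(data.title);
  $('#items').html(data.itemsHtml);
});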
There is no exact answer to this question. It pretty much depends on how and when you are making use of those files.
Typically, you only want to download on page load the JS files which are universally required by the web app. Module-specific or page-specific JS files shouldn't be bundled into the main JS download and would ideally be loaded on demand.
Also, this question is only valid if you are concerned about the user experience for first-time visitors. The JS files would be cached anyway for every other visit.
I have had some thoughts recently on how to handle shared javascript and css files across a web application.
In a web application that I am currently working on, I have quite a large number of different JavaScript and CSS files placed in a folder on the server. Some of the files are reused, while others are not.
In a production site, it's quite stupid to have a high number of HTTP requests and many kilobytes of unnecessary JavaScript and redundant CSS being loaded. The solution, of course, is to create one big bundled file per page that contains only the necessary code, which is then minified and sent compressed (gzip) to the client.
It's no trouble to create a bundle of JavaScript files and minify them manually if you only need to do it once, but since the app is continuously maintained and things change and develop, it quickly becomes a headache to do this manually while pushing out updates that change the JavaScript and/or CSS files in production.
What's a good approach to handle this? How do you handle this in your application?
I built a library, Combres, that does exactly that, i.e. minify, combine, etc. It also automatically detects changes to both local and remote JS/CSS files and pushes the latest version to the browser. It's free & open-source. Check this article out for an introduction to Combres.
I am dealing with the exact same issue on a site I am launching.
I recently found out about a project named SquishIt (see on GitHub). It is built for the ASP.NET framework. If you aren't using ASP.NET, you can still learn about the principles behind what he's doing here.
SquishIt allows you to create named "bundles" of files and then to render those combined and minified file bundles throughout the site.
CSS files can be categorized and partitioned into logical parts (like common, print, etc.) and then you can use CSS's @import feature to load them. Reusing these small files also makes client-side caching possible.
When it comes to JavaScript, I think you can solve this problem on the server side for the multiple script files added to the page. You can also dynamically generate the script file server-side, but for client-side caching to work, these parts should have distinct, static addresses.
I wrote an ASP.NET handler some time ago that combines, compresses/minifies, gzips, and caches the raw CSS and Javascript source code files on demand. To bring in three CSS files, for example, it would look like this in the markup...
<link rel="stylesheet" type="text/css"
href="/getcss.axd?files=main;theme2;contact" />
The getcss.axd handler reads in the query string and determines which files it needs to read in and minify (in this case, it would look for files called main.css, theme2.css, and contact.css). When it's done reading in the file and compressing it, it stores the big minified string in server-side cache (RAM) for a few hours. It always looks in cache first so that on subsequent requests it does not have to re-compress.
I love this solution because...
It reduces the number of requests as much as possible
No additional steps are required for deployment
It is very easy to maintain
The only downside is that all the style/script code will eventually be stored in server memory. But RAM is so cheap nowadays that it is not as big of a deal as it used to be.
Also, one thing worth mentioning: make sure that the query string is not susceptible to any harmful path manipulation (only allow A-Z and 0-9).
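For anyone not on ASP.NET, here is a rough Node sketch of the same idea (this is not the author's handler; file locations and the port are assumptions), including the A-Z/0-9-style whitelist mentioned above:
// getcss?files=main;theme2;contact reads the named .css files, concatenates
// them, and keeps the result in an in-memory cache for later requests.
const http = require('http');
const fs = require('fs');
const url = require('url');
const cache = {};

http.createServer(function (req, res) {
  const files = String(url.parse(req.url, true).query.files || '');
  if (!/^[A-Za-z0-9;]+$/.test(files)) { // reject anything that could be a path trick
    res.statusCode = 400;
    return res.end();
  }
  if (!cache[files]) {
    cache[files] = files.split(';')
      .map(function (name) { return fs.readFileSync('css/' + name + '.css', 'utf8'); })
      .join('\n');
  }
  res.setHeader('Content-Type', 'text/css');
  res.end(cache[files]);
}).listen(3000);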
What you are talking about is called minification.
There are many libraries and helpers for different platforms and languages to help with this. As you did not post what you are using, I can't really point you towards something more relevant to yourself.
Here is one project on google code - minify.
Here is an example of a .NET Http handler that does all of this on the fly.