How do I optimize my website for slow data connections? - javascript

The solutions here worked fine; however, they were quite labour-intensive. To anyone looking to perform similar enhancements on old ASP.NET solutions, I would highly recommend switching the project to MVC just to take advantage of the script and style bundling. .aspx files work as expected in MVC projects.
I'm about to start work on some performance enhancements for one of our products.
Our users connect to the network over radio, which is extremely slow. The main bottlenecks in the application are the network and the database. I am going to focus on reducing the network footprint of the application.
I am going to start with a few "quick wins" before I get down to the nitty gritty of tearing apart UpdatePanels, removing unnecessary content and whatever else I can think of.
Right now there are a few things I think I'm ready to implement.
These include:
Minifying and combining CSS. Using this
Minifying and combining JS. Same as above
Removing excess whitespace from the HTML sent to the client. Using this
Edit: The asset minification and whitespace-cleaning tools work quite well together.
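For reference, the combine-and-minify step itself is small. Below is a minimal sketch using the uglify-js npm package; the file names are hypothetical and any comparable minifier would do:

const fs = require('fs');
const UglifyJS = require('uglify-js');

// Concatenate the source files, then minify the result in one pass.
// The ';' guard keeps two files from running together incorrectly.
const sources = ['site.js', 'widgets.js'].map(f => fs.readFileSync(f, 'utf8'));
const result = UglifyJS.minify(sources.join('\n;\n'));
if (result.error) throw result.error;
fs.writeFileSync('bundle.min.js', result.code);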
However, there are a few things I'm not sure how I'll address.
Some Microsoft resources (WebResource.axd?d=blahblah and ScriptResource.axd?d=blahblah) are not minified. This and This and a few others, depending on the page. Microsoft.Ajax is fine, though. How can I manually minify these files if they aren't being minified automatically? Am I missing a setting somewhere?
Is it possible to combine the Microsoft resources into a single JS file along with my own JavaScript?
401 errors. In Fiddler I can see that my first hit to the website always gives a 401, immediately followed by the normal 200. Other resources will also randomly return a 401 on their first call. Is there some IIS setting that needs to be configured to remove this unneeded round trip?
JavaScript inside .aspx files. Unfortunately we have a lot of JS inside our .aspx files, as well as a lot of JavaScript that gets rendered using ScriptManager.RegisterStartupScript in our code-behinds. How would I go about minifying JavaScript within <script> tags in the .aspx markup?
Favicon: can this be disabled? If not, what's the next best thing?
Update
Mads Kristensen's combiner works great. However, I've found that pages that include 14+ .axd references produce a 404.15 error (query string too long; an IE-only bug). My solution for this was to gzip and Base64-encode the query string.
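The original fix was done server side in ASP.NET; the sketch below just illustrates the same idea in Node terms, with hypothetical script keys:

const zlib = require('zlib');

// Compress the long, repetitive list of .axd keys, then Base64-encode it so
// the result is safe to put in a query string well under IE's length limit.
function packQuery(scriptKeys) {
  const gzipped = zlib.gzipSync(Buffer.from(scriptKeys.join(',')));
  return encodeURIComponent(gzipped.toString('base64'));
}

// Reverse the process on the server before resolving the script references.
function unpackQuery(packed) {
  const gzipped = Buffer.from(decodeURIComponent(packed), 'base64');
  return zlib.gunzipSync(gzipped).toString().split(',');
}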
I've found that combining my JS includes with the .axd files is a fruitless task, as the .axd files are different for each page. Keeping my static JS files separate produces an extra request, but they remain cached on the client instead of being re-downloaded as part of the combined JS .axd file.
I enabled anonymous authentication. No more issues.
No progress.
I've found that putting favicon.ico at the root is necessary. I think this may be just because of the way my application has been designed though.

Merging Microsoft script resources: Check out my ContentGator project, which I've used to intercept requests for the WebResource (and other script and CSS) files and merge them together. I haven't updated it in a couple of years, so I can't speak to how well it'll work out of the box, but you should at least be able to reuse some of the code. I don't think I remember adding minification, but you should be able to add it in pretty easily. I think it also has either hooks into RegisterStartupScript, or an alternative to it, where again you should be able to wire in minification.
Favicon, as far as I know, cannot be disabled, as it is requested by default by the browser. If you really don't want it, you could probably just put up a 1x1 pixel ico so you aren't serving a 404, and subsequent requests will result in a 304. It wouldn't hurt to use a CDN for this and all your other static resources as well.
Additionally, check out http://developer.yahoo.com/yslow/ for other more general web optimization tips.
Other things off the top of my head:
Use sprites for images when possible
Output Caching

1 and 2) Optimize .axd: http://madskristensen.net/post/Optimize-WebResourceaxd-and-ScriptResourceaxd.aspx (EDIT: dead link; see the "Compress Script Resource" .zip or the Google Cache of the article)
3) HTTP 401 Unauthorized: Your configured authentication mechanism is doing this. If you have Windows authentication enabled but are not using it...
4) Embedded JS: MS AJAX Minifier
http://www.codeproject.com/Articles/81317/Automatically-compress-embedded-JavaScript-resourc
http://stephenwalther.com/blog/archive/2009/10/16/using-the-new-microsoft-ajax-minifier.aspx
There's not much you can do for JS mixed in with your markup. You could make your own utility to parse it out of the ASPX file(s) with regular expressions, create a file that contains all of it per page, minify that file, and insert a single script reference. The regular expressions to capture everything within SCRIPT tags will end up fairly complex because of corner cases like...
<script type="text/javascript">
document.write("<script>Dynamica, RegEx don't stop here -></script>");
</script>
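To make the suggested utility concrete, here is a hedged Node sketch of the extraction step; it uses the naive non-greedy regex, which (as the example above shows) breaks on a </script> embedded inside a string:

const fs = require('fs');

// Pull the bodies of inline <script> blocks out of an .aspx file so they can
// be minified into one external file per page.
function extractInlineScripts(aspxPath) {
  const markup = fs.readFileSync(aspxPath, 'utf8');
  const scripts = [];
  const re = /<script[^>]*>([\s\S]*?)<\/script>/gi; // naive: see corner case above
  let match;
  while ((match = re.exec(markup)) !== null) {
    if (match[1].trim()) scripts.push(match[1]);
  }
  return scripts.join('\n;\n'); // ready to hand to a minifier
}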
5) Favicon: you either have a LINK tag on your page(s) that references it with REL="shortcut icon", or you have a "favicon.ico" file sitting at the root of your web site. If you don't have the LINK tag, the browser will check for favicon.ico at the root of your website automatically.

You should also consider enabling compression in IIS.
IIS6 Compression
IIS7 Compression

From gtmetrix.com:
Avoid bad requests
Avoid CSS #import
Avoid CSS expressions (deprecated)
Avoid document.write
Combine external CSS
Combine external JavaScript
Combine images using CSS sprites
Defer loading of JavaScript (see the sketch below)
Defer parsing of JavaScript
Enable gzip compression
Enable Keep-Alive
Inline small CSS
Inline small JavaScript
Leverage browser caching
Leverage proxy caching (deprecated)
Make landing page redirects cacheable
Minify CSS
Minify HTML
Minify JavaScript
Minimize cookie size (deprecated)
Minimize DNS lookups
Minimize redirects
Minimize request size
Optimize images
Optimize the order of styles and scripts
Parallelize downloads across hostnames
Prefer asynchronous resources
Put CSS in the document head
Remove query strings from static resources
Remove unused CSS
Serve resources from a consistent URL
Serve scaled images
Serve static content from a cookieless domain
Specify a cache validator
Specify a character set early
Specify a Vary: Accept-Encoding header
Specify image dimensions
Use efficient CSS selectors
You can use the GTmetrix tool, YSlow, or Google's Page Speed to see how each of these impacts your site, but the GTmetrix tool is generally awesome: it combines features for you and even auto-generates improved versions of your CSS files, etc.
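As a concrete example of one item above ("Defer loading of JavaScript"), this is the standard pattern: inject non-critical scripts after the load event so they never block rendering (the script path is hypothetical):

// Defer loading of JavaScript: fetch non-critical scripts only after the
// page has finished rendering.
window.addEventListener('load', function () {
  var script = document.createElement('script');
  script.src = '/scripts/non-critical.js'; // hypothetical deferred bundle
  document.body.appendChild(script);
});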

http://wiki.asp.net/page.aspx/80/aspnet-optimization/
has a great set of resources on the various elements you can (and should) tweak to make speedy web apps on ASP.NET! Enjoy :)

I think a website should be optimized for best performance regardless of the user's connection speed.
Website performance/speed affects user experience, which in turn affects the overall website goal/conversion, so creating fast, responsive websites and speeding up existing ones should be one of the primary goals of every web developer, front-end engineer, etc.
Anyway, these are two great resources to start with, and they come from two giants:
http://developer.yahoo.com/performance/rules.html
http://code.google.com/speed/
Best

Have you enabled client-side caching for static resources such as site images and styles? It won't help with the first page view, but it will speed things up a lot on subsequent views.
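For illustration, a minimal Node sketch of the kind of response header that enables such caching; in IIS the equivalent is set through configuration rather than code:

const http = require('http');
const fs = require('fs');

// Serve a static asset with a far-future Cache-Control header so the browser
// fetches it once and reuses the cached copy on subsequent page views.
http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'image/png',
    'Cache-Control': 'public, max-age=31536000' // one year
  });
  res.end(fs.readFileSync('logo.png')); // hypothetical static image
}).listen(8080);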
Favicon cannot be disabled, but the request itself can be eliminated in modern browsers by using a data: URL. For example, this would give a page Slashdot's favicon without sending any request:
<link rel="shortcut icon" href="data:image/x-icon;base64,AAABAAEAEBAQAAEABAAoAQAAFgAAACgAAAAQAAAAIAAAAAEABAAAAAAAgAAAAAAAAAAAAAAAEAAAAAAAAAB4eE0AX18OAP///wBeXisAYWETANPTxACrq4cAgYEaAEhJEgBKSiYAkJF3AL29pgBiYjAAVFQQADQ0CgBCQg4AWe7u7u7u7pWe7u7u7u7u6e7u7u7u7u7u7/ZVr/+rz/7v+iIp8CJf/v//UibwIl////8CIj+mz//4iIUiuIiIj/iIjCIgiIiPjd3dsiXd3diN3d1CIq3d2I3d3dYiLd3Y0RERFGZsERHUERERERERRDd3d3d3d3dzXERERERERMUAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" type="image/x-icon" />
Keep in mind that enabling client-side caching for the favicon should save more bytes than embedding it in every page you send.

Related

Loading external Javascript into rails application

I want to use a CDN to load in bootstrap and jquery in an attempt to improve site performance. With performance in mind, which of the following is the best way of doing this:
1. Add a script tag directly to an HTML or layout file
<script src="//netdna.bootstrapcdn.com/bootstrap/3.0.2/js/bootstrap.min.js"></script>
2. Dynamically load the content into the middle of the asset pipeline as discussed by Daniel Kehoe here under 'Dynamic Loading'.
Any link or external repository used for a file outside our own code base introduces an availability concern. Here, the Bootstrap JS file will always depend on the speed of the netdna domain's server; an outage or failure there affects the performance as well as the reliability of our system. Such a thing won't happen frequently, but it's a possibility.
From my experience, I would suggest the best way is to keep the same file on our own server in compressed form to avoid such issues, and to update that file at regular intervals as updates are released.
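A common compromise between options 1 and 2 is the CDN-with-local-fallback idiom: keep the CDN script tag, then immediately test whether the library actually loaded. A sketch, assuming jQuery and a hypothetical local path:

// Place this inline right after the CDN <script> tag. If the CDN is down or
// unreachable, window.jQuery is undefined and a local copy is loaded instead.
window.jQuery || document.write(
  '<script src="/assets/jquery.min.js"><\/script>');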
Reduce DNS Lookups
According to the Yahoo! Developer Network Blog, it takes about 20-120 milliseconds for DNS (the Domain Name System) to resolve the IP address of a given hostname or domain name, and the browser cannot do anything until that process is complete.
Merge Multiple Javascripts Into One
You can combine multiple JavaScript files, for example:
http://www.example.com/javascript/prototype.js
http://www.example.com/javascript/builder.js
http://www.example.com/javascript/effects.js
http://www.example.com/javascript/dragdrop.js
http://www.example.com/javascript/slider.js
Into a single file by changing the URL to:
http://www.example.com/javascript/prototype.js,builder.js,effects.js,dragdrop.js,slider.js
Compress Javascript / CSS
There are also some web services that let you manually compress your JavaScript and CSS files online. Here are a few we know of:
compressor.ebiene (Javascript, CSS)
javascriptcompressor.com (Javascript)
jscompress.com (Javascript)
CleanCSS (CSS)
CSS Optimizer (CSS)

Why do these files have random strings for filenames?

I've come across sites with CSS and JS filenames like this:
css_pbm0lsQQJ7A7WCCIMgxLho6mI_kBNgznNUWmTWcnfoE.css
What's causing this or why would you do it?
Edit: Some of each answer below could apply to this scenario, but given the sites I've found this on, the serving/caching explanations seem the most accurate.
Versioning: making sure that the correct version of static resources is being served.
If you have a high-traffic website and serve lots of users, you will have several layers of caching: a CDN, caching headers on files, etc.
Sometimes it can be hard to invalidate those caches while keeping the same filename. The server might send the correct headers, but the client might disregard them and still load the cached version. Serving a different filename prevents that and ensures you have the correct version of the CSS/JS and other static resources.
As you can probably tell, no human came up with that name. Typically it's the result of combining multiple CSS files into a single file. This is done for performance reasons (requesting one file is faster than requesting two).
The name is likely the result of a deterministic algorithm over the input (i.e. a hash), such that if you perform the combination again but haven't changed the CSS, the output will be given the same name. When the content (the CSS) changes, the name of the output file changes too. This is useful because it makes it impossible for a browser to serve a stale cached version.
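A minimal sketch of such a deterministic naming scheme, assuming Node's built-in crypto and fs modules:

const crypto = require('crypto');
const fs = require('fs');

// Same CSS in, same name out; any change to the content yields a new name,
// so stale cached copies can never be served under it.
function hashedName(cssPath) {
  const digest = crypto.createHash('sha1')
    .update(fs.readFileSync(cssPath))
    .digest('base64')
    .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, ''); // URL-safe
  return 'css_' + digest + '.css';
}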
It looks like the file was generated, server-side, for minification.
The website you're visiting might have had multiple CSS files (perhaps combined with #import statements) and JS files (jQuery, jQuery UI, jQuery plugins, some custom code, etc) - rather than have the developer manually minify and combine the files the server might do it for them (ASP.NET 4.5 does this, for example). In this case it uses an arbitrary (random? GUID-based?) filename to ensure it doesn't conflict with anything.
It may be the technology used by the website.
For example, if you use GWT (Java compiled to JavaScript) or something else that preprocesses code and outputs JavaScript, you are likely to get weird filenames.

Put javascript and css inline in a single minified html file to improve performance?

A typical website consists of one index.html file and a bunch of javascript and css files. To improve the performance of the website, one can:
Minify the JavaScript and CSS files, to reduce the file sizes.
Concatenate the JavaScript files into one file, and similarly for the CSS files, to reduce the number of requests to the server. For commonly used (and shared) libraries like jQuery it makes sense to leave them external, allowing the browser to cache the library and reuse it across different web applications.
I'm wondering if it makes sense to inline the concatenated JavaScript and CSS files in one single HTML file, which would reduce the number of requests even further. Will this improve the performance of your site? Or will it backfire, making it impossible for the browser to cache anything?
Concatenating your CSS and JS files into one file will reduce the number of requests and make the page load faster. But as commented, it won't make much sense unless you have a one-page site and the load time of that page is very critical. So in my opinion you're better off keeping the CSS and JavaScript separate.
Here's a book where you can learn more about the topic:
High Performance Web Sites
This tool may help you.
It turns your web page into a single HTML file with everything inlined - perfect for appcache manifests on mobile devices where you want to reduce those HTTP requests.
https://github.com/remy/inliner
It would cut down on the number of requests, but it would also mean no caching of those resources for use on other pages. Think of defining an external file as a way of telling the browser "this section of the site is reusable". You'd be taking that ability away, so the CSS and JS would have to be downloaded again with every page. Like jackwanders said, it's great if you only have one page.
This is not a good idea for the following reasons:
You will not enjoy the benefit of cache
You will load unneeded resources in all of your pages
You will have a hard time while developing your website because of large files with unrelated code branches
If you work in a team, everyone will always be working in the same files, which means you will have a lot of merge conflicts.
You can have a single CSS file for all your pages; since it will be cached, subsequent pages will load it from cache without sending an extra request.
However, putting all the JavaScript files into one is contextual.
Most probably you are using libraries like jQuery and related plugins, which 'might' conflict with one another. So before you try it all at once, try merging a few files at a time and check whether errors pop up.

compressing entire webpages (HTML and JS)

I have found some tools, like this one, that let me create "auto-extracting" JavaScript for the JavaScript code in a web page, employing a variety of techniques to minimize transfer size.
I have a webpage which does have a rather large chunk of javascript code in it. But since I haven't gotten around to optimizing the filesize yet I was thinking about doing the same sort of thing with the HTML bits of my website too. On my blog page the PHP script pulls HTML snippets from a large number of text files, and concatenates them into one giant HTML file which is sent out. Chrome tells me that compressing it with gzip would reduce the filesize by two-thirds.
However, I turned off the gzip compression because of what was happening when you downloaded any of the zip archives I host via Internet Explorer: IE would herp derp neglect to gunzip them, so the file it downloaded was always corrupted. I guess I can turn gzip back on if I fix this little issue, but for the time being I'd like to see if I can make a self-extracting HTML page. Is it possible to have JavaScript extract a giant HTML string and add the entire chunk as a child of the body element? Would that work?
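Mechanically, yes: the sketch below shows the idea being asked about, where inflate() stands in for whatever decompressor the packer tool ships (hypothetical). The answer that follows explains why this is usually a bad trade.

// Ship the page body as one packed string and inflate it on the client.
var packedHtml = '...'; // compressed markup produced at build time (elided)
document.addEventListener('DOMContentLoaded', function () {
  var container = document.createElement('div');
  container.innerHTML = inflate(packedHtml); // hypothetical decompress step
  document.body.appendChild(container);
});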
It will be slower to do that, and very error prone. Any Javascript error will cause the entire page to not render, and your SEO will be absolutely destroyed. Stick to regularly rendered HTML: as the browser is downloading / parsing the HTML, it will begin fetching other resources (images, scripts, css) and rendering the layout. Don't be so focused on strictly smallest download size, but rather quickest overall experience.
Make heavy use of the freely available CDNs. There are the big two, Google and Microsoft, which host a variety of scripts like jQuery and Modernizr. Stick to Google where possible; they seem to have much higher adoption than Microsoft and thus a higher chance of a warm cache. Past that, use CDNJS for other publicly available libraries -- they have a lot.
Minify your existing Javascript, and enable content compression for static and dynamic pages. Don't force it on, let the browser request it. What version of IE are you seeing corruption on? I haven't seen that be an issue since IE6...
Using JavaScript packers will make your site appear slower, all to save a few more bytes of transfer on your end. Not only does a script have to run, but you're now asking the user's browser to perform an additional (potentially large) extra step before your scripts can run.
If you're trying to serve individual files for download (with the Save-As dialog), you can't use gzip with a content type of 'application/zip'. The actual Zip format is available with PHP; use those libraries instead.
As a quick win, Cloudflare has an auto-minify feature for HTML, JS and CSS. We've been using it for a little while now with good results. Definitely worth a look.

How to handle javascript & css files across a site?

I have had some thoughts recently on how to handle shared javascript and css files across a web application.
In a current web application that I am working on, I have quite a large number of different JavaScript and CSS files placed in a folder on the server. Some of the files are reused, while others are not.
In a production site, it's quite stupid to have a high number of HTTP requests and many kilobytes of unnecessary JavaScript and redundant CSS being loaded. The solution, of course, is to create one big bundled file per page that contains only the necessary code, minified and sent compressed (gzip) to the client.
There's no trouble in creating a bundle of JavaScript files and minifying them manually if you're only going to do it once. But since the app is continuously maintained, and things change and develop, it soon becomes a headache to do this by hand while pushing out new updates that change the JavaScript and/or CSS files in production.
What's a good approach to handle this? How do you handle this in your application?
I built a library, Combres, that does exactly that, i.e. minify, combine, etc. It also automatically detects changes to both local and remote JS/CSS files and pushes the latest to the browser. It's free and open source. Check out this article for an introduction to Combres.
I am dealing with the exact same issue on a site I am launching.
I recently found out about a project named SquishIt (see on GitHub). It is built for the Asp.net framework. If you aren't using asp.net, you can still learn about the principles behind what he's doing here.
SquishIt allows you to create named "bundles" of files and then to render those combined and minified file bundles throughout the site.
CSS files can be categorized and partitioned into logical parts (like common, print, etc.), and then you can use CSS's import feature to load them. Reusing these small files also makes client-side caching possible.
When it comes to JavaScript, I think you can solve this problem server side by combining the multiple script files added to the page. You can also generate the script file dynamically on the server, but for client-side caching to work, these parts should have distinct, static addresses.
I wrote an ASP.NET handler some time ago that combines, compresses/minifies, gzips, and caches the raw CSS and Javascript source code files on demand. To bring in three CSS files, for example, it would look like this in the markup...
<link rel="stylesheet" type="text/css"
href="/getcss.axd?files=main;theme2;contact" />
The getcss.axd handler reads in the query string and determines which files it needs to read in and minify (in this case, it would look for files called main.css, theme2.css, and contact.css). When it's done reading in the file and compressing it, it stores the big minified string in server-side cache (RAM) for a few hours. It always looks in cache first so that on subsequent requests it does not have to re-compress.
I love this solution because...
It reduces the number of requests as much as possible
No additional steps are required for deployment
It is very easy to maintain
The only downside is that all the style/script code will eventually be stored in server memory. But RAM is so cheap nowadays that it is not as big of a deal as it used to be.
Also, one thing worth mentioning: make sure that the query string is not susceptible to any harmful path manipulation (only allow A-Z and 0-9).
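For readers outside ASP.NET, here is a rough Node translation of the handler described above; file names come from a strictly whitelisted query string, and the combined output is cached in memory:

const http = require('http');
const fs = require('fs');
const path = require('path');

const cache = {}; // query string -> combined CSS, held in RAM

http.createServer(function (req, res) {
  const files = new URL(req.url, 'http://localhost').searchParams.get('files') || '';
  // Reject anything outside the expected names to block path manipulation.
  if (!/^[A-Za-z0-9;]+$/.test(files)) { res.writeHead(400); return res.end(); }
  if (!cache[files]) {
    cache[files] = files.split(';')
      .map(name => fs.readFileSync(path.join('css', name + '.css'), 'utf8'))
      .join('\n'); // a real handler would minify here as well
  }
  res.writeHead(200, { 'Content-Type': 'text/css' });
  res.end(cache[files]);
}).listen(8080);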
What you are talking about is called minification.
There are many libraries and helpers for different platforms and languages to help with this. As you did not post what you are using, I can't really point you towards something more relevant to yourself.
Here is one project on google code - minify.
Here is an example of a .NET Http handler that does all of this on the fly.
