When serving JavaScript files, is it safe to gzip them by default?

The question fits in the title. I am not interested in what the spec recommends, but in what the mix of browsers currently deployed supports best.
Google Docs gzips their JS.
The Google AJAX Libraries API CDN gzips JS.
Yahoo gzips the JS for their YUI files.
The Yahoo home page gzips their JS.
So I think that the answer to my question is yes, it is fine to gzip JS for all browsers. But you'll let me know if you disagree.

If you gzip your .js (or any other content), two problems may arise: 1. gzip adds latency, and for poorly compressible files the time spent compressing and decompressing may not pay off; 2. an older browser may not understand the gzipped content. To avoid the second problem, examine the Accept-Encoding header, the User-Agent, or other parts of the HTTP request to decide whether the browser supports gzip. Modern browsers have no problems with gzipped content.
An excerpt from http://httpd.apache.org/docs/2.2/mod/mod_deflate.html: At first we probe for a User-Agent string that indicates a Netscape Navigator version of 4.x. These versions cannot handle compression of types other than text/html. The versions 4.06, 4.07 and 4.08 also have problems with decompressing html files. Thus, we completely turn off the deflate filter for them.
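To make the Accept-Encoding check concrete, here is a minimal Node.js sketch of the idea (the file name app.js and the port are placeholders, not from the original answer); a real server would also handle errors and set caching headers:
// Serve a JS file, gzipped only when the client advertises gzip support.
const http = require('http');
const fs = require('fs');
const zlib = require('zlib');

http.createServer((req, res) => {
  const raw = fs.createReadStream('./app.js'); // hypothetical asset
  const acceptEncoding = req.headers['accept-encoding'] || '';

  res.setHeader('Content-Type', 'application/javascript');

  if (/\bgzip\b/.test(acceptEncoding)) {
    // The client declared gzip support in Accept-Encoding.
    res.setHeader('Content-Encoding', 'gzip');
    raw.pipe(zlib.createGzip()).pipe(res);
  } else {
    // Old or unusual client: send the file uncompressed.
    raw.pipe(res);
  }
}).listen(8080);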

No, it's not. Firstly, the browser must declare that it accepts gzip encoding, as described in Supercharging Javascript. On top of that, certain versions of IE6 have broken implementations, which is still an issue if they haven't been patched. More in The Internet Explorer Problem (with gzip encoding).

Related

Is there any comprehensive documentation for what browsers do and don't allow for file: scheme URIs?

If you can avoid it, most people will not use file: URIs because there are just so many quirky rules about what is and isn't allowed, but sometimes it's not up to you: you just have to support loading an HTML file or app from a file: URI.
For those times, is there any comprehensive and up-to-date documentation or guide for what browsers do and don't allow for file: URIs? There is information out there, but it is piecemeal. This question for example explains that you can't load an ES module over a file: URI in Chrome, but you can in Firefox. But what about fetch(), WASM, or other modern web technologies? Maybe they don't work over file:, but do support data:, if you can construct one with the right MIME type?
It would be very useful, for those rare times when you just have to support running a web app from a local file: URI, if there were a guide that laid out what works consistently across all browsers, so that if you stick with those few technologies your app will work. Does anyone know of anything like that? I've tried searching the web and MDN without finding anything, though that could be because most search engines seem to ignore the colon in file:.
Here I found an interesting article. They basically said:
Chromium allows HTML pages served from file:// URIs to load images and scripts from the same path, but Legacy Edge (v18) and Internet Explorer are the only browsers that consider all local-PC file:// URIs to be same-origin, allowing such pages to refer to other HTML resources on the local computer. Other browsers treat file origins as unique, blocking DOM interactions between frames from different local files, etc.
Based on this whatwg1 or whatwg2 or w3:
it is UA-specific ("left as an exercise to the reader")
For Chrome:
file:// schemed URIs do not contain a host component; be sure that your UI accounts for this possibility.
For Mozilla:
file: Host-specific file names
So basically, it is a mess as they have to face security issues.
Other resources:
Stack similar question
Chromium discussion
More documentation
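In the absence of a definitive guide, one pragmatic approach is to feature-detect at runtime instead of relying on documentation. A rough sketch (my own, not from the linked resources) that checks whether fetch() of a sibling resource works from the current page, including when it was opened via a file: URI; data.json is a hypothetical file shipped next to the HTML page:
// Detect whether fetch() of a local sibling resource works from this page.
async function canFetchLocalResources() {
  try {
    const response = await fetch('data.json');
    return response.ok;
  } catch (err) {
    // Chrome, for example, may reject fetch() entirely on file: pages.
    return false;
  }
}

canFetchLocalResources().then((ok) => {
  console.log(ok ? 'fetch() works here' : 'fall back to inlined or data: resources');
});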

Possible reasons for different SRI results in different browsers?

A client would like to use SRI on all CSS and JS assets on their website, but they ran into a very strange issue with Firefox. Their server is an apache2 instance, serving HTML content. CORS is enabled for the whole virtual host, for any (*) origin. There is no cache or CDN in place.
The two files in question are
company.min.css and
company.min.js
To generate the SRI hashes, initially SRI Hash Generator was used. The output from this has multiple algorithms and looks like this:
<script src="https://example.com/static/company.min.js" integrity="sha256-aKuSpMxn15zqbFa0u0FVA7mAFOSVwIwU4gA3U7AZf5Y= sha384-WDAg+qGBjbEyE52SdQ5UHdTObTY+jYTL63m9Oy2IJcGZR8AFn0t9JNN7qdut6DBk sha512-bxmUFj1FVOVV74+zIYau/HSUcLq9cuneBKoMJyW9dAz//yKFi8eRhyyIezF++2vbGO7cR6Pzm1l9rvnPcrvIrg==" crossorigin="anonymous"></script>
similar for the CSS file. These were inserted in the HTML and the site was tested in a few different browsers, with these results:
works in Chrome (/Canary), Opera, Edge and even IE
doesn't work in Firefox (/Nightly).
Firefox only dislikes the CSS, saying that the SHA512 does not match the resource. It processes the JS file fine for whatever reason.
I confirmed (using OpenSSL) that the hash generated by the above tools is indeed correct, and the fact that it works in almost every browser except Firefox got me thinking.
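For reference, the equivalent offline check can be reproduced with Node's crypto module (the file name is a placeholder; the bytes hashed must be exactly the bytes the server delivers):
// Compute SRI values for a locally saved copy of the asset.
// Roughly equivalent to: openssl dgst -sha384 -binary company.min.css | openssl base64 -A
const crypto = require('crypto');
const fs = require('fs');

const data = fs.readFileSync('company.min.css');

for (const algo of ['sha256', 'sha384', 'sha512']) {
  const digest = crypto.createHash(algo).update(data).digest('base64');
  console.log(`${algo}-${digest}`);
}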
So I tried to hash the resources using Mozilla's own SRI tool, srihash.org, which is a recommendation by Mozilla from their blog post on SRI.
Now it gets a bit weird.
For the CSS file, srihash.org generates a completely different hash.
For the JS file, the hash is the same from both generators and matches my offline hashing with OpenSSL.
But, if I replace the CSS link with the Mozilla-generated one, this is the result:
works in Firefox
doesn't work anywhere else, since the hash doesn't match
Question
I suspect this is some problem within Firefox's SRI implementation. The relevant discussion for the implementation is here, but it doesn't give any reason why the resource would hash differently in Firefox. I'm not strictly a web developer, so: are there any likely (documented) reasons for different SRI hashes in different browsers?
I can't disclose the exact server/resources. This is a general question, so if you have any objective experience or references to authoritative sources documenting differences in SRI implementations, please answer.

Should I minify and concatenate javascript and CSS when using HTTP2/SPDY?

Given the advantages of connection reuse and multiplexing in HTTP2 (and SPDY) and the availability of gzip compression, is the effort of adding a minification and concatenation step into a build process justified?
According to Surma from the Chrome team, on H2 you can, and in fact should, stop bundling, because it no longer helps and unbundled files allow more efficient browser caching:
https://www.youtube.com/watch?v=w--PU4HO9SM (time 1:10)
I think that minifying or obfuscating can still be desirable, depending on your needs.
Testing is the only true means of deciding whether to minify and/or concatenate when resources are being served via H2/SPDY.
The idea behind HTTP/2 (H2) is to serve small static resources over a single multiplexed TCP connection. Tests have shown that "most" sites see a speed increase by not concatenating resources (and even by not using a CDN). It all depends on the sizes of the resources being served over H2/SPDY. I have seen one site speed up by 30%+ and others not change at all.
With that in mind, my suggestion is to test by minifying all resources and not concatenating them. I'd also test serving all common resources yourself (not using a CDN), though that also depends on where your clients are.
Resources:
Akamai
Columnist Patrick Stox
HTTP/2 101 (Chrome Dev Summit 2015)
Yes, you still need to minify and concatenate JS and CSS files, for the following reasons:
Script minifying and SPDY compression are not the same thing. A good minifier takes advantage of local scope and replaces verbose variable names with short, repetitive names that are compression-friendly (see the sketch after this list).
SPDY multiplexes your requests so you don't have to stitch the scripts together, but not all browsers support SPDY.
SPDY 2 and 3 are binary incompatible. When a browser supports 2 and the server advertises 3, the connection falls back to HTTP 1.1 over SSL; there's no SPDY benefit at all.
Loading 10 files over one connection still incurs 10 fetches on the server side. Combining the files reduces disk I/O.
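To illustrate the local-scope point from the first item, here is a hand-written before/after sketch of what a minifier typically does; a real tool such as UglifyJS or Terser would also strip whitespace and comments:
// Before minification: descriptive local names, comments, whitespace.
function totalPrice(items) {
  var runningTotal = 0;
  for (var index = 0; index < items.length; index++) {
    runningTotal += items[index].price * items[index].quantity;
  }
  return runningTotal;
}

// After minification: local names are shortened and repeat often, which also
// compresses well; the public name (totalPrice) is preserved.
function totalPrice(t){var r=0;for(var n=0;n<t.length;n++)r+=t[n].price*t[n].quantity;return r}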
Your question is comparable to "can I care less about writing efficient code now that the machine can run faster?"
The answer is NO. Don't be lazy. Code properly.

GZip with Mobile Browsers

I'm targeting a couple of web projects at mobile users, and noticed that some of the standard tools (JS libraries, JSON transfers, XML etc.) are quite heavy for mobile data plans.
I'd like to be able to serve gzipped resources, probably via mod_deflate/mod_gzip, to try to reduce the amount of bandwidth used by these devices.
However, I don't know how widespread support for gzipped JavaScript, gzipped HTML etc. is on mobile devices, or even whether it is common practice. It seems to make sense, though.
Is it OK to rely on for the common mobile devices? iPhone, Android, BlackBerry, Windows Mobile/Opera?
Thanks.
I don't think it matters: a browser will only request gzipped data if it supports it, so your server will only gzip content when the browser asks for it.
As far as I know, most of them support it, and if you configure your server well it will be able to send non-compressed resources when needed.
Another benefit is that you improve caching, as some devices such as the iPhone have a limit of 25 KB per item for content to be cached.
So the short answer is: Just Do It
mod_deflate / mod_gzip will check the client's Accept-Encoding header and turn compression on or off accordingly.
Just turn it on in your server, and make sure your js and css resources get compressed as well. You can use Firebug's "Net" tab to check whether compression was applied to the loaded resources.
If compression is missing for certain file types, check out this question for how to turn it on.
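If you prefer checking from the command line rather than Firebug, a small Node.js script (the URL is a placeholder) can request a resource with Accept-Encoding: gzip and report what the server actually sent back:
// Quick check: does the server gzip this resource when asked to?
const https = require('https');

const url = 'https://example.com/static/app.js'; // replace with one of your assets

https.get(url, { headers: { 'Accept-Encoding': 'gzip' } }, (res) => {
  console.log('Status:', res.statusCode);
  console.log('Content-Encoding:', res.headers['content-encoding'] || '(none)');
  console.log('Content-Type:', res.headers['content-type']);
  res.resume(); // discard the body; only the headers matter here
}).on('error', (err) => console.error(err));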
Go for it - the gzipped version will only be sent if a browser sends Accept-Encoding: gzip (and the modules check for this automatically). (See the relevant part of RFC 2616.)
(the usual warning applies - some browsers are broken. For example, IE6 advertises gzip-capability but doesn't actually support it properly. For mobile browsers, I haven't encountered such brokenness yet - so far every mobile browser that advertised gzip supported it)

Microsoft CDN for jQuery or Google CDN? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Does it actually matter which CDN you use to link to your jQuery file, or any JavaScript file for that matter? Is one potentially faster than the other? What other factors could play a role in which CDN you decide to use? I know that Microsoft, Yahoo, and Google all have CDNs now.
Update based on comments:
Short version: it doesn't matter much, but it may depend on what they host. They all host different things: Google doesn't host jQuery.Validate; Microsoft did not host jQuery UI (since 2016 they do); Microsoft offers the scripts that would otherwise be served via ScriptResource.axd, plus easier integration (e.g. ScriptManager with ASP.NET 4.0).
Important note: if you're building an intranet application, stay away from the CDN approach. It doesn't matter who's hosting it; unless your internal server is very overloaded, no CDN will give you more performance than local 100 Mb/1 Gb Ethernet will. If you use a CDN for a strictly internal application you're hurting performance. Set your cache expiration headers correctly and forget CDNs exist in the intranet-only scenario.
The chances of either being blocked seem to be about equal: almost zero. I have worked on contracts where this isn't true, but it seems to be an exception. Also, since the original posting of this answer, the context surrounding it has changed greatly; the Microsoft CDN has made a lot of progress.
The project I'm currently on uses both CDNs, which works best for our solution. Several factors play into this. Users with an older browser are still probably limited to 2 simultaneous requests per domain, as recommended by the HTTP specification. This isn't an issue for anyone running anything decently new that supports pipelining (every current browser), but using both domains knocks out this limitation as well, at least as far as the JavaScript goes.
Google's CDN we're using for:
jquery.min.js
jquery-ui.min.js.
Microsoft's CDN we're using for:
MicrosoftAjax.js
MicrosoftAjaxWebForms.js (until 4.0 we're not completely removing all UpdatePanels)
jQuery.Validate.min.js
Our server:
Combined.js?v=2.2.0.6190 (Major.Minor.Iteration.Changeset)
Since part of our build process is combining and minifying all custom javascript, we do this via a custom script manager that includes the release or debug (non-minified) versions of these scripts depending on the build. Since Google doesn't host the jQuery validation package, this can be a down-side. MVC is including/using this in their 2.0 release, so you could rely completely on Microsoft's CDN for all your needs, and all of it automatic via the ScriptManager.
The only other argument to be made would be DNS lookup times; there is a cost to this in terms of page load speed. On average, simply because it's used more (it's been around longer), ajax.googleapis.com is likely to be resolved sooner than ajax.microsoft.com, because the local DNS server is more likely to already have it cached (this is a first-user-in-the-area penalty). This is a very minor thing and should only be considered if performance is extremely important, down to the millisecond.
(Yes: I realize this point is contrary to my using both CDNs, but in our case the DNS time is far overshadowed by the wait time on the javascript/blocking that occurs)
Last, if you haven't looked at it, one of the best tools out there is Firebug, along with some plug-ins for it: Page Speed and YSlow. If you use a CDN but your pages are requesting images every time because of missing cache headers, you're missing the low-hanging fruit. Firebug's Net panel can give you a quick breakdown of your page load time, and Page Speed/YSlow can offer some good suggestions to help.
You should absolutely use the Google CDN for jQuery (and this is coming from a Microsoft-centric developer).
It's simple statistics. Those who would consider using the MS CDN for jQuery will always be a minority. There are too many non-MS developers using jQuery who will use Google's and wouldn't consider using Microsoft's. Since one of the big wins with a public CDN is improved caching, splitting usage among multiple CDNs decreases the potential for that benefit.
Google will send you a jQuery version minified with their own software; this version is 6 KB lighter than the standard minified version served by MS. Go for Google.
One minor thing to consider is that both companies offer slightly different "extra" libraries:
Microsoft is offering the JQuery validation library on their CDN, whereas Google is not (http://www.asp.net/ajaxlibrary/cdn.ashx)
Google is offering the JQuery UI library on their CDN, whereas Microsoft is not (http://code.google.com/apis/ajaxlibs/documentation/)
Depending on your needs, this may be relevant.
It should also be noted that, as ajax.microsoft.com is a subdomain of microsoft.com, requests send all microsoft.com cookies, adding to the overall time it takes to get the file back.
Also, ajax.microsoft.com is using default IIS7 compression which is inferior to the standard compression that other web servers use.
http://ajax.microsoft.com/ajax/jquery/jquery-1.4.4.min.js - 33.4K
http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js - 26.5K
Also, as others have mentioned, the Google CDN is far more popular, which greatly increases the chance of a file already being cached.
So I strongly recommend using Google.
It probably doesn't matter, but you could validate this with some A/B testing. Send half of your traffic to one CDN, and half to the other, and set up some profiling to measure the response. I would think it more important to be able to switch easily in case one or the other had some serious unavailability issues.
I know I'm chiming in a little late here, but here is the code that I've been using in production. I've never had issues with it, but your mileage may vary. Make sure you test it in your own environment.
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js" type="text/javascript"></script>
<script type="text/javascript">
!window.jQuery && document.write('<script src="/scripts/jquery-1.4.2.min.js"><\/script>')
</script>
<script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.4/jquery-ui.min.js" type="text/javascript"></script>
<script type="text/javascript">
!window.jQuery.ui && document.write('<script src="/scripts/jquery-ui-1.8.2.min.js"><\/script>')
</script>
Is one potentially faster than the other?
I was actually curious about this myself, so I set up a jsbin test page using each of the following and then ran it through webpagetest.org's visual comparison tool. I tested:
ajax.googleapis.com
code.jquery.com
ajax.aspnetcdn.com
cdnjs.cloudflare.com
Who was fastest: code.jquery.com by 0.1 second in both tests
Who was slowest: ajax.aspnetcdn.com by 0.7 seconds in first test and ajax.googleapis.com by 1 second in second test
Here's the 1st test (each was tested 3 times):
Video: http://www.webpagetest.org/video/view.php?id=121019_16c5e25eff2937f63cc1714ed1eac814794e62b3
Reports: http://www.webpagetest.org/video/compare.php?tests=121019_D2_KF0,121019_9Q_KF1,121019_WW_KF2,121019_9K_KF3
Here's the 2nd test (another 3 each):
Video: http://www.webpagetest.org/video/view.php?id=121019_a7b351f706cad2c25664fee7ef349371f17c4e74
Reports: http://www.webpagetest.org/video/compare.php?tests=121019_MP_KJN,121019_S6_KJP,121019_V9_KJQ,121019_VY_KJR
As stated by Pingdom:
When someone visits your site, if they have already visited another site that uses the same jQuery file on the same CDN, the file will have been cached and doesn’t need to be downloaded at all. It can’t get any faster than that.
This means that the most widely used CDN will have the odds on its side, which can pay off for your site.
A few observations on performance:
Google’s CDN is consistently the slowest of the three both in North America and Europe. In Europe, Microsoft’s CDN is the fastest.
I think it depends on where your targeted audience is. You can use alertra.com to check both CDNs' speed from many locations around the world.
One additional consideration - if your site is SSL and you need to support Android 2.1 (or earlier), the SSL certificate on the HTTPS version of the Microsoft CDN will crash those versions of the Android browser, per this issue: http://code.google.com/p/android/issues/detail?id=5001. It's not Microsoft's "fault", as the SSL certificate is technically valid and the defect is in Android's SSL implementation... but it will crash your site, nonetheless.
The SSL cert on Google's CDN does not fall afoul of this particular issue (relating to the certificate's "Certificate Subject Alt Name").
So, for SSL + Android 2.1 support, use the Google CDN.
My answer is a bit different from the others: I would go with Microsoft if you need the jQuery validator, which almost everyone needs if they are using jQuery.
The Microsoft CDN's HTTP connection uses Keep-Alive, which is a big plus when you are requesting multiple items.
So if you need jQuery validation, use the Microsoft CDN; even if you need jQuery UI, use Microsoft, because Google is not keeping the connection alive, so every request stands on its own, and serving everything over the one kept-alive connection is the plus. If you use Microsoft only for the validator, you are making a separate connection to Google's servers for each of the other requests.
In the summary it says that Microsoft is not offering jQuery UI; that is no longer correct. It can be downloaded from http://www.asp.net/ajaxlibrary/cdn.ashx.
Also consider, when using the Google CDN, that people sometimes make typos such as ajax.googelapis.com. This could potentially create a really nasty XSS (cross-site scripting) attack. I have actually tested this out by registering a googlapis.com typo and very quickly found myself serving requests for JavaScript, maps, CSS etc.
I emailed Google and asked them to register similar CDN typo URLs but have not heard back. This could be a real reason not to rely on CDNs, because potentially dangerous attackers may be awaiting the typo requests and can easily serve back jQuery etc. with an XSS payload.
Thank you
Depending on which industry the application targets, you may not want to use a CDN managed by another organisation. It often raises issues regarding compliance, privacy and confidentiality.
For example, when you include Google Analytics in a secure application, the browser still sends the current URL as the "referer" header. Any identifiers, say a session id or secret token, may appear in their logs. For example, if a client at IP 192.0.2.5 references https://healthsystem.example/condition/impotence, then, well, you can infer information which is considered to be rather private.
Other cases include information of consequence, such as an account number, social security number or session information in the URL. That sort of data should never be in the URL as it can be used outside of the application.
While you may trust Google, Microsoft or Yahoo, your users may not.
For industries like Finance, Legal and Health Care, you may want to establish your own CDN with the help of a vendor (e.g. Akamai) with which you can sign a BAA.
I would advise that you base your usage on the general location of the users you're targeting.
If your site is targeted for general public, then using Google's CDN would be a good choice.
If your site is also targeted at China, then using Microsoft's CDN would be a better choice.
I know this from experience, as Google's servers kept getting blocked by the Chinese government, rendering websites that use them unloadable.
Note that you can of course create region-specific sites, e.g. cn.mysite.com, to cater specifically for China, but if you're low on resources and time, this is worth considering.
Full list of Microsoft CDN here.
http://www.asp.net/ajaxlibrary/cdn.ashx
They have since renamed to ajax.aspnetcdn.com, which reduces the likelihood of blockage by firewall rules.
I would use both!
As the Google jQuery hosting has been around a lot longer, the chances are much higher that people will already have it cached compared to the Microsoft one, so I would have it first.
Personally, I would use something like this -
if (typeof jQuery == 'undefined') {
    // jQuery is not loaded - fall back to the Google-hosted copy
    document.write("<scr" + "ipt type=\"text/javascript\" src=\"http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js\"></scr" + "ipt>");
} else {
    // jQuery is loaded
}
(Not sure this works 100%, but I was just trying to sketch the idea rather than a complete example - this references the Google-hosted jQuery and not the Microsoft one, as I couldn't find the link.)
