GZip with Mobile Browsers - javascript

I'm targeting a couple of web projects at mobile users, and I've noticed that some of the standard tools (JS libraries, JSON transfers, XML, etc.) are quite heavy for mobile data plans.
I'd like to serve gzipped resources, probably via mod_deflate/mod_gzip, to reduce the amount of bandwidth these devices use.
However, I don't know how widespread support for gzipped JavaScript, gzipped HTML, etc. is on mobile devices, or whether it is common practice at all. It seems to make sense, though.
Is it safe to rely on for the common mobile devices: iPhone, Android, BlackBerry, Windows Mobile/Opera?
Thanks.

I don't think it matters: a browser will request gzipped data only if it supports it, so your server will only gzip a response if the browser asks for it.

As far as I know, most of them support it, and if you configure your server well it will still send uncompressed resources when needed.
Another benefit is improved caching: some devices, such as the iPhone, will only cache content up to about 25 KB.
So the short answer is: Just Do It

mod_deflate / mod_gzip will check the client's Accept-Encoding header and turn compression on or off accordingly.
Just turn it on on your server, and make sure your JS and CSS resources get compressed as well. You can use Firebug's "Net" tab to check whether compression was applied to the loaded resources; a quick script, sketched below, works too.
If compression is missing for certain file types, check out this question for how to turn it on.
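
If you'd rather check from code than from Firebug, here is a minimal sketch (Node.js; the URL is illustrative) that requests a resource with Accept-Encoding: gzip and reports whether the server actually compressed it:

    const https = require('https');

    // Ask for gzip and report what the server actually sent back.
    https.get(
      'https://example.com/app.js', // hypothetical resource to test
      { headers: { 'Accept-Encoding': 'gzip' } },
      (res) => {
        console.log('Content-Encoding:', res.headers['content-encoding'] || '(none)');
        res.resume(); // drain the body so the socket can close
      }
    );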

Go for it - the gzipped version should only be sent if the browser sends Accept-Encoding: gzip, and the modules check for this automatically. (See the relevant part of RFC 2616.)
(The usual warning applies - some browsers are broken. For example, IE6 advertises gzip capability but doesn't actually support it properly. For mobile browsers I haven't encountered such brokenness yet; so far, every mobile browser that advertised gzip supported it.)
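
To make the negotiation concrete, here is a minimal sketch (Node.js; the port and body are illustrative, and a real Apache setup with mod_deflate does this for you) of a server that compresses only when the client advertises gzip support:

    const http = require('http');
    const zlib = require('zlib');

    const body = '<html><body>hello</body></html>';

    http.createServer((req, res) => {
      // Compress only when the client advertises gzip support.
      const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');
      if (acceptsGzip) {
        res.writeHead(200, { 'Content-Type': 'text/html', 'Content-Encoding': 'gzip' });
        res.end(zlib.gzipSync(body));
      } else {
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end(body);
      }
    }).listen(8080);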

Related

How to multithread a download in client-side javascript

I have very large (50-500 GB) files for download, and the single-threaded download offered by the browser engine (Edge, Chrome, or Firefox) is a painfully slow user experience. I had hoped to speed this up by using multithreading to download chunks of the file, but I keep running into browser sandbox issues.
So far the best approach I've found is to download all the chunks, stuff them into localStorage, and then save the result as a blob, but I'm concerned about the soft limits on storing that much data locally (as well as the performance of stitching all the data back together).
Ideally, someone has already solved this (and my search skills weren't up to the task of finding it). All I have found so far are server-side solutions (which have straightforward file system access). Alternatively, I'd like another approach that is less likely to trip browser security or limit dialogs and more likely to provide the performance my users are seeking.
Many thanks!
One cannot. Browsers intentionally limit the number of connections to a website. Getting around this limitation with today's browsers requires a plugin or some other means of escaping the browser sandbox.
Worse, because of the lack of direct file system access, the data from multiple downloads has to be cached and then reassembled into the final file, instead of having multiple writers to the same file (and letting the OS cache handle optimization).
TL;DR: Although it is possible to have multiple download threads, the maximum is low (typically around four to six per host), and the data has to be handled repeatedly. Use a plugin or an actual download program such as an FTP client or curl.
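
For what it's worth, here is a rough sketch of the in-browser approach discussed above - parallel Range requests stitched into a Blob. It assumes the server supports Range requests and exposes Content-Length; the chunk size is illustrative, and the browser's per-host connection limit still applies:

    // Download a file in parallel chunks and stitch them into one Blob.
    async function parallelDownload(url, chunkSize = 8 * 1024 * 1024) {
      const head = await fetch(url, { method: 'HEAD' });
      const total = Number(head.headers.get('Content-Length'));
      const jobs = [];
      for (let start = 0; start < total; start += chunkSize) {
        const end = Math.min(start + chunkSize, total) - 1;
        jobs.push(
          fetch(url, { headers: { Range: 'bytes=' + start + '-' + end } })
            .then((res) => res.blob())
        );
      }
      // Requests run concurrently, but only up to the per-host limit.
      const parts = await Promise.all(jobs);
      return new Blob(parts);
    }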

Scan and access local file directory in Firefox and IE?

I'm doing some research on whether it's possible for a web app (meant to be used and distributed internally) to scan and read files from a local directory (on the user's machine). I came across a couple of terms, as follows:
NPAPI: no longer supported by the majority of web browsers
ActiveX: IE only
Sandboxing: Chrome uses this kind of technology; it doesn't fit the requirement, so I have to look elsewhere
I feel like ActiveX might be the only option, even though I haven't actually written an ActiveX control before (so I'm not sure it's possible).
The goal is also to support more than one web browser, so besides IE I thought Firefox might be capable of meeting the requirement, since no search result so far has said otherwise.
Could someone please give me some pointers? I just need to know whether it's at all possible to build an ActiveX control or Firefox extension that scans and reads files from a local directory. If it is, what are the downsides other than the security vulnerabilities?

Developing a cross-platform self-contained HTML application

I am thinking of building an application, kind of like TiddlyWiki in the sense that everything is self-contained in an HTML file, or at least in a bundle that a user won't have to install anything to use. TiddlyWiki works in just about any browser, and on mobile phones (Android and iPhone); in some browsers (e.g. Firefox) it manages to save to the local filesystem without a plugin (albeit with many security warnings, though there are other solutions for that), while other browsers use a Java plugin to bypass the restriction.
Are there any technologies that make this possible? HTML5's Web Storage sounds almost perfect, except that the data would be tied to the browser.
Any assistance would be appreciated (even if that just means editing / retagging the question to get more folks looking).
What about the File API? http://caniuse.com/#search=fileapi
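
As a rough illustration of a plugin-free save (a common workaround for the filesystem restriction mentioned in the question; the function and file names here are hypothetical), you can build a Blob and hand it to the browser's download mechanism:

    // Offer the given text as a file download via a temporary object URL.
    function saveAsFile(text, filename) {
      const blob = new Blob([text], { type: 'text/html' });
      const url = URL.createObjectURL(blob);
      const a = document.createElement('a');
      a.href = url;
      a.download = filename || 'wiki.html';
      a.click();
      URL.revokeObjectURL(url);
    }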
I am just adding a relevant comment here, not exactly an answer...
Since you say you want to develop an application that contains everything, I would mention Titanium, PhoneGap, and others (e.g. Corona).
These tools provide a JavaScript base that runs on mobile devices (for mobile applications), desktops (for desktop applications), and so on. Titanium (which I work with) builds on the native SDKs of the target platforms.
As for TiddlyWiki, what I understand from the link is that it creates a web application, or something like one, that works across mobile devices. But that is not always a good fit, since some applications need a native environment (which Titanium supports); native applications will be much faster than anything else developed this way.

How can I check gzip decoding time in the web browser?

I want to check the performance of gzip decoding in a web browser.
In Java or C#, we can easily measure the gzip decoding time, but I cannot measure the decoding time in the web browser.
Please help me.
I want to check the decoding speed of gzipped HTML files. Can I measure that performance with JavaScript?
In open source browsers like Chromium or Firefox you could have a look at the source code and insert some lines to record the time needed for decoding. If the browser uses a specific library for gzip decoding, you can of course also download that library and test it.
I don't think there's a way to get that time, especially from your own JavaScript code: I suppose the decompression is done at a much lower level (somewhere around the download/network layer of the browser) than the rendering of the page or the execution of your JavaScript code.
That is arguably the best design: compression should be totally transparent to the upper layers. When rendering pages or executing JS code, there is absolutely no need for the browser to know that the content was received in compressed form.
Maybe a solution, especially with Firefox, would be to develop an extension?
Considering that Firebug, for example, is able to hook into the network layer to display the information shown in its "Net" tab, I suppose you might be able to do something similar.
In Google Chrome, you can check out the various properties of chrome.loadTimes().
You can do it using Fiddler - it displays the timing of each HTTP transaction. Because Fiddler is a debugging proxy, it works with any browser. Windows only, though.
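
If an approximate, pure-JavaScript measurement is enough, another option (my suggestion, not from the answers above) is to fetch the raw gzipped bytes yourself and time a userland decoder such as pako (https://github.com/nodeca/pako). This measures the library rather than the browser's native decoder, and it assumes pako is loaded on the page and the URL serves a .gz file without a Content-Encoding header:

    // Time how long pako takes to gunzip the raw bytes of a resource.
    async function timeGunzip(url) {
      const buf = await (await fetch(url)).arrayBuffer();
      const t0 = performance.now();
      const html = pako.ungzip(new Uint8Array(buf), { to: 'string' });
      const t1 = performance.now();
      console.log('decoded ' + html.length + ' chars in ' + (t1 - t0).toFixed(1) + ' ms');
      return t1 - t0;
    }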

When serving JavaScript files, is it safe to gzip them by default?

The question is in the title. I am not interested in what the spec recommends, but in what the mix of browsers currently deployed supports best.
Google Docs gzips their JS.
The Google AJAX Libraries API CDN gzips JS.
Yahoo gzips the JS for their YUI files.
The Yahoo home page gzips their JS.
So I think that the answer to my question is yes, it is fine to gzip JS for all browsers. But you'll let me know if you disagree.
If you gzip your .js (or any other content), two problems may arise: (1) gzip adds latency for incompressible files (it takes time to compress and decompress), and (2) an older browser may not understand the gzipped content. To avoid problem 2, you should examine the Accept-Encoding and User-Agent headers (or other parts of the HTTP request) to guess whether the browser supports gzip. Modern browsers should not have problems with gzipped content.
An excerpt from http://httpd.apache.org/docs/2.2/mod/mod_deflate.html: "At first we probe for a User-Agent string that indicates a Netscape Navigator version of 4.x. These versions cannot handle compression of types other than text/html. The versions 4.06, 4.07 and 4.08 also have problems with decompressing html files. Thus, we completely turn off the deflate filter for them."
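
For reference, the configuration that excerpt describes (taken from the same mod_deflate documentation page) looks like this:

    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html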
No, it's not. First, the browser must declare that it accepts gzip encoding, as per Supercharging Javascript. On top of that, certain versions of IE6 have broken implementations, which is still an issue if they haven't been patched. More in The Internet Explorer Problem (with gzip encoding).
