How to compress JavaScript in real time? - javascript

I was wondering if there is a way to compress JavaScript in real time, much like gzip works for HTML (and CSS, apparently)?
I don't want to have to compress my file manually before every upload; I want the server to do it for me without any added work from lazy coders.

gzip works on all text, including JavaScript
If you want to do more compression (e.g. by using YUI compressor before gzipping) then you could get your server to do it, but it would be easier to do it before uploading.
The trick is to automate the build and publish process: you run a script that does the compression and then uploads the result, rather than dragging and dropping files manually.
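For example, a small publish script along those lines could look like this in PHP (just a sketch: it assumes a minifier such as uglifyjs is installed on your PATH, and the FTP host, credentials, and paths are placeholders):
<?php
// Minify the source file, then upload the result in one step.
$source   = 'js/app.js';
$minified = 'build/app.min.js'; // the build/ directory must already exist
exec(sprintf('uglifyjs %s -o %s', escapeshellarg($source), escapeshellarg($minified)), $output, $status);
if ($status !== 0) {
    exit("Minification failed\n");
}
$conn = ftp_connect('ftp.example.com') or exit("Could not connect\n");
ftp_login($conn, 'user', 'password'); // placeholder credentials
ftp_put($conn, 'public_html/js/app.min.js', $minified, FTP_BINARY);
ftp_close($conn);
Run that after each change and the compressed file is published without any manual steps.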

I suggest that you write a small script that uploads the file automatically; then you can compress it before the upload.
The other option is to tell your web server to compress files before transfer (but it should do that automatically).
Another option is a cron job (see cron(1) or the Windows scheduler) on the server which checks the file periodically (say once a day) and compresses it when a new version has been uploaded.
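A sketch of that cron-driven approach in PHP (the directory path is an assumption; it gzips any .js file whose compressed twin is missing or older than the source):
<?php
// Re-compress any JavaScript file that changed since its .gz was made.
foreach (glob('/var/www/js/*.js') as $src) {
    $gz = $src . '.gz';
    if (!is_file($gz) || filemtime($gz) < filemtime($src)) {
        file_put_contents($gz, gzencode(file_get_contents($src), 9));
    }
}
Schedule it once a day with cron or the Windows scheduler, as suggested above.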

On Apache, this compresses everything except images:
<IfModule mod_deflate.c>
# Insert filter
SetOutputFilter DEFLATE
# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape, but it is fine
# BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
# Don't compress images
SetEnvIfNoCase Request_URI \
\.(?:gif|jpe?g|png)$ no-gzip dont-vary
# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary
</IfModule>

If you mean "compress" as in "maintain the code in exactly the same format but push it down the pipe compressed", then enable gzip compression as per the instructions for your server, which you haven't specified in the tags/question.
If you mean "compress" as in "minify the code", then there are options that you can try, like including a minification module/process in the pipeline. I've, very recently, created a solution in ASP.net that uses the Microsoft Ajax Minifier 4.0 to minify javascript files requested, on the fly basically by adding script tags to the page with a src tag that is similar to minifier.ashx?source=my/javascript/file/and/path/here.js and using the Minifier class in AjaxMin.dll to minify the code on demand. This has several advantages over pre-minification:
You can pass a flag into your live site (via a cookie, a querystring value, a given username) to disable minification, which makes debugging that little bit easier if you need to resolve an issue "on live"
You can make emergency changes to your live code without having to change your unminified code, minify it, and upload it (obviously not something you should do regularly, but it's nice that the ability is there if need be)
There's one less set of build artifacts (the minified JS files) to worry about keeping track of, source-controlling, etc.
Disadvantages:
The minification adds some overhead to script requests. This can be mitigated by caching the minification result in memory or on disk
I'm not sure if the license for AjaxMin.dll permits this kind of use
If you're using Visual Studio or another tool that provides IntelliSense for scripts, it may no longer be able to provide it from your script tags
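A hypothetical PHP analogue of that on-demand pattern, with a simple disk cache to offset the per-request overhead mentioned above (minify.php and the uglifyjs binary are my assumptions, not part of the original ASP.net solution):
<?php
// minify.php?source=file.js -- minify on demand, cached by file mtime.
$file = 'js/' . basename($_GET['source'] ?? '');
if (!is_file($file)) {
    http_response_code(404);
    exit;
}
$cache = sys_get_temp_dir() . '/min_' . md5($file . filemtime($file)) . '.js';
if (!is_file($cache)) {
    // Shell out to a minifier; any equivalent tool would do.
    exec(sprintf('uglifyjs %s -o %s', escapeshellarg($file), escapeshellarg($cache)));
}
header('Content-Type: application/javascript; charset=utf-8');
readfile($cache);
A debug flag (e.g. a cookie check before the minification step) would give you the first advantage listed above.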

Related

Nginx: I want to http2_push all js and css files in a directory, how can I do it?

Files are created and deleted dynamically, and their names change over time because of the CMS plugin's CSS/JS minification process. How can I get Nginx to push all the .js and .css files in a directory?
I tried:
index index.php;
http2_push 'path/to/files' *min.css; #not working
http2_push 'path/to/files' *min.js; #not working
http2_push 'path/to/file' favicon.ico; #works fine
Forgive my language, I'm not a native English speaker. Thanks for your time.
Update: After searching for a solution to the point of boredom, I decided to go the long way: I modified the base plugin into a custom one that creates files with a fixed name every time instead of a dynamic one. I removed the text strings that carry that information by dropping $ctime and $hash from the generation of the static file.
index index.php;
http2_push 'path/to/files' static-name.min.css; #working
http2_push 'path/to/files' static-name.min.js; #working
http2_push 'path/to/file/' *.min.js; # still doesn't work, but it no longer matters; thanks for the answers.
Get PHP to do it for you.
First of all set up the following config in Nginx:
http2_push_preload on;
Then get PHP to send preload link HTTP headers in the response to index.php:
header('Link: </styles/file.css>; rel=preload; as=style');
Nginx will then use the preload HTTP headers as instructions to send HTTP/2 push requests.
This assumes your PHP code either knows the files you want to push or can find out.
Using preload hints also means that HTTP/1.1 requests get them too, which tells the browser to request these resources ASAP, even before parsing the returned HTML.
The main downsides of this option are that 1) you can't do this for static resources (e.g. if using index.html instead of index.php) and 2) pushing won't start until the index.php response is ready. For the latter, HTTP status 103 Early Hints would allow a quick response, but I can't find anything to suggest that Nginx supports this relatively new HTTP header yet.
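For the original question (pushing every minified file in a directory), PHP can emit one Link header per file. A sketch, assuming an /assets directory under the document root (the path and URL layout are assumptions):
<?php
// Send one preload Link header per minified asset; with
// http2_push_preload on, Nginx turns each one into an HTTP/2 push.
$dir = $_SERVER['DOCUMENT_ROOT'] . '/assets';
foreach (glob($dir . '/*.min.{css,js}', GLOB_BRACE) as $path) {
    $url = '/assets/' . basename($path);
    $as  = substr($path, -4) === '.css' ? 'style' : 'script';
    header(sprintf('Link: <%s>; rel=preload; as=%s', $url, $as), false);
}
The false second argument to header() appends each Link header instead of replacing the previous one.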

Special characters from .js file not appearing correctly initially

I deployed a .war file through IntelliJ Idea on a Tomcat server.
I noticed the character "ä" was not displayed properly in some places, while the same character was displayed correctly elsewhere. I found out that only the special characters hard-coded in my .js files were affected.
I tried to set all my .js files to UTF-8 in IntelliJ, and I also changed all default encoding settings to UTF-8, but the error didn't go away.
All my .js files are bundled into one index.js file using webpack, but I don't know exactly how, because the project was initially set up by someone else.
I recently made a new interesting observation:
When I first open up a browser (tested with Firefox and Chrome) the character is displayed incorrectly.
On a regular reload (F5) nothing changes, but when reloading with CTRL + F5 it's suddenly displayed correctly.
This really confused me... does anyone have an idea what might be going on here?
I used to have the same problem with my Java files, but that was fixed after changing the encoding in my Gradle build file.
Ultimately my question is:
What do you think I should change so that the special characters are always displayed correctly?
I had a similar problem after a Tomcat update on a Windows server: the JavaScript content showed corrupted characters on the browser side.
The HTTP headers were correct, so I investigated a bit further.
On the server, the JavaScript files were saved in UTF-8 without a BOM.
With Wireshark, I saw that the character 'é' (C3-A9 in the UTF-8 encoded file) was transmitted as C3-83-C2-A9. That means Tomcat was reading the UTF-8 file as if it were ANSI and helpfully re-encoding it to UTF-8, double-encoding the character!
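You can reproduce that double encoding with a couple of lines of PHP (purely illustrative; requires the mbstring extension):
<?php
// A UTF-8 'é' (0xC3 0xA9) read as Latin-1/ANSI and re-encoded to UTF-8
// becomes 0xC3 0x83 0xC2 0xA9 -- exactly the bytes seen in Wireshark.
$utf8 = "\xC3\xA9";
echo bin2hex(mb_convert_encoding($utf8, 'UTF-8', 'ISO-8859-1')), "\n"; // c383c2a9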
So I just added the BOM to the saved files and that fixed the bug. (Note: it is easy to add a BOM with Notepad++.)
But I didn't want to update all the files on the server; I wanted Tomcat to read UTF-8 correctly.
The easy fix is to define the file encoding in the tomcat web.xml like this:
<servlet>
    <servlet-name>default</servlet-name>
    <servlet-class>org.apache.catalina.servlets.DefaultServlet</servlet-class>
    <init-param>
        <param-name>debug</param-name>
        <param-value>0</param-value>
    </init-param>
    <init-param>
        <param-name>listings</param-name>
        <param-value>false</param-value>
    </init-param>
    <!-- add the settings here -->
    <init-param>
        <param-name>fileEncoding</param-name>
        <param-value>utf-8</param-value>
    </init-param>
    <!-- end of the added settings -->
    <load-on-startup>1</load-on-startup>
</servlet>
This really confused me... does anyone have an idea what might be going on here?
Caching. Ctrl+F5 tells the browser to reload the resource even if it is cached; F5 will reuse the cached copy if there is one.
What do you think I should change so that the special characters are always displayed correctly?
You may have already done it given the F5/Ctrl+F5 thing above.
Basically, ensure that:
The files (.js, .html, etc.) are stored in the correct encoding and, when viewed with that encoding, show the characters correctly. I strongly recommend using the same encoding for every type of file; although it's theoretically possible to use UTF-8 for JavaScript files and (say) Windows-1252 for HTML files, that's just asking for complexity and hassle.
Every step in the pipeline correctly identifies the encoding being used for the files. That means (for instance) that Tomcat needs to include the header Content-Type: application/javascript; charset=utf-8 or similar for your .js files. (text/javascript; charset=utf-8 will also work, but is obsolete.) For HTML files, though, the W3C recommends including the meta header and omitting the charset from Content-Type.
Your HTML files identify the encoding in a meta tag near the top of head (within the first 1024 bytes) as well: <meta charset="UTF-8">. The W3C provides several reasons (same link as the bullet above) for doing this, such as the file still declaring its encoding when saved locally and opened (thus without an HTTP header), making it clear to human and machine readers, etc.

File versions using ?v=(file version)

I have seen some sites add a version parameter to their .css and .js files, like style.css?v=60, meaning the file is at version 60. The sites I have seen doing this cache their files far into the future.
Do I need to add any new code to my files, or can I just update my code and then, when a change is made, change the version parameter ?v=?
If you want to employ this technique, then the first step is to set your web-server to send far-future expires headers, for certain file types (CSS, JS, images, etc).
The next step is to invalidate them, by using either a query-string, or a version-name.
?v=1.2.12
or
js/my-lib/my-file-1.2.12.js
or
js/my-lib/1.2.12/my-file.js
Any of those will work.
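As an illustration, a small PHP helper can generate the query-string variant automatically (a sketch; the helper name is made up, and the version is derived from the file's mtime so it changes on every deploy):
<?php
// Append a cache-busting version based on the file's last change.
function versioned(string $path): string {
    return $path . '?v=' . filemtime($_SERVER['DOCUMENT_ROOT'] . $path);
}
?>
<script src="<?= htmlspecialchars(versioned('/js/my-file.js')) ?>"></script>
With far-future expires headers in place, browsers cache aggressively, and the changed URL forces a fresh fetch after each deploy.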

ETag or hash in resource URLs, e.g. js, css, images

I want to make sure the resources in my web application (CSS, JS, and images) are refreshed every time I change them, while keeping the same resource name (so that I don't need to change references to them in the HTML). So I guess there are at least two obvious solutions:
Include an ETag in the response header
Include a hash in the URL (i.e. /css/{hash}/style.css)
I liked the second idea better, because maybe some older proxies or browsers would ignore the ETag - is that right? But it is a bit more difficult to put hashes into the URLs of images referenced in CSS. So finally I guess I will go with the first option for images, and both the first and the second for CSS and JS.
What are your thoughts? Would the first alone be enough? Will every fairly modern client re-request the resource when it changes?
Concerning the second, you could use an Apache rewrite rule to accomplish that. I assume you could hash the file's modified/created date with whatever programming language you are using.
.htaccess
RewriteEngine on
RewriteBase /
RewriteRule ^css/[^/]+/style\.css$ css/style.css [L]
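On the generating side, a sketch of building a matching hashed URL in PHP (the function name and layout are illustrative; it hashes the file's modification time, as suggested above):
<?php
// Build /css/{hash}/style.css, where the hash changes with the file.
function hashedUrl(string $docRoot, string $name): string {
    return '/css/' . md5((string) filemtime($docRoot . '/css/' . $name)) . '/' . $name;
}
echo hashedUrl($_SERVER['DOCUMENT_ROOT'], 'style.css');
Because the rewrite rule maps any hash back to css/style.css, no files need to move on disk.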
From my experience, an ETag is usually respected by browsers/proxies. However, I've generally found that I needed to set Last-Modified as well. Also make sure that the ETag is formed correctly, as some browsers are picky about that.
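A minimal sketch of what that looks like when serving a file through PHP (illustrative only; a web server would normally handle conditional requests for static files itself):
<?php
// Emit ETag and Last-Modified, and answer conditional requests with 304.
$file = 'css/style.css';
$etag = '"' . md5_file($file) . '"';
header('ETag: ' . $etag);
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($file)) . ' GMT');
if (($_SERVER['HTTP_IF_NONE_MATCH'] ?? '') === $etag) {
    http_response_code(304);
    exit;
}
header('Content-Type: text/css');
readfile($file);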

Compressing CSS and JS without mod_gzip and mod_deflate

I would like to compress the CSS and JS files on my server to minimise load times. The problem:
My hosting is with Streamline.net (big mistake, never go there), who will not activate mod_gzip and mod_deflate due to security issues.
Does anyone have another way to compress these types of files (and image files too, if possible) without going the way of mod_gzip and mod_deflate?
Answers would be hugely welcome.
Yes, the answer is Minification.
Obviously, it will not compress as much as gzip or deflate. But it helps, and it is very easy to do with the right tools.
You can run your files through a script which would gzip them for you and add appropriate expiration headers.
Either set up an URL rewrite, or rewrite the URLs manually:
<script src="js/somescript.js"></script>
becomes
<script src="compress.php?somescript.js"></script>
and in compress.php, you can do something like
<?php
$file = 'js/' . basename($_SERVER['QUERY_STRING']);
if (file_exists($file)) {
    header('Last-Modified: ' . date('r', filemtime($file)));
    header('Content-Type: text/javascript'); // otherwise PHP sends text/html, which could confuse browsers
    ob_start('ob_gzhandler'); // compresses the output if the client accepts gzip
    readfile($file);
} else {
    header('HTTP/1.1 404 Not Found');
}
Obviously this can be extended to also provide HTTP caching, and/or on-the-fly minification, further speeding up your visitors' browsing.
Instead of getting mod_gzip to gzip your CSS and JavaScript files dynamically, you can gzip them yourself, then upload them.
This does introduce another step before you upload CSS and JavaScript, but it works, and maybe even saves a tiny bit of server processing time for each request compared to mod_gzip.
On Mac OS X, gzipping a file on the command line is as easy as, e.g.:
gzip -c styles.css > styles-gzip.css
Make sure these files get served with the right Content-Type and a Content-Encoding: gzip header, though, or the browser will not decompress them.
Just as a side note: compressing images is not beneficial if they are already saved in a compressed format at the maximum compression that still looks good to the user.
Most programming languages support data or file compression formats like zlib or gzip, so you could use a programming language to compress your files with one of these formats.
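For instance, PHP's built-in zlib support can do the same pre-compression as the command-line gzip shown above (a short sketch; 9 is the maximum compression level):
<?php
// Write a gzip-compressed copy of the stylesheet next to the original.
file_put_contents('styles.css.gz', gzencode(file_get_contents('styles.css'), 9));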
