Get fresh copies of HTML, CSS and JavaScript resources - javascript

I have an ASP.NET MVC 5 application that includes HTML, CSS, and JavaScript files. When I update the HTML, CSS, or JavaScript and deploy it to the server, I have to clear the cache on my local PC to see the change.
This is a common question, but the answers I have found are not straightforward.
So I would like to know: can I include a function that refreshes the existing HTML, CSS, and JS files using a JavaScript approach, so that when the application loads it gets fresh copies of the HTML, CSS, and JS files? This would be like clearing the browser cache using JavaScript.
If this cannot be achieved, is there any other alternative?

Since you're using MVC 5, this can automatically be handled for JS and CSS for you, if you use bundling and minification.
The request ...
for the bundle ... contains a query string pair....
The query string ... has a value token that is a unique identifier
used for caching. As long as the bundle doesn't change, the ASP.NET
application will request the ... bundle using this token. If
any file in the bundle changes, the ASP.NET optimization framework
will generate a new token, guaranteeing that browser requests for the
bundle will get the latest bundle.
Essentially, the ASP.NET application will handle the caching of JS and CSS for you.
You can read more about how this works and how to implement it on the MSDN article for Bundling and Minification.
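For example, a typical setup (just a sketch; the bundle names and file paths below are placeholders, not taken from your project) registers the bundles at application start:

// App_Start/BundleConfig.cs -- called from Global.asax via
// BundleConfig.RegisterBundles(BundleTable.Bundles);
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Placeholder bundle names and file paths -- adjust to your project.
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/site.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/bootstrap.css",
            "~/Content/site.css"));

        // Optional: force bundling/minification even when compilation debug="true",
        // so the versioned ?v=... token is always appended to the bundle URL.
        BundleTable.EnableOptimizations = true;
    }
}

In the Razor layout, @Styles.Render("~/Content/css") and @Scripts.Render("~/bundles/site") then emit the bundle URLs with the cache-busting ?v= token.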
I'm not sure what you mean by caching HTML. If you're using a .cshtml file, it shouldn't be cached by default as far as I know. However, you can specify on your controller how long to cache the results of an action for, such as in this post:
You can use the OutputCacheAttribute to control server and/or browser
caching for specific actions or all actions in a controller.
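For instance, here is a minimal sketch (assuming a controller action that returns the HTML you are worried about) that disables browser and proxy caching for that action entirely:

using System.Web.Mvc;

public class HomeController : Controller
{
    // Prevent the browser and any proxies from caching this action's HTML.
    [OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")]
    public ActionResult Index()
    {
        return View();
    }
}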
If you provide more detail on how the HTML is being cached, I might be able to provide a more helpful answer.

Related

How to generate PDF from .tex in browser, using memoir document class?

I'm writing a custom data editor as a single-page web app, which currently does not require any server-side code to function.
The app generates a .tex file as output. The generated file uses the memoir document class and does some complex formatting that is hard to reproduce outside of a *TeX ecosystem.
I would like to let users download PDFs, not .tex files.
I would prefer to generate these PDFs in the browser, client-side, but I would settle for server-side generation.
I tried texlive.js, but it lacks memoir. As for the server side, I would like to avoid setting up a TeX-to-PDF pipeline from scratch; I feel that it would be a security nightmare.
Any advice?
Basically you just have to compile your own version of texlive.js.
The instructions are here. Just add the memoir package.
Server-side renderers do exist. However, I would recommend auto-generating a fresh VM for every run, the way CI tools do.

Prevent circumventing ASP.NET minification

I've got some ASP.NET that I'm deploying as an Azure cloud service. The javascript files have comments in them that I'd like not to be visible to anyone consuming the JS. I'm taking advantage of ASP.NET bundling and minification:
http://www.asp.net/mvc/overview/performance/bundling-and-minification
This seems like a nice solution in that it removes all comments during the minification process. But I can't count on users not pointing their browsers directly at the individual, original JS files. I'm trying to figure out how to prevent users from pulling the JS files directly (forcing them to pull only a bundle), in order to prevent them from viewing the comments. Is there a way to implement a blacklist of files that can't be downloaded? If not, I was thinking of adding a series of random characters to the name of each JS file. Lastly, if that doesn't seem like a good idea, I would investigate injecting something into the VS build process to strip comments on publish.
Any thoughts would be welcome.
You can use the BlockViewHandler in a web.config file in the folder your JS is in. Explicitly whitelist any files that are OK to download, then block the rest.
There's an example in this question:
Where to put view-specific javascript files in an ASP.NET MVC application?
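As a sketch, the web.config placed in that scripts folder might look something like this (allowed.js is just a placeholder for a file you do want served; the whitelist entries must appear before the catch-all block):

<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <handlers>
      <!-- Whitelist: files listed here are still served directly. -->
      <add name="AllowedScript" path="allowed.js" verb="GET"
           preCondition="integratedMode" type="System.Web.StaticFileHandler" />
      <!-- Everything else in this folder returns a 404. -->
      <add name="BlockViewHandler" path="*" verb="*"
           preCondition="integratedMode" type="System.Web.HttpNotFoundHandler" />
    </handlers>
  </system.webServer>
</configuration>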
I think you can modify your deployment process: upload only the minified JS files to your production server, but upload everything to your test/dev server.

Loading external JavaScript into a Rails application

I want to use a CDN to load in bootstrap and jquery in an attempt to improve site performance. With performance in mind, which of the following is the best way of doing this:
1. Add in a script tag directly into a html or layout file
<script src="//netdna.bootstrapcdn.com/bootstrap/3.0.2/js/bootstrap.min.js"></script>
2. Dynamically load the content into the middle of the asset pipeline as discussed by Daniel Kehoe here under 'Dynamic Loading'.
I assume that any file served from a link or repository outside our own code base introduces an availability concern.
Here the Bootstrap JS file will always depend on the speed of the netdna domain's servers. A server outage or failure will affect the performance as well as the reliability of our system. Such things won't happen frequently, but the chance is there.
In my experience, the best approach is to keep the same file on our own server in compressed form to avoid such issues, and to update that file at regular intervals as new releases come out.
Reduce DNS Lookups
According to Yahoo! Developer Network Blog, it takes about 20-120
milliseconds for DNS (Domain Name System) to resolve IP address for a
given hostname or domain name and the browser cannot do anything until
the process is properly completed.
Merge Multiple Javascripts Into One
You can combine multiple JavaScript files, for example:
http://www.example.com/javascript/prototype.js
http://www.example.com/javascript/builder.js
http://www.example.com/javascript/effects.js
http://www.example.com/javascript/dragdrop.js
http://www.example.com/javascript/slider.js
Into a single file by changing the URL to:
http://www.example.com/javascript/prototype.js,builder.js,effects.js,dragdrop.js,slider.js
Compress Javascript / CSS
There are also some web services that allow you to manually compress your JavaScript and CSS files online. Here are a few we know of:
compressor.ebiene (Javascript, CSS)
javascriptcompressor.com (Javascript)
jscompress.com (Javascript)
CleanCSS (CSS)
CSS Optimizer (CSS)

What are those cache.js and compilation-mappings files

Recently I received a package containing a web page. Inside (besides the normal HTML and JS files) there are some JS files. They look like this:
4A3674A3247236B3C8294D2378462378.cache.js
FE728493278423748230C48234782347.cache.js
compilation-mappings.txt
Inside the .js files I see JavaScript that is obfuscated or minified. Inside compilation-mappings.txt the cache.js files are referenced. Are these files generated by some kind of web IDE? Unfortunately I have no way to get information about how this web page was developed.
That is a web project coded in Java and compiled to JS using the GWT project tools.
The GWT compiler does a lot of the work you would otherwise have to do manually when coding JS by hand, and some other tasks which are almost impossible in a normal JS project: obfuscation, compression, dead-code removal, different optimizations per browser, renaming of the scripts, code splitting, etc.
What you have in your app is the result of this compilation:
First you should have a unique index.html file, because GWT is used to produce RIA (Rich Internet Applications) also known as SPI (Single Page Interface).
That HTML file should have a reference to a JavaScript file named application_name.nocache.js. Note the .nocache. part, meaning that the web server should set the appropriate headers so that it is not cached by proxies or browsers. This file is very small because it just has the code to identify the browser and ask for the next JavaScript file.
This first script knows which NNNN.cache.js file each browser has to load. The NNNN prefix is a unique number which is generated when the app is compiled, and it is different for each browser. GWT supports 6 different browser platforms, so normally you would have 6 files like this. Note the .cache. part of the name, meaning that these files can be cached forever. They are large files because they contain all the code of your application.
So the normal workflow of your app is that the browser asks for the index.html file, which can be cached. This file has the script tag to get the small start script application.nocache.js, which should always be requested from the server. That script has just the code for loading the most recent permutation for your browser, NNNN.cache.js, which will be downloaded and cached in your browser forever.
You have more info about this stuff here
The goal of this naming convention is that the next time the user visits the app, the index.html and NNNN.cache.js files will already be in the cache, so only the application.nocache.js file, which is really small, is requested. It guarantees that the user always loads the most recent version of the app, that the browser downloads the code of your app only once, that proxies or caching devices do not break your app when you release a new version, etc.
That said, it is almost impossible to figure out what the code does by inspecting the JavaScript because of the heavy obfuscation. You need the original .java files to understand the code or make modifications.
I can't say for sure, but often a string is attached to the name of a JavaScript file so that when a new version is deployed, clients will not use a cached copy of the old one.
(i.e., if you have myScript.js and change it, the browser will say "I already have myScript.js, I don't need it." If it goes from being myScript1234.js to myScript1235.js, the browser will go and fetch it.)
It is possible the framework in use generated those files as part of its scheme for handling client-side cache issues. Though without knowing more details of which framework they used, there's no way of knowing for sure.

How to handle javascript & css files across a site?

I have had some thoughts recently on how to handle shared javascript and css files across a web application.
In the web application I am currently working on, I have quite a large number of different JavaScript and CSS files placed in a folder on the server. Some of the files are reused, while others are not.
On a production site, it's quite wasteful to make a high number of HTTP requests and load many kilobytes of unnecessary JavaScript and redundant CSS. The solution, of course, is to create one big bundled file per page that contains only the necessary code, which is then minified and sent compressed (GZIP) to the client.
Creating a bundle of JavaScript files and minifying them manually is no trouble if you only have to do it once, but since the app is continuously maintained and things change and develop, it soon becomes a headache to do this manually while pushing out updates that change the JavaScript and/or CSS files in production.
What's a good approach to handle this? How do you handle this in your application?
I built a library, Combres, that does exactly that, i.e. minify, combine, etc. It also automatically detects changes to both local and remote JS/CSS files and pushes the latest version to the browser. It's free and open-source. Check this article out for an introduction to Combres.
I am dealing with the exact same issue on a site I am launching.
I recently found out about a project named SquishIt (see on GitHub). It is built for the ASP.NET framework. If you aren't using ASP.NET, you can still learn from the principles behind what he's doing.
SquishIt allows you to create named "bundles" of files and then to render those combined and minified file bundles throughout the site.
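From memory, usage looks roughly like this (the file paths are placeholders; Render writes the combined, minified file and returns the tag that references it, with # replaced by a content hash for cache busting):

using SquishIt.Framework;

// Typically invoked from a layout/view; the returned strings are the
// <script> and <link> tags pointing at the generated bundle files.
var scriptTag = Bundle.JavaScript()
    .Add("~/Scripts/jquery.js")      // placeholder paths
    .Add("~/Scripts/site.js")
    .Render("~/Scripts/combined_#.js");

var cssTag = Bundle.Css()
    .Add("~/Content/site.css")
    .Render("~/Content/combined_#.css");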
CSS files can be categorized and partitioned into logical parts (like common, print, etc.), and then you can use CSS's @import feature to load those files. Reusing these small files also makes client-side caching possible.
When it comes to JavaScript, I think you can solve this problem on the server side: multiple script files are added to the page, and you can also generate the script file dynamically on the server, but for client-side caching to work these parts should have distinct, static addresses.
I wrote an ASP.NET handler some time ago that combines, compresses/minifies, gzips, and caches the raw CSS and Javascript source code files on demand. To bring in three CSS files, for example, it would look like this in the markup...
<link rel="stylesheet" type="text/css"
href="/getcss.axd?files=main;theme2;contact" />
The getcss.axd handler reads the query string and determines which files it needs to read in and minify (in this case, it would look for files called main.css, theme2.css, and contact.css). When it's done reading in the files and compressing them, it stores the big minified string in the server-side cache (RAM) for a few hours. It always looks in the cache first so that on subsequent requests it does not have to re-compress.
I love this solution because...
It reduces the number of requests as much as possible
No additional steps are required for deployment
It is very easy to maintain
The only downside is that all the style/script code will eventually be stored in server memory. But RAM is so cheap nowadays that it is not as big of a deal as it used to be.
Also, one thing worth mentioning: make sure that the query string is not susceptible to any harmful path manipulation (only allow A-Z and 0-9).
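A minimal sketch of such a handler (not the author's original code; it assumes the CSS files live under ~/Content, uses a naive regex "minifier", skips the gzip step for brevity, and applies the character whitelist mentioned above):

using System;
using System.IO;
using System.Text;
using System.Text.RegularExpressions;
using System.Web;
using System.Web.Caching;

// Register in web.config for path="getcss.axd" in the handlers section.
public class GetCssHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Whitelist: letters, digits and the ';' separator only,
        // to avoid any path manipulation via the query string.
        string files = context.Request.QueryString["files"] ?? "";
        if (!Regex.IsMatch(files, @"^[A-Za-z0-9;]+$"))
        {
            context.Response.StatusCode = 400;
            return;
        }

        string cacheKey = "getcss:" + files;
        string css = context.Cache[cacheKey] as string;
        if (css == null)
        {
            var sb = new StringBuilder();
            foreach (string name in files.Split(';'))
            {
                string path = context.Server.MapPath("~/Content/" + name + ".css");
                if (File.Exists(path))
                    sb.AppendLine(File.ReadAllText(path));
            }

            // Naive minification: strip comments and collapse whitespace.
            css = Regex.Replace(sb.ToString(), @"/\*.*?\*/", "", RegexOptions.Singleline);
            css = Regex.Replace(css, @"\s+", " ");

            // Keep the combined result in server memory for a few hours.
            context.Cache.Insert(cacheKey, css, null,
                DateTime.UtcNow.AddHours(4), Cache.NoSlidingExpiration);
        }

        context.Response.ContentType = "text/css";
        context.Response.Write(css);
    }
}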
What you are talking about is called minification.
There are many libraries and helpers for different platforms and languages to help with this. As you did not post what you are using, I can't really point you towards something more relevant to you.
Here is one project on google code - minify.
Here is an example of a .NET Http handler that does all of this on the fly.
