I've seen other approaches that attach a version number or MD5 hash to a JS src querystring.
e.g. <script src='/script/v1/'></script>
However, my JavaScript is still getting cached in multiple browsers (Chrome, Firefox) when I push a new version of my site.
This seems like a major problem that others have solved, and I seem to be doing the right things. How can I get this to work?
I added log messages and determined that the querystring method is working. Sorry for the unnecessary question.
However, in researching, I found some important points worth mentioning:
One of the articles suggests using a querystring with the current time appended. You probably don't want to follow this suggestion as your files will never be cached. Using source control version numbers or an MD5 hash would be better.
Steve Souders (of High Performance Web Sites fame) notes that certain web proxies never cache anything with a querystring. Thus, the version number should be embedded within the path to the file in order to ensure that your files are cached appropriately when accessed through these proxies. ( http://www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/ )
The file will still always be cached. However, by using a version number (or any other varying string) in the URL, a new version will be downloaded and used whenever that string changes, and the previously cached copy will simply be ignored.
http://thecrmgrid.wordpress.com/2007/10/22/prevent-caching-of-javascript-include-files-during-development/
http://davidwalsh.name/prevent-cache
1.) Make sure the response headers for the JavaScript files are correct and include Expires, Cache-Control, etc.
2.) You probably have to append the version not as a query parameter but as part of the filename, e.g. page_v.2.js. You could change the JavaScript filenames at build time, for example if you are using Java. That is what I have done.
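If you happen to have a Node-based build, here is a minimal sketch of that kind of build step (the file names and paths are made up for illustration); it copies the script to a name that embeds a content hash, so the version lives in the path rather than the querystring:

// build-rename.js - hypothetical build step: embed a content hash in the filename
const crypto = require('crypto');
const fs = require('fs');

const source = 'page.js';                                    // assumed source file
const contents = fs.readFileSync(source);
const hash = crypto.createHash('md5').update(contents).digest('hex').slice(0, 8);
const versioned = 'page.' + hash + '.js';                    // e.g. page.3f2a9c1b.js

fs.copyFileSync(source, versioned);
console.log('Reference <script src="/js/' + versioned + '"></script> in your pages');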
Related
We have a site where we're using zxcvbn by Dropbox to inform users of their password strength; however, we've been getting occasional reports that it doesn't work.
It turns out that these users (reasonably rare) are accessing our website from a workplace with a strict corporate firewall policy. Because the JS file contains swearwords and NSFW words (used to mark a password as insecure if it contains these commonly used words), the whole JS file is being blocked from loading.
The rest of our site loads fine, including other JS files.
How could we encrypt or minify this JS file to the point where it doesn't get blocked for having "bad" words in the request, but can still be successfully decrypted on the client side to actually do its job and detect unsafe passwords?
This JS Fiddle will (sort of) demonstrate the problem: https://jsfiddle.net/0cgap96m/3/
<script src="https://cdnjs.cloudflare.com/ajax/libs/zxcvbn/4.4.2/zxcvbn.js" integrity="sha512-TZlMGFY9xKj38t/5m2FzJ+RM/aD5alMHDe26p0mYUMoCF5G7ibfHUQILq0qQPV3wlsnCwL+TPRNK4vIWGLOkUQ==" crossorigin="anonymous" referrerpolicy="no-referrer"></script>
<div id="test">
</div>
window.onload = function(){
var name = prompt("Put in a fake password to test?");
var passwordStrength = zxcvbn(name);
document.getElementById('test').innerHTML = JSON.stringify(passwordStrength);
};
That should work fine normally - now try blocking https://cdnjs.cloudflare.com/ajax/libs/zxcvbn/4.4.2/zxcvbn.js using an adblocker or something, and it'll obviously start failing. This is essentially what's happening for the users, but it's blocked by their corporate firewall rather than a local adblocker.
To confound the filter, you can try substituting the literal characters with JavaScript's syntax for the unicode representation of those characters.
This works even with identifiers!
var f\u006F\u006F = 'b\u0061\u0072';
console.log(foo); // outputs: bar
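If you need to produce those escapes for more than a couple of characters, a small helper can generate them. This is just a rough sketch (the function name is made up, and it only handles characters in the Basic Multilingual Plane):

// Convert each character of a string to its \uXXXX escape form (BMP only)
function toUnicodeEscapes(str) {
  return Array.from(str)
    .map(ch => '\\u' + ch.charCodeAt(0).toString(16).padStart(4, '0'))
    .join('');
}
console.log(toUnicodeEscapes('bar')); // \u0062\u0061\u0072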
You could download the built JS file and alter the passwords list, splitting up the string in the middle of the NSFW words. Then serve your copy of the library instead.
In zxcvbn.js the insecure words are defined like this (shortened here for this example)
var frequency_lists;frequency_lists=
{passwords:"123456,password,eatshit,goodluck,starcraft"}
So, by doing this:
var frequency_lists;frequency_lists=
{passwords:"123456,password,eatsh" + "it,goodluck,starcraft"}
a firewall scanning for swear words shouldn't recognize that as a swear anymore.
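If you would rather not hand-edit the list, a small script could insert those breaks for you. A rough sketch (the helper name and flagged-word list are illustrative, not part of zxcvbn):

// Insert a string-concatenation break inside each flagged word so the literal
// word never appears contiguously in the generated file
function breakUpFlaggedWords(listString, flaggedWords) {
  let out = listString;
  for (const word of flaggedWords) {
    const mid = Math.ceil(word.length / 2);
    out = out.split(word).join(word.slice(0, mid) + '" + "' + word.slice(mid));
  }
  return '"' + out + '"';
}
console.log(breakUpFlaggedWords('123456,password,eatshit,goodluck,starcraft', ['eatshit']));
// "123456,password,eats" + "hit,goodluck,starcraft"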
EDIT:
A PR to their repo to have their build emit this format might be a better solution, with the additional benefit of solving the issue for anyone else using this library, as well as allowing you to update to newer versions. But from quickly looking at the GitHub repo, I see you'd need to be familiar with CoffeeScript + Python. The original solution is much quicker and doesn't require knowledge of other languages.
How about simple error handling client-side and proper validation server-side?
Actually, you don't even need the validation, but sending the typed/submitted password to be evaluated server-side when the client-side check is unavailable may cover all the bases you need.
And if you need validation, well, you should have it server-side too anyway, right?
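A rough sketch of that fallback on the client side (the /check-password endpoint is hypothetical; your server would run its own strength check behind it):

// If the zxcvbn script was blocked, fall back to a server-side strength check
async function checkPasswordStrength(password) {
  if (typeof zxcvbn === 'function') {
    return zxcvbn(password);                         // normal client-side path
  }
  const response = await fetch('/check-password', {  // hypothetical endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ password })
  });
  return response.json();                            // server evaluates the strength instead
}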
I have a large 40,000-word array loading from a database into a JavaScript/HTML array on every page of our web application... What would be the best way/technology to optimize this and avoid these unnecessary downloads?
Somehow keep the array in a cookie and read from there?
Use Ajax to dynamically load only the parts of the array that are needed?
What is the common practice?
On modern browsers you can use sessionStorage to have it persist during the current session, or localStorage to have it hang around between sessions.
NB: both only permit storage of strings - you'll have to serialise the array (e.g. into JSON) and deserialise it on retrieval.
If you want to actually use the word list as a local database with efficient lookup, you might also want to investigate IndexedDB.
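A minimal sketch of the localStorage approach, assuming the list is served as JSON from a URL such as /words.json (the URL and key name are made up):

// Load the word list once, then reuse the cached copy on later page loads
async function getWordList() {
  const cached = localStorage.getItem('wordList');
  if (cached) {
    return JSON.parse(cached);                              // deserialise the stored string
  }
  const words = await (await fetch('/words.json')).json();  // assumed endpoint
  localStorage.setItem('wordList', JSON.stringify(words));  // serialise for storage
  return words;
}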
You can place the data in the session and retrieve it there; the same data can then be used on every page without fetching it each time.
Thanks & best regards.
If you need all the 40k words in all pages then you can use localStorage or sessionStorage. Just keep in mind sessionStorage will delete saved data when the tab/window is closed so the whole array will be downloaded again when the website is opened in new windows/tabs.
If you need only specific parts of the array on different pages, I would organize the array's elements into taxonomies/categories (if you are able to), so that you can download only what is needed for a specific section of your application, as sketched below.
Whether this is feasible depends on the composition of your array: whether it contains only words or complex objects. This will help avoid a slow load of your website when it is visited for the first time.
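A quick sketch of what downloading only one category might look like (the URL scheme and category name are illustrative):

// Fetch just the slice of words needed for the current section of the app
fetch('/words/animals.json')
  .then(response => response.json())
  .then(words => {
    console.log(words.length + ' words loaded for this section');
  });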
If the array is always the same (there is no need to update it), I'd create a js file and then I'd add it to every html page. The browser's cache would do the rest to avoid unnecessary re-loading. Something like:
big-array.js file:
var myBigArray=[...]
In each html file
<html>
... whatever you need
<script src="/my-path/big-array.js"></script>
...my other scripts here
</html>
It's a bit difficult to answer this question properly as to do so would require more information about your hosting environment and what you have access to. If you have a server side language available, such as PHP, you could look at caching which is generally the most efficient way to handle data that is used repeatedly across pages. Perhaps you could post more info about what technologies you have available to you?
I am working with ASP.NET MVC. I've noticed that when calling controller actions that return a view via JavaScript, the HTML markup returned is not minified - it includes whitespace etc. The response size is therefore larger than it should be.
Is there a way to minify the response when calling a controller action from JavaScript?
You might want to look into creating a custom filter to be applied to responses that you want to minify. A technique for this is given in this answer or in this blog post, though you will need to be sure that your implementation of the minification (removing whitespace) does not inadvertently mess up your code (for example, if you have a JavaScript comment, removing all newline characters can result in all of the following JavaScript being included in the comment, per this comment).
To this end, it may be worthwhile to use the C# port of Google's htmlcompressor library as a guide for minifying your html.
Of course, you can also just turn on gzip compression on the web server (as Justin points out in the comment below), and get the benefits of compressed output without the headache of implementing (and maintaining) what I detail above.
Note: this may not be worth the effort. A few extra spaces and newline characters in the file that is being sent down the wire will probably not amount to very much space. Even if you save a few KB (which may not even be the case), the increase in performance will most likely not be noticeable. You will however notice that when you try to look at the source of your html in order to debug any issues that you have on the client side, it will be extremely hard to read (spaces and new lines are pretty important for readability).
I've got a web app which gets a couple dozen items at boot. All these items are JSON and are smaller than 1 kB.
Now there are a number of storage options as seen in the Question.
I was thinking of just storing these objects inside a variable in the browser JS. I don't really see why I would want to use any of these browser storage options.
So what would be reasons to use any of the browser-based storage options instead of a variable inside JS?
It could be that from a certain data size it is preferable to use browser storage, e.g. from 100 kB onwards it's better not to use a JS variable.
var myModel = {}
NOTE
Every time the user enters the app he will get fresh content from the server. The content is too realtime for caching.
localStorage, globalStorage and sessionStorage:
These features are available in browsers that have implemented "Web Storage". They all behave like a kind of HashMap, a map between string keys and string values, but their lifetimes differ: once the active page is closed, sessionStorage is cleared, whereas localStorage is permanent. (MDN DOM Storage guide)
One point about globalStorage: it has been obsolete since Gecko 1.9.1 (Firefox 3.5) and unsupported since Gecko 13 (Firefox 13); since then we should use localStorage. The difference between the two was just the HTML5 scope support (scheme + hostname + non-standard port).
These could be useful for you to:
- Share your objects between the different pages of your site.
- Work offline.
- Cache large objects.
- Or whenever you need local persistent storage.
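For instance, the lifetime difference between sessionStorage and localStorage looks like this in practice (the key names are arbitrary):

// Both stores only accept strings, so objects must be serialised
sessionStorage.setItem('draft', JSON.stringify({ step: 2 }));        // cleared when the tab closes
localStorage.setItem('settings', JSON.stringify({ theme: 'dark' })); // survives browser restarts
const settings = JSON.parse(localStorage.getItem('settings'));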
IndexedDB:
IndexedDB is useful for applications that store a large amount of data (for example, a catalog of DVDs in a lending library) and applications that don't need persistent internet connectivity to work (for example, mail clients, to-do lists, and notepads)
Based on this quote from MDN you can easily figure out your answer regarding IndexedDB. If you don't know whether IndexedDB is useful for you or not, just answer these questions:
Do you store a large amount of data on the client? If yes, consider using it.
Does your app need to work offline? If yes, consider using IndexedDB.
Does your app need persistent internet connectivity? If yes, IndexedDB still remains an option, depending on the other factors.
So offline use is out, since as far as I can tell you don't need it, because as you said:
The content is too realtime for caching.
Beyond that, these options offer features like sharing objects between pages and managing large amounts of data; you are the one who should decide whether you need them.
localStorage and sessionStorage are solving a caching problem; think of them as cookies. You've said you don't want caching, so you can ignore them.
JavaScript objects behave basically like O(1) lookup tables (see How is a JavaScript hash map implemented?, and make sure you read both of the top two answers, as both have something useful to say), and there is no maximum memory limit that I am aware of, nor a point where another solution becomes a better choice.
The only reason I can think of that you should bother with the extra step of inserting the data in an IndexedDB is if you need O(1) lookups on a field that is not the object key you are using.
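A rough sketch of that case (the database, store, and index names are made up):

// Open a database with an index on a non-key field, then query by that field
const open = indexedDB.open('appData', 1);
open.onupgradeneeded = (event) => {
  const db = event.target.result;
  const store = db.createObjectStore('items', { keyPath: 'id' });
  store.createIndex('byCategory', 'category', { unique: false });   // secondary lookup field
};
open.onsuccess = (event) => {
  const db = event.target.result;
  const index = db.transaction('items', 'readonly')
                  .objectStore('items')
                  .index('byCategory');
  index.getAll('news').onsuccess = (e) => console.log(e.target.result);
};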
I have a PHP application that makes extensive use of Javascript on the client side. I have a simple system on the PHP side for providing translators an easy way to provide new languages. But there are cases where javascript needs to display language elements to the user (maybe an OK or cancel button or "loading" or something).
With PHP, I just have a text file that is cached on the server side which contains phrase codes on one side and their translation on the other. A translator just needs to replace the english with their own language and send me the translated version which I integrate into the application.
I want something similar on the client side. It occurred to me to have a javascript include that is just a set of translated constants but then every page load is downloading a potentially large file most of which is unnecessary.
Has anyone had to deal with this? If so, what was your solution?
EDIT: To be clear, I'm not referring to "on-the-fly" translations here. The translations have already been prepared and are ready to go, I just need them to be made available to the client in an efficient way.
How about serving the JavaScript from PHP? So instead of having:
<script type='text/javascript' src='jsscript.js'></script>
do
<script type='text/javascript' src='jsscript.php'></script>
And then in the php file replace all outputted text with their associated constants.
Be sure to output the correct caching headers from within PHP code.
EDIT
These are the headers that I use:
header('Content-type: text/javascript');
header('Cache-Control: public');
header('Expires: ' . date("r", time() + (7 * 24 * 60 * 60))); // 1 week
header("Pragma: public");
I usually load the appropriate language values as a JavaScript object in a separate file which the rest of my code can reference:
var messages = {
"loading": "Chargement"
}
alert(messages.loading);
The language library will be cached on the client side after the first load and you can improve load efficiency by splitting values into separate files that are loaded based on context: e.g. a small library for public operations, an additional one behind a login, etc.
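A sketch of picking the right file at runtime (the path scheme and locale value are illustrative):

// Load only the message file for the user's locale and the current context
function loadMessages(locale, context) {
  const script = document.createElement('script');
  script.src = '/i18n/' + context + '.' + locale + '.js';   // e.g. /i18n/public.fr.js
  document.head.appendChild(script);
}
loadMessages('fr', 'public');  // the loaded file defines the `messages` object used above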
What you are looking for seems to be AJAX (client and server exchanging asynchronous requests using JavaScript).
If you're looking for something ready-made, take a peek at the Google Translation AJAX APIs.
I have never personally had to do this, but my first guess would be to reference a third-party library. I believe the Google AJAX API might have just the tool to do this, and since the library is hosted by Google, the JavaScript file will not be downloaded from your site.
Try digging through the following site for a bit: http://code.google.com/apis/ajaxlanguage/documentation/
Let me know if that helps you out. I might be interested in implementing something similar for the website I manage.
I like Pim Jager's answer, and have done that myself. If you don't want to have the PHP generate the JavaScript for you (say, for performance reasons), you can have a master copy of the JavaScript libs, and when they change, run a translation program to generate a version of each lib for each language. Then just have the PHP put the right version for the current user in the script tag it sends.
I worked on a system that needed both heavy localization and heavy branding for different customers. What we did was anything that got sent to the screen had a unique macro, like [3027] (or something like that). Then we had a bunch of locale and branding files that had entries for each macro code, and the text to substitute for each macro. A program would loop through all the source files and all the languages and make the substitutions.
We found we also needed some functions for localization, for monetary amounts, dates, times, etc. It all worked pretty well.
We found we needed one more important thing: A tool to go through all the language files and make sure they all had all the necessary codes. Big time saver.
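A rough sketch of such a checker in Node, assuming one JSON file per language in a lang/ directory with en.json as the reference (the layout and names are assumptions):

// Compare every language file against the reference file and report missing phrase codes
const fs = require('fs');

const reference = JSON.parse(fs.readFileSync('lang/en.json', 'utf8'));
for (const file of fs.readdirSync('lang')) {
  const phrases = JSON.parse(fs.readFileSync('lang/' + file, 'utf8'));
  const missing = Object.keys(reference).filter(code => !(code in phrases));
  if (missing.length) {
    console.log(file + ' is missing: ' + missing.join(', '));
  }
}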