Detect spoofing of JavaScript files

Let's say I'm developing a bunch of JS widgets that are intended to be embedded on any webpage (similar to iGoogle or Pageflakes widgets).
The client can embed the widgets by including a script tag:
<div id="widgetHost">
<script src="http://fantasticwidgets.net/awesomeWidget.js"></script>
<script src="http://fantasticwidgets.net/awesomeWidgetAgain.js"></script>
</div>
Now these widgets rely on common libraries (let's say jQuery, Underscore, and some of my own, e.g. myCommon.js).
Ideally, this is what should happen:
The widget's bootstrapper JS is downloaded first
The bootstrapper checks whether the required versions of these libraries are already loaded on the page (say, whether jQuery v1.6.2 is present, myCommon v1.1, etc.)
If any of them are already loaded, don't request them again; download only the missing scripts (see the sketch after this list)
These scripts then call some web service, do some magic, and render HTML on the page
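For illustration, the check in steps 2 and 3 might look something like this (a sketch; the myCommon global, its version property, and the library URLs are assumptions, and real code would have to wait for each script's load event before using it):

function loadScript(url) {
    var s = document.createElement('script');
    s.src = url;
    document.getElementsByTagName('head')[0].appendChild(s);
}

// Only fetch what is missing. Dynamically injected scripts load
// asynchronously, so real code must sequence on their load events.
if (!(window.jQuery && window.jQuery.fn.jquery === '1.6.2')) {
    loadScript('http://fantasticwidgets.net/lib/jquery-1.6.2.min.js');
}
if (!(window.myCommon && window.myCommon.version === '1.1')) {
    loadScript('http://fantasticwidgets.net/lib/myCommon-1.1.js');
}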
Concern: Checking for already-loaded files introduces a script-spoofing vulnerability. A malicious user could spoof the libraries once they are loaded and use them to steal sensitive info or do other bad things.
Solution: Do not check for loaded libs; always send all of them again. This is still not bulletproof, but it at least makes things a little harder, as the attacker has to spoof them again.
However, this wastes bandwidth and increases loading time.
Question: Is it possible to detect whether the loaded files have been tampered with, preferably on the client side? Or does it have to involve a server-side solution? I have ASP.NET running on the server side, if that matters.

The only real solution here is "use HTTPS to deliver the scripts." If the bad guy can go so far as to poison the user's browser cache with HTTPS content from another domain, it's already game over for you, because he would also have the power to change the page you deliver to the user.
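For example, the embed from the question simply becomes (assuming fantasticwidgets.net serves the files with a valid certificate):

<div id="widgetHost">
<script src="https://fantasticwidgets.net/awesomeWidget.js"></script>
<script src="https://fantasticwidgets.net/awesomeWidgetAgain.js"></script>
</div>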

Related

Force caching of certain Javascript Library files (ie react.min.js, etc.) between pages?

Is it possible to force caching of certain JavaScript library files (i.e. react.min.js, etc.) when navigating between pages of a website that isn't a SPA?
I'm trying to look at the feasibility of a more componentized structure while not going full-on SPA. The website I'm working on often has people visit a single page and then leave, but in cases where they do stick around, I don't want them to have to reload each and every library on every page load.
Background You Should Understand
There are literally thousands of articles on the web about this topic, but here is a very good summary from MakeUseOf's Everything You Need to Know About the Browser Cache.
The browser cache is a temporary storage location on your computer for files downloaded by your browser to display websites. Files that are cached locally include any documents that make up a website, such as html files, CSS style sheets, JavaScript scripts, as well as graphic images and other multimedia content.
When you revisit a website, the browser checks which content was updated in the meantime and only downloads updated files or what is not already stored in the cache. This reduces bandwidth usage on both the user and server side and allows the page to load faster. Hence, the cache is especially useful when you have a slow or limited Internet connection.
TL;DR
I don't know if you're really looking for a way to force the browser to cache your files or if you just misunderstood how the cache works. In general, the visitor's browser is the one that makes that decision and handles everything for you. If it sees that a needed resource was already accessed in the past, it won't request it again; it'll just use its cache. So no, your libraries will not get reloaded over and over. Just once.
Now, if you really do need to force the browser to cache your files, take a look at the answer(s) to Caching a jquery ajax response in JavaScript/browser. That should put you on a good path to a solution.
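For example, when pulling in a script with jQuery, caching is disabled by default for dataType: 'script' and can be re-enabled explicitly (a sketch, assuming jQuery is on the page; the URL is a placeholder):

// By default jQuery appends a cache-busting timestamp to script requests;
// cache: true turns that off so the browser cache can do its job.
$.ajax({
    url: '/js/react.min.js', // placeholder path
    dataType: 'script',
    cache: true
});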

Calling bootstrap's css and js files from CDN or from root file

There are two ways to call Bootstrap's CSS and JS files.
From CDN
From root file
My question is: which way is the best way to work?
This question is essentially "Should I use a CDN?".
The answer boils down to the following factors.
You will need an internet connection all the time, even to test your code.
If you're on the move and don't have an internet connection, or one of your clients doesn't have one when you're demoing your code, then you're in trouble if you're using a CDN.
The CDN will most probably be faster.
CDNs are designed for the sole purpose of serving files, fast. Most of the time, there are several mirrors serving the content, so files can be served fast to users around the world. Also, if you host it from your own domain, you might also include several cookies every time you serve the file, while the CDN will not.
The CDN might go offline.
This is obvious, but it's a concern nonetheless. Check the reputation of the CDN you're planning to use.
Your bandwidth usage will be minimised if you use the CDN.
If you host your site up somewhere, and your host has a limit on the amount of data it will transfer for you, commonly called 'bandwidth', then a CDN will help, since it won't count against your usage.
Keeping these factors in mind, the final choice is yours. However, for JS at least, I recommend using a CDN with a local fallback. An example done using jQuery can be found at https://stackoverflow.com/a/1014251/2141058.
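The usual shape of that pattern is (a sketch along the lines of the linked answer; the local path is a placeholder):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script>
    // If the CDN failed, window.jQuery is undefined; load a local copy.
    window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>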
The CDN serves all the files for you and will probably be faster than hosting them yourself. However, all your clients must have an internet connection. If your clients will have one (if you are just writing a normal website), you should probably use the CDN.
If your clients might not have an internet connection (if the site is internal to a company network), you should serve the files yourself, as your clients won't be able to reach the CDN via the internet.
A foundation of a Content Delivery Network is that a different URL/IP address than the primary domain of your website can receive client requests and respond with resources.
This means the client's browser can, and will, make requests for resources from different URLs/IPs in parallel. So when you have a JS file on your website, say www.mywebsite.com/site.js, the client's browser requests this resource and queues up the next one. If the next resource comes from a CDN, i.e. a different URL/IP, the client's browser can make that request to a different server in parallel. So if you are requesting cdn.bootstrap.com/bootstrap3-min.js, that resource can be downloaded to the client without waiting for www.mywebsite.com/site.js to finish downloading.
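In other words (hypothetical URLs, for illustration):

<!-- Served from your own domain: -->
<script src="http://www.mywebsite.com/site.js"></script>
<!-- Served from the CDN's domain, so it can download in parallel: -->
<script src="http://cdn.bootstrap.com/bootstrap3-min.js"></script>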
So the overall website load time will be faster. But will it be perceivably faster to a human? If this is a website built for mobile devices, then you may well want a CDN, because you need to optimize load times wherever you can. If the primary use of this website is as a desktop website, with mild traffic, in a geographic region where decent broadband connections are commonplace, it is not really worth having a CDN unless it is your own CDN server.
The Bootstrap CDN cannot be relied on; it is not part of your infrastructure. You should not trust that those who maintain it will always have it up when people are visiting your website. Sure, most likely the CDN will be up and there will never be an issue, but if the Bootstrap CDN were to go down and your website didn't look right, or didn't even work... you don't have the level of control you should have over the situation.
Also, using a CDN on a different domain together with IE9/IE10 and iframes can cause issues; I wrote about this in my blog here:
http://www.sandiegosoftware.net/blog/script5-access-is-denied-error-in-internet-explorer-loading-jquery/
The best rule of thumb: use your own CDN server on a subdomain of the primary website domain, or don't use a CDN at all.

any way to auto-execute custom js lib function on external websites?

I was wondering if there's any way to attach a js lib to an external webpage after the page has loaded?
To provide a simple example, could I load www.google.com into IE and somehow display the webpage with a green scroll bar?
I would like this to happen automatically on each page load, instead of having to trigger the process manually every time.
I am assuming that you are talking from a web developer's point of view.
I don't think it is possible without any hacks.
This would also be a huge security risk, because loading JavaScript code on an external website means the code can potentially do anything on behalf of the user. It can capture keystrokes, take screenshots, note down passwords, and do a lot of other illegal stuff.
So instead of this, you can create a browser extension (add-on), which has to be installed with the user's permission (and knowledge) and can run code on any page (if the user allows it).
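For example, a minimal Chrome extension that turns scrollbars green on every page might look like this (a sketch; manifest formats and extension APIs differ between browsers):

manifest.json:
{
    "manifest_version": 2,
    "name": "Green Scrollbars",
    "version": "1.0",
    "content_scripts": [{
        "matches": ["<all_urls>"],
        "js": ["content.js"]
    }]
}

content.js:
// Runs automatically after every page load, without user interaction.
var style = document.createElement('style');
style.textContent = '::-webkit-scrollbar-thumb { background: green; }';
document.head.appendChild(style);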

How can I ensure that the latest version of my javascript code is loaded for the client?

We have a client with thousands of users (who all use Internet Explorer) and a large number of JavaScript files that enhance their user experience with our product.
The problem I'm having is that any time we update one of these scripts there is no way to know whether the client is seeing the latest version. What we're having to do is tell our client to do a hard refresh (ctrl+f5) before viewing any changes. Obviously this approach is not ideal.
I know that browsers cache based on the URL, so one could use something like
<script src='myScript.js?ver=1.2'>
to get around the issue, but this is not an option for us.
I was hoping that there's some kind of header property or something similar that we could use to tell IE not to cache these scripts.
Any ideas?
You can also version the filename itself like jQuery does:
<script src='myScript-v1-2.js'>
Then, each time you revise the script, you bump the version number and modify the pages that include it to point to the new file name. This is foolproof against caching, yet still allows your viewers to receive the maximum benefit of caching, and it requires no server configuration changes for the .js file.
A full solution will typically include setting a relatively short cache lifetime for your host web page and then allowing the various resources (stylesheet files, JS files, images, etc.) to have longer cache lifetimes for maximum caching. Anything that is fingerprinted can have a very long cache lifetime. See the reference that fabianhjr posted for ways to set the cache lifetime of the host web page. It can be done in the web page itself (<meta> settings) or in the HTTP headers via the server.
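For instance, response headers along these lines (values purely illustrative):

Cache-Control: max-age=300        <-- host page: revalidate after five minutes
Cache-Control: max-age=31536000   <-- fingerprinted myScript-v1-2.js: cache for a year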
If you turn off caching for your script file (which would likely have to be done at the web-server level), then all your viewers lose the performance benefit of caching, and you lose the bandwidth- and load-saving benefits too. If you use a common .js file across many pages (a common design pattern), your viewers will see slower performance on every page.
Everything you need to know about cache http://www.mnot.net/cache_docs/
http://www.mnot.net/cache_docs/#CACHE-CONTROL <-- HTTP Headers

How can I give users a javascript widget to pull content securely from my site

I need to allow content from our site to be embedded in other users' web sites.
The content will be chargeable, so I need to keep it secure, but one of the requirements is that the subscribing web site only needs to drop some JavaScript into their page.
It looks like the only way to secure our content is to check that the URL of the page hosting our JavaScript matches the subscribing site. Is there any other way to do this, given that we don't know the client browsers that will be hitting the subscribing sites?
Is the best way to do this to supply a JavaScript include file that populates a known page element when the page loads? I'm thinking of using jQuery, so the include file would first pull in jQuery (checking if it's already loaded and using some sort of namespace protection), then populate the given element on page load.
I'd like to include a stylesheet as well if possible to style the element but I'm not sure if I can load this along with the javascript.
Does this sound like a reasonable approach? Is there anything else I should consider?
Thanks in advance,
Mike
It looks like the only way to secure our content is to check that the URL of the page hosting our JavaScript matches the subscribing site.
Ah, but in client-side or server-side code?
They both have their disadvantages. Doing it with server-side code is unreliable, because some browsers won't pass a Referer header at all, and if you want to stop caches from keeping a copy of the script (which would let the Referer check be bypassed), you have to serve it with no-cache or Vary: Referer headers, which hurts performance.
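As a rough illustration of the server-side variant, here is a minimal sketch (written for Node purely for illustration; the licensed domain is a placeholder, and a real implementation would look the customer up properly):

// Serve the widget script only when the Referer matches a licensed
// domain, and tell caches that the response varies by Referer.
var http = require('http');
var url = require('url');

http.createServer(function (req, res) {
    var referer = req.headers['referer'] || '';
    var host = url.parse(referer).hostname || '';

    res.setHeader('Vary', 'Referer');
    if (host === 'customersite.foo') { // placeholder licensed domain
        res.setHeader('Content-Type', 'application/javascript');
        res.end('/* main body of script */');
    } else {
        res.statusCode = 403; // note: some browsers send no Referer at all
        res.end();
    }
}).listen(8080);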
On the other hand, with client-side checks in the script you return, you can't be sure the environment you're running in hasn't been sabotaged. For example, if your inclusion script tag was like:
<script src="http://include.example.com/includescript?myid=123"></script>
and your server-side script looked up 123 as being the ID for a customer using the domain customersite.foo, it might respond with the script:
if (location.host.slice(-16) === 'customersite.foo') {
    // main body of script
} else {
    alert('Sorry, this site is not licensed to include content from example.com');
}
Which seems simple enough, except that the including site might have replaced String.prototype.slice with a function that always returns customersite.foo. Or various other functions used in the body of the script might be suspect.
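For instance, a hostile host page only has to run this before your script arrives:

// Every subsequent slice() call now lies to the licensing check.
String.prototype.slice = function () { return 'customersite.foo'; };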
Including a <script> from another security context cuts both ways: the including-site has to trust the source-site not to do anything bad in their security context like steal end-user passwords or replace the page with a big goatse; but equally, the source-site's code is only a guest in the including-site's potentially-maliciously-customised security context. So a measure of trust must exist between the two parties wherever one site includes script from another; the domain-checking will never be a 100% foolproof security mechanism.
I'd like to include a stylesheet as well if possible to style the element but I'm not sure if I can load this along with the javascript.
You can certainly add stylesheet elements to the document's head element, but you would need some strong namespacing to ensure it didn't interfere with other page styles. You might prefer to use inline styles for simplicity and to avoid specificity-interference from the page's main style sheet.
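Adding the stylesheet from your include script is straightforward (a sketch; the href is a placeholder):

// Append a <link> element so the stylesheet loads alongside the script.
var link = document.createElement('link');
link.rel = 'stylesheet';
link.href = 'http://include.example.com/widget.css'; // placeholder URL
document.getElementsByTagName('head')[0].appendChild(link);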
It really depends on whether you want your generated content to be part of the host page (in which case you might prefer to let the including site decide what styles it wants for it) or whether you want it to stand alone, unaffected by context (in which case you would probably be better off putting your content in an <iframe> with its own styles).
I'm thinking of using jquery so the include file would first call in jquery
I would try to avoid pulling jQuery into the host page. Even with noconflict there are ways it can conflict with other scripts that are not expecting it to be present, especially complex scripts like other frameworks. Running two frameworks on the same page is a recipe for weird errors.
(If you took the <iframe> route, on the other hand, you get your own scripting context to play with, so it wouldn't be a problem there.)
You can store the user's domain and a key within your local database. Alternatively, the key can be an encrypted version of the domain, which saves you from having to do a database lookup. Either one of these can determine whether you should respond to the request.
If the request is valid, you can send your data back out to the user. This data can indeed load in jQuery and an additional CSS reference.
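One way to derive the key from the domain (a sketch using Node's crypto module; the secret and domain are placeholders, and an HMAC stands in for the "encrypted domain" idea):

// The key issued to a subscriber is an HMAC of their domain, so the
// server can validate requests without a database lookup.
var crypto = require('crypto');

function keyForDomain(domain) {
    return crypto.createHmac('sha256', 'server-side-secret') // placeholder secret
                 .update(domain)
                 .digest('hex');
}

// Issue keyForDomain('customersite.foo') to the subscriber once; on each
// request, recompute it from the claimed domain and compare.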
Related:
How to load up CSS files using Javascript?
check if jquery has been loaded, then load it if false
