Hosted JavaScript libraries

Besides the Google Libraries API, what other services are there for hosted JavaScript libraries?
Please only list trusted sources, not some unknown third party.

Microsoft's CDN
http://www.asp.net/ajaxlibrary/cdn.ashx

Before you go in search of hosted JavaScript libraries, you should consider the fact that any JavaScript you include in your web page runs within the context of your domain and can access any data rendered on the page or that the user can normally access on your domain. Using Google's hosted JavaScript is fine, but if it's some third party you've never heard of, you might want to think twice.
Perhaps it would be better to search for high-quality JavaScript libraries and download your own copy that you maintain within your domain on your own servers (and can audit for security purposes)?
Out of curiosity... what specific functionality are you looking for?

There's also Yahoo's YUI (http://developer.yahoo.com/yui/), though I believe they only host YUI itself. Make sure you pay attention to Michael Safyan's answer, too - who you're willing to trust to run code in your users' browsers should be a carefully made decision. Beyond that, if you're looking for generic JS hosting you should make sure you really need it - a minified version of jQuery or MooTools is incredibly tiny, and shouldn't make any real difference to either your server's CPU usage or bandwidth expenditure.
External hosting also doesn't meaningfully improve the maintainability of your HTML or JS, and it introduces another point of failure in your implementation.

Why do HTML5Boilerplate and others use a CDN for jQuery?

HTML5Boilerplate, and others[citation needed], load jQuery this way, as we all know:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>')</script>
Is this better for users? It's good practice to keep DNS lookups low, and unless we're also grabbing jQuery-UI or other frameworks from Google, then this is the only resource we get from their CDN. Would serving it from our own servers be faster?
Is this better for the server? Are we really saving that much by using Google's CDN for just this one relatively small file, rather than serving it ourselves?
Why just jQuery? Why just Google? HTML5Boilerplate includes normalize.css and modernizr.js, both of which are popular files (arguably more popular and more of a staple than jQuery[disputed]) and are available at http://cdnjs.com/ and at a number of other CDNs. If we're loading jQuery from a CDN, why not those two? Is jQuery loaded from a CDN because it's deferred to load at the bottom of the page, and therefore it's OK to wait an extra .1s to get it from a CDN? I know Google's CDN is a giant, but it's not unimaginable that other CDNs could handle a good amount of traffic.
Edit: Looking at Stack Overflow's code for this very page, they use their own CDN for 10+ resources, and then use Google for jQuery. There has to be a good reason for this, right?
Is this better for users?
Yes, especially for popular libraries, because they might already be cached in the user's browser. Google's servers are faster and more reliable than your server, and so are most CDNs.
Is this better for the server?
Well, you serve less data.
Why just jQuery? Why just Google?
Not just jQuery, not just Google. The point is to use a CDN when you can (and fall back to your server's version) to benefit from speed and caching. Normalize is very small, but I think you can still benefit from using a CDN. As for Modernizr, you want to be using a custom build tailored to your needs; that's the recommended way to use the library.
Part of the point of using a CDN is that users may already have the file in their cache, so they won't have to download it specifically for your website.
As for using Google's CDN: they are huge and have amazing infrastructure, so why not?
Also, I believe Google is affiliated with HTML5 Boilerplate (at least one of the maintainers works for Google).
When you pull jQuery from Google, there is a good chance it is already cached from a previous site, preventing an additional download.
We've discussed this question on the repo several times. This was the most recent. Lots of opinions, some testing, some super smart, super experienced people offering their 2 cents there. A fun, instructive discussion.
In general, anything you're wondering about with h5bp has been discussed in public somewhere.
To boil it down, the CDN is used because it's the best default configuration. If people download HTML5 Boilerplate and do nothing else to the code, having it linked to the Google CDN is the best default. It offers geographical optimization, fast servers, a cookieless domain and a chance of hitting the cache lottery.
It's also, by far, the most popular CDN, so if you're going to use a 3rd party service that's the one to use if you want a decent chance of getting a cached copy.
Again, there's a ton of detail in the issue if you have any other questions.

Security in embedded javascript and HTML

I'm trying to find a solution for the following situation:
I have a web application made of HTML, JavaScript, AJAX, and so on.
I want users to contribute to my application/website by creating plugins that will be embedded in it.
These plugins will be created using similar technologies (AJAX, HTML, etc.), so I need to allow plugins to run their own JavaScript code.
Each plugin will work in a page that contains some user information and the plugin itself (like the old FBML Facebook applications).
The problem is that this way the plugin can also make calls to get the user's information (since the plugin's code is embedded, its domain will be the same as the main website's, and the code will live entirely on my website).
So the question is: how can I avoid this and have precise control over what information a plugin can get about the user?
The plugins will not be reviewed and can be changed at any time, so reading all the plugin code is not a solution.
I'm open to any proposal, preferably easy and effective, and preferably one that doesn't put the whole plugin in an iframe.
--
EDIT:
How did Facebook do it when there was the old way to create applications? (Now it's iframe-only, but there was the FBML application approach; how did they make that secure?)
Have you ever heard of exploits allowing arbitrary code execution, one of the most dangerous classes of attack?
Well, in this case you are explicitly and willingly allowing arbitrary code execution, and there's almost no way for you to sandbox it.
1) You can run the "plugin" within an iframe from a different subdomain to sandbox it in there, as you've mentioned. This way the plugin can't reach your cookies or scripts.
Note that if you want the plugins to communicate with your services from this domain, that will be cross-domain communication. So you either need to resort to JSONP or use the newer cross-origin access control specification, CORS (i.e. return appropriate headers with your web service response: Access-Control-Allow-Origin: "plugins.domain.com"); see the sketch just after these options.
2) Create your own simple scripting language and expose as much as you want. This is obviously tedious, and even if you manage to do it, plugin developers will face a learning curve.
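Here is the sketch mentioned under option 1: a minimal cross-origin response, assuming a hypothetical Node.js/Express service (the endpoint path and plugin origin are illustrative placeholders, not part of the original setup):
const express = require('express');
const app = express();
app.get('/api/user-feed', (req, res) => {
  // Allow only the sandboxed plugin origin to read this response cross-domain
  res.set('Access-Control-Allow-Origin', 'https://plugins.domain.com');
  res.json({ items: [] }); // return only the data this plugin may see
});
app.listen(3000);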
Facebook had their own "JavaScript", coined FBJS, which did the sandboxing by controlling what could run.
Without a juicy backend, this really limits the impact of your script.
However, you still have to worry about DOM-based XSS and clickjacking.
It's 6 years later, but I feel it's important to provide a modern solution to this. The new(er) sandbox attribute can be used to limit the capabilities of an iframe.
A simple implementation of this system would grant only the allow-scripts permission to the iframe, perhaps with a simple JS file included along with each plugin containing a few custom library functions.
In order to communicate with your HTML page, you would use postMessage. On the plugin end, a library like the one mentioned above could be used to transfer commands. On the host side, another piece of code would have to validate and decode these requests, then execute them.
Since a sandboxed iframe doesn't have cross-origin capabilities, it cannot directly modify the page. However, this also means the origin of a postMessage can't be verified, so some sort of shared code would have to be agreed on for security reasons.
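A minimal sketch of that arrangement, assuming a hypothetical shared token and command name (none of these identifiers come from a real library):
<!-- Host page: the plugin may only run scripts inside its frame -->
<iframe sandbox="allow-scripts" src="plugin.html"></iframe>
<div id="plugin-output"></div>
<script>
  window.addEventListener('message', function (event) {
    var msg = event.data;
    // A sandboxed frame without allow-same-origin has an opaque origin
    // ("null"), so a shared token stands in for an event.origin check
    if (!msg || msg.token !== 'PLUGIN_TOKEN') return;
    if (msg.command === 'setText') {
      document.getElementById('plugin-output').textContent = String(msg.value);
    }
  });
</script>
Inside plugin.html, the plugin would then issue commands with something like:
parent.postMessage({ token: 'PLUGIN_TOKEN', command: 'setText', value: 'Hello' }, '*');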

Lightweight JS Library vs Google-hosted CDN

When page-load speed is the priority, is it better to use a minimal, lightweight JavaScript library (hosted on a CDN), or is it better to use something like jQuery, hosted on Google's CDN, which the browser more than likely has already cached?
Edit: What my question really boils down to is whether the cross-site caching effect of using jQuery hosted on Google's CDN outweighs the benefits of using an ultra-light library, also on a CDN.
jQuery is not heavy compared to other JavaScript libraries at present, considering the number of features and browsers it supports.
You should consider this factor when selecting plugins to use on the page, because they are written by various authors: some write them intelligently with this factor in mind, and some just write them for the sake of it.
Yes, if you use a CDN like Google's for jQuery, the library is most likely already cached by the browser, and Google has numerous servers distributed by location, so you don't have to worry about it.
Decreased Latency
A CDN distributes your static content across servers in various, diverse physical locations. When a user’s browser resolves the URL for these files, their download will automatically target the closest available server in the network.
In the case of Google’s AJAX Libraries CDN, what this means is that any users not physically near your server will be able to download jQuery faster than if you force them to download it from your arbitrarily located server.
There are a handful of CDN services comparable to Google’s, but it’s hard to beat the price of free! This benefit alone could decide the issue, but there’s even more.
Increased parallelism
To avoid needlessly overloading servers, browsers limit the number of connections that can be made simultaneously. Depending on which browser, this limit may be as low as two connections per hostname.
Using the Google AJAX Libraries CDN eliminates one request to your site, allowing more of your local content to download in parallel. It doesn't make a gigantic difference for users whose browser allows six concurrent connections, but for those still running a browser that only allows two, the difference is noticeable.
Better caching
Potentially the greatest benefit of using the Google AJAX Libraries CDN is that your users may not need to download jQuery at all.
No matter how well optimized your site is, if you’re hosting jQuery locally then your users must download it at least once. Each of your users probably already has dozens of identical copies of jQuery in their browser’s cache, but those copies of jQuery are ignored when they visit your site.
However, when a browser sees references to CDN-hosted copies of jQuery, it understands that all of those references refer to the exact same file. With all of these CDN references pointing to exactly the same URL, the browser can trust that those files truly are identical and won't waste time re-requesting the file if it's already cached. Thus, the browser is able to use a single copy that's cached on-disk, regardless of which site the CDN reference appears on.
This creates a potent "cross-site caching" effect which all sites using the CDN benefit from. Since Google's CDN serves the file with headers that attempt to cache the file for up to one year, this effect truly has amazing potential. With many thousands of the most trafficked sites on the Internet already using the Google CDN to serve jQuery, it's quite possible that many of your users will never make a single HTTP request for jQuery when they visit sites using the CDN.
Even if someone visits hundreds of sites using the same Google-hosted version of jQuery, they will only need to download it once!
It's better to use the library that best suits the needs of your application and your development team. A super-lightweight library might save you a few hundred milliseconds of load time, but may end up costing you in development hours if your team has significantly more experience with jQuery/MooTools/Dojo etc.
If new feature implementation and bug fixing is hindered by using a second-rate tool solely to improve load times, your users are ultimately going to suffer.

How does disqus work?

Does anyone know how disqus works?
It manages comments on a blog, but the comments are all held on third-party site. Seems like a neat use of cross-site communication.
The general pattern used is JSONP.
It's actually implemented in a fairly sophisticated way (at least on the jQuery site): they defer the loading of the disqus.js and thread.js files until the user scrolls to the comment section.
The thread.js file contains JSON content for the comments, which is rendered into the page after it's loaded.
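For reference, a bare-bones sketch of the JSONP pattern itself (the callback name and URL are made up for illustration):
<div id="comments"></div>
<script>
  // The page defines a global callback for the cross-domain data...
  function renderComments(data) {
    document.getElementById('comments').textContent = data.comments.length + ' comments';
  }
  // ...then injects a script tag; the response is executable JS of the
  // form renderComments({...}), which sidesteps the same-origin policy
  var s = document.createElement('script');
  s.src = 'https://comments.example.com/thread.js?callback=renderComments';
  document.body.appendChild(s);
</script>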
You have three options when adding Disqus commenting to a site:
Use one of the many integrated solutions (WordPress, Blogger, Tumblr, etc. are supported)
Use the universal JavaScript code
Write your own code to communicate with the Disqus API
The main advantage of the integrated solutions is that they're easy to set up. In the case of WordPress, for example, it's as easy as activating a plug-in.
Having the ability to communicate with the API directly is very useful, and offers two advantages over the other options. First, it gives you as the developer complete control over the markup. Secondly, you're able to process comments server-side, which may be preferable.
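As a rough sketch of that third option, fetching a thread's posts server-side might look something like this (the endpoint and parameter names are from memory and should be checked against the current Disqus API documentation; the key and shortname are placeholders):
// Rough sketch: list a thread's posts via the Disqus REST API
const https = require('https');
const url = 'https://disqus.com/api/3.0/threads/listPosts.json'
  + '?api_key=YOUR_PUBLIC_API_KEY'                // placeholder credential
  + '&forum=YOUR_FORUM_SHORTNAME'                 // placeholder forum
  + '&thread=link:https://example.com/blog/post'; // thread looked up by URL
https.get(url, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    console.log(JSON.parse(body).response); // array of comment objects
  });
});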
It looks like it's done using the easyXDM library, which uses the best available way for the current browser to communicate with another site.
Quoting Anton Kovalyov (a former engineer at Disqus), whose answer to the same question on a different site was really helpful to me:
Disqus is a third-party JavaScript application that runs in your browser and injects itself on publishers' websites. These publishers need to install a small snippet of JavaScript code that makes the first request to our servers and loads initial JavaScript loader. This loader then creates all necessary iframe elements, gets the data from our servers, renders templates and injects the result into some element on the page.
As you can probably guess there are quite a few different technologies supporting what seems like a simple operation. On the back-end you have to run and scale a gigantic web application that serves millions of requests (mostly read). We use Python, Django, PostgreSQL and Redis (for our realtime service).
On the front-end you have to minimize your payload, make sure your app is super fast and that it doesn't break in extremely hostile environments (you will be surprised how screwed up publisher websites can be). Cross-domain communication—ability to send messages from hosting website to your servers—can be tricky as well.
Unfortunately, it is impossible to explain how everything works in a comment on Quora, or even in an article. So if you're interested in the back-end side of Disqus just learn how to write, run and operate highly-scalable websites and you'll be golden. And if you're interested in the front-end side, Ben Vinegar and myself (both front-end engineers at Disqus) wrote a book on the topic called Third-party JavaScript (http://thirdpartyjs.com/).
I'm planning to read the book he mentioned; I guess it will be quite helpful.
Here's also a link to the official answer to this question on the Disqus site.
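For context, the universal JavaScript code follows the loader-snippet pattern Kovalyov describes; roughly like this (the shortname is a placeholder, and the real embed code differs in its details):
<div id="disqus_thread"></div>
<script>
  // Loader snippet: inject embed.js asynchronously; the loader then builds
  // the iframes and renders the comment thread into #disqus_thread
  var d = document, s = d.createElement('script');
  s.src = 'https://EXAMPLE-SHORTNAME.disqus.com/embed.js';
  s.async = true;
  (d.head || d.body).appendChild(s);
</script>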
Short answer? AJAX. You get your own URL, e.g. "site.com/?comments=ID", included via JavaScript... but with real-time updates like that you would need a polling server.
I think they keep the content on their site, and your site only sends and receives the data to/from Disqus. Now I wonder what happens if you decide that you want to bring your commenting in-house without losing all existing comments. How easily could you get to your data, I wonder? They claim that the data belongs to you, but they have control over it, and there is not much explanation on their site about this.
I often leave comments on the Disqus platform. Sometimes a comment seems to be removed once you refresh, and sometimes it's not. I think the ones that disappear are held for moderation without it saying so.

Should I link to Google API's cloud for JS libraries?

I'm looking for the pros/cons of pulling jQuery & other JS libraries from Google API's cloud as opposed to downloading files and deploying directly.
What say you?
My decision
The likelihood of the library already being cached on the user's system is the overriding factor for me, so I'm going with a permalink to googleapis.com (e.g. ajax.googleapis.com/ajax/libs/…). I agree with others here that loss of access to the Google server cloud is a minimal concern.
Con
Users in countries embargoed by the U.S. (e.g. Iran) won't get a response from Google
Pros: It may already be cached on the user's system. Google has big pipes. You don't pay for the bandwidth.
Cons: You now have two different ways for your site to become unavailable: A service interruption on your server or one on Google's server.
I've been looking at the real-world performance of the Google loader for jQuery, particularly, and here's what I've found:
Google's servers are quick and plenty reliable.
They are serving from a CDN, which means if you have a lot of overseas users they'll get much better load times.
They are not serving gzipped files. So they're serving a lot more bytes than they need to.
If you know what you're doing in Apache, Lighttpd, or whatever you're serving files with, you could set your cache headers just like Google's and significantly reduce the amount of data your end user has to download by serving it from your own server. You could also combine your scripts at that point and reduce your overall HTTP requests.
Bottom line: Google's performance is good but not great. If you have many many overseas users then Google is probably better, if your users are mostly US-based and maximum performance is your concern, learn about caching, Etags, gzipping, etc. and serve it yourself.
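If you go the self-hosted route, here is a minimal sketch of those server-side tweaks, assuming a Node.js server with the express and compression packages (the paths and cache lifetime are illustrative):
const express = require('express');
const compression = require('compression');
const app = express();
app.use(compression()); // gzip responses, addressing the ungzipped-files point above
// Far-future cache lifetime plus ETags, mirroring CDN-style headers
app.use('/js', express.static('public/js', { maxAge: '365d', etag: true }));
app.listen(8080);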
Pros:
Google's connectivity is probably way better than yours
It's a free CDN (content distribution network)
Your webapp might load faster, since you're using a CDN
Cons:
If/when you need to optimize by repackaging a subset of that third-party JS library, you're on your own, and your webapp might then load slower
In addition to the points made by others, I'll note two more cons:
An additional external HTTP request, so assuming you have a JavaScript file of your own (almost certain), that's a minimum of two requests instead of one; and
IMHO, because the jQuery load is async, your entire page can render before the library has loaded, so the effects you apply on document ready are sometimes visibly noticeable to the user when they kick in. I don't think this is a great user experience.
The pros are quite obvious and are in the other answers:
you save bandwidth
Google is probably more reliable than your server
probably cached in most browsers (any stats on this?)
But the cons can be very tricky:
If your site is served over https and you link to the plain http:// version of the script, most browsers will complain about mixed content. Make sure you reference Google's copy over https (or with a protocol-relative URL) so the page stays fully secure.
I think what would be cool to do is run A/B tests and see what the latency is to load the minified version of jQuery from Google's servers vs. your server. Hopefully that'll put things into perspective. Chances are the Google server is faster, but in terms of accepting responsibility for downtime, nothing beats hosting it yourself.
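A crude client-side timing probe along those lines might look like this (the local URL is a placeholder; a real A/B test would aggregate results across many users):
<script>
  // Load the same library from two origins and log how long each takes
  function timeScript(src, label) {
    var start = performance.now();
    var s = document.createElement('script');
    s.src = src;
    s.onload = function () {
      console.log(label + ': ' + (performance.now() - start).toFixed(1) + ' ms');
    };
    document.head.appendChild(s);
  }
  timeScript('//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js', 'Google CDN');
  timeScript('/js/vendor/jquery-1.10.2.min.js', 'self-hosted');
</script>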
Pro:
Google's Ajaxlibs offer very fine-grained "version control" for the included libraries. You can pin a certain version (e.g. jQuery 1.3.2) or automatically request the latest version from a certain branch (e.g. the jQuery 1.3 series -> would currently deliver 1.3.2, but maybe soon 1.3.3).
The latter definitely has benefits: you'll profit from small bug fixes and performance improvements without breaking your scripts/plugins.
Maintaining such a multi-library repository on your own can become quite resource-intensive.
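Concretely, the two styles differ only in the version segment of the URL (historically, the branch form resolved to the newest matching release, at the cost of much shorter cache lifetimes):
<!-- Pinned to an exact, immutable release -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<!-- Branch version: served the latest 1.3.x at the time of the request -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js"></script>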
Con:
If you're afraid of DNS poisoning, or that some public wireless network might not be trusted, note that the non-SSL versions might then not actually be served by Google at all, opening the door to drive-by installation of malware. (But: caching is set to a full year, so even though many browsers will issue an If-Modified-Since request for cached content when hitting refresh, this might still be a theoretical issue, as most users will already have cached the resources while using another network.)
If you take extreme care with your visitors' privacy, you might not want Google to be able to record visits to your site through their CDN. (Quite theoretical as well, as the same note on caching applies.)
