Where do you include the jQuery library from? Google JSAPI? CDN? - javascript

There are a few ways to include jQuery and jQuery UI and I'm wondering what people are using?
Google JSAPI
jQuery's site
your own site/server
another CDN
I have recently been using Google JSAPI, but have found that it takes a long time to set up an SSL connection, or even just to resolve google.com. I have been using the following for Google:
<script src="https://www.google.com/jsapi"></script>
<script>
google.load('jquery', '1.3.1');
</script>
I like the idea of using Google so it's cached when visiting other sites and to save bandwidth from our server, but if it keeps being the slow portion of the site, I may change the include.
What do you use? Have you had any issues?
Edit: Just visited jQuery's site and they use the following method:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js"></script>
Edit2: Here's how I've been including jQuery without any problems for the last year:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js"></script>
The difference is the removal of http:. By removing this, you don't need to worry about switching between http and https.

Without a doubt I choose to have jQuery served by the Google API servers. I didn't go with the jsapi method since I don't leverage any other Google APIs; however, if that ever changed then I would consider it...
First: The Google API servers are distributed across the world instead of my single server location: closer servers usually mean faster response times for the visitor.
Second: Many people choose to have jQuery hosted on Google, so when a visitor comes to my site they may already have the jQuery script in their local cache. Pre-cached content usually means faster load times for the visitor.
Third: My web hosting company charges me for the bandwidth used. No sense consuming 18k per user session if the visitor can get the same file elsewhere.
I understand that I place a portion of trust on Google to serve the correct script file, and to be online and available. Up to this point I haven't been disappointed with using Google and will continue this configuration until it makes sense not to.
One thing worth pointing out... If you have a mixture of secure and insecure pages on your site you might want to dynamically change the Google source to avoid the usual warning you see when loading insecure content in a secure page:
Here's what I came up with:
<script type="text/javascript">
document.write([
"\<script src='",
("https:" == document.location.protocol) ? "https://" : "http://",
"ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js' type='text/javascript'>\<\/script>"
].join(''));
</script>
UPDATE 9/8/2010 -
Some suggestions have been made to reduce the complexity of the code by removing the http: and https: prefixes and simply using the following protocol-relative syntax:
<script type="text/javascript">
document.write("\<script src='//ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js' type='text/javascript'>\<\/script>");
</script>
In addition, you could also change the URL to reflect just the jQuery major version number if you wanted to make sure the latest major release of the jQuery library was loaded:
<script type="text/javascript">
document.write("\<script src='//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js' type='text/javascript'>\<\/script>");
</script>
Finally, if you don't want to use Google and would prefer jQuery's own host, you could use the following source path (keep in mind that jQuery's host doesn't support SSL connections):
<script type="text/javascript">
document.write("\<script src='http://code.jquery.com/jquery-latest.min.js' type='text/javascript'>\<\/script>");
</script>

One reason you might want to host it on an external server is to work around browsers' limits on concurrent connections to a particular server.
However, given that the jQuery file you are using will likely not change very often, the browser cache will kick in and make that point moot for the most part.
A second reason to host it on an external server is to lower the traffic to your own server.
However, given the size of jQuery, chances are it will be a small part of your traffic. You should probably try to optimize your actual content.

jQuery 1.3.1 min is only 18k in size. I don't think that's too much of a hit to ask on the initial page load. It'll be cached after that. As a result, I host it myself.

If you want to use Google, the direct link may be more responsive. Each library has the path listed for the direct file. This is the jQuery path
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.1/jquery.min.js"></script>
Just reread your question: is there a reason you are using https? This is the script tag Google lists in their example:
<script src="http://www.google.com/jsapi"></script>

I wouldn't want any public site that I developed to depend on any external site, and thus, I'd host jQuery myself.
Are you willing to have an outage on your site when the other site (Google, jquery.com, etc.) goes down? Fewer dependencies is the key.

Pros: hosting on Google has benefits:
Probably faster (their servers are more optimised)
They handle the caching correctly - 1 year (we struggle to be allowed to make the changes to get the headers right on our servers)
Users who have already had a link to the Google-hosted version on another domain already have the file in their cache
Cons:
Some browsers may treat it as a cross-domain (XSS-like) request and disallow the file.
Particularly users running the NoScript plugin for Firefox
I wonder if you can include it from Google, then check for the presence of some global variable (or some such), and if it's absent, load it from your own server?
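You can: that is exactly the CDN-with-local-fallback pattern. A minimal sketch (my addition, not from the original answer; the local path is hypothetical):
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js"></script>
<script>
// If the CDN copy failed to load, window.jQuery is still undefined,
// so write out a script tag pointing at our own copy instead.
window.jQuery || document.write("\<script src='/js/jquery-1.4.3.min.js'>\<\/script>");
</script>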

There are a few issues here. Firstly, the async load method you specified:
<script type="text/javascript" src="https://www.google.com/jsapi"></script>
<script type="text/javascript">
google.load('jquery', '1.3.1');
google.setOnLoadCallback(function() {
// do stuff
});
</script>
has a couple of issues. Script tags pause the page load while they are retrieved (if necessary). Now if they're slow to load this is bad, but jQuery is small. The real problem with the above method is that because the jquery.js load happens asynchronously, many pages will finish loading and render before jQuery has loaded, so any jQuery styling you do will be a visible change for the user.
The other way is:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.1/jquery.min.js"></script>
Try some simple examples: say, a simple table whose cell backgrounds you change to yellow, using the setOnLoadCallback() method vs. $(document).ready() with a static jquery.min.js load. The second method will have no noticeable flicker. The first will. Personally, I think that's not a good user experience.
As an example run this:
<html>
<head>
<title>Layout</title>
<style type="text/css">
.odd { background-color: yellow; }
</style>
</head>
<body>
<table>
<tr><th>One</th><th>Two</th></tr>
<tr><td>Three</td><td>Four</td></tr>
<tr><td>Five</td><td>Six</td></tr>
<tr><td>Seven</td><td>Eight</td></tr>
<tr><td>Nine</td><td>Ten</td></tr>
</table>
<script src="http://www.google.com/jsapi"></script>
<script>
google.load("jquery", "1.3.1");
google.setOnLoadCallback(function() {
$(function() {
$("tr:odd").addClass("odd");
});
});
</script>
</body>
</html>
You (should) see the table appear and then the rows go yellow.
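For comparison, here is the second method as a complete page (a sketch of the static-load approach described above, not part of the original answer): jquery.min.js is fetched synchronously in the head, so the ready handler runs as soon as the DOM is parsed and the rows are already yellow by the time the table is painted.
<html>
<head>
<title>Layout</title>
<style type="text/css">
.odd { background-color: yellow; }
</style>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.1/jquery.min.js"></script>
<script>
$(function() {
// Runs at DOMContentLoaded, before the user sees the unstyled table.
$("tr:odd").addClass("odd");
});
</script>
</head>
<body>
<table>
<tr><th>One</th><th>Two</th></tr>
<tr><td>Three</td><td>Four</td></tr>
<tr><td>Five</td><td>Six</td></tr>
<tr><td>Seven</td><td>Eight</td></tr>
<tr><td>Nine</td><td>Ten</td></tr>
</table>
</body>
</html>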
The second problem with the google.load() method is that it only hosts a limited range of files. This is a problem for jQuery, since it is extremely plug-in dependent. If you try to include a jQuery plugin with a <script src="..."> tag alongside google.load(), the plug-in will probably fail with "jQuery is not defined" messages, because jQuery hasn't loaded yet. I don't really see a way around this.
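To illustrate the plug-in problem (my sketch; the plug-in path is hypothetical):
<script src="http://www.google.com/jsapi"></script>
<script>google.load("jquery", "1.3.1");</script>
<!-- This executes immediately, but jQuery hasn't been fetched yet, -->
<!-- so the plug-in dies with "jQuery is not defined". -->
<script src="/js/jquery.someplugin.js"></script>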
The third problem (with either method) is that they are one external load. Assuming you have some plugins and your own Javascript code you're up to a minimum of two external requests to load your Javascript. You're probably better off getting jquery, all relevant plug-ins and your own code and putting it into one minified file.
From Should You Use Google's Ajax Libraries API for Hosting?:
As to load times, you're actually loading two scripts - the jsapi script and the mootools script (the compressed version from above). So that is two connections, rather than one. In my experience, I found that the load time was actually 2-3 times slower than loading from my own personal shared server, even though it was coming from Google, and my version of the compressed file was a couple of K larger than Google's. This, even after the file had loaded and (presumably) been cached. So for me, since the bandwidth doesn't matter much, it isn't going to matter.
Lastly you have the potential problem of a paranoid browser flagging the request as some sort of XSS attempt. It's not typically a problem with default settings but on corporate networks where the user may not have control over which browser they use let alone the security settings you may have a problem.
So in the end I can't really see myself using the Google AJAX API, for jQuery at least (the more "complete" APIs are a different story in some ways), except to post examples here.

In addition to the people who advise hosting it on your own server, I'd propose keeping it on a separate domain (e.g. static.website.com) to allow browsers to load it in a connection separate from other content. This tip also works for all static stuff, say images and CSS.
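For example (a sketch using the placeholder domain from above):
<script src="http://static.website.com/js/jquery.min.js"></script>
<link rel="stylesheet" href="http://static.website.com/css/style.css" />
<img src="http://static.website.com/img/logo.png" alt="logo" />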

I have to vote -1 for the libraries hosted on Google. They are collecting data, Google Analytics style, with their wrappers around these libraries. At a minimum, I don't want a client browser doing more than I'm asking it to do, much less anything else on the page. At worst, this is Google's "new version" of not being evil -- using unobtrusive JavaScript to gather more usage data.
Note: if they've changed this practice, super. But the last time I considered using their hosted libraries, I monitored the outbound http traffic on my site, and the periodic calls out to google servers were not something I expected to see.

I might be old-school about this, but I still frown on hotlinking. Maybe Google is the exception, but in general, it's really just good manners to host the files on your own server.

I will add this as a reason to locally host these files.
Recently a node in Southern California on TWC has not been able to resolve the ajax.googleapis.com domain (for IPv4 users only), so we are not getting the external files. This was intermittent up until yesterday (now it is persistent). Because it was intermittent, I was having tons of problems troubleshooting SaaS user issues. I spent countless hours trying to track down why some users were having no issues with the software while others were tanking. In my usual debugging process I'm not in the habit of asking a user if they have IPv6 turned off.
I stumbled on the issue because I myself was using this particular "route" to the file and am also using only IPv4. I discovered the issue when developer tools told me jQuery wasn't loading, then started doing traceroutes etc. to find the real issue.
After this, I will most likely never go back to externally hosted files, because Google doesn't have to go down for this to become a problem, and any one of these nodes can be compromised with DNS hijacking and deliver malicious JS instead of the actual file. I always thought I was safe in that a Google domain would never go down; now I know any node between a user and the host can be a point of failure.

I just include the latest version from the jQuery site: http://code.jquery.com/jquery-latest.pack.js. It suits my needs and I never have to worry about updating.
EDIT: For a major web app, certainly control it; download it and serve it yourself. But for my personal site, I could not care less. Things don't magically disappear; they are usually deprecated first. I keep up with it enough to know what to change for future releases.

Here are some useful resources that may help you choose your CDN.
Microsoft has recently added a new domain for delivering libraries through their CDN.
Old format: http://ajax.microsoft.com/ajax/jQuery/jquery-1.5.1.js
New format: http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.5.1.js
This way requests should no longer send all the cookies for microsoft.com.
http://www.asp.net/ajaxlibrary/cdn.ashx#Using_jQuery_from_the_CDN_11
And here are some statistics about the most popular technologies used across the web:
http://trends.builtwith.com/
Hope this helps you choose.

If I am responsible for the 'live' site, I'd better be aware of everything that goes on and into my site. For that reason I host the jquery-min version myself, either on the same server or on a static/external server, but in either case in a location where only I (or my program/proxy) can update the library, after having verified/tested every change.

In head:
(function() {
// Create an async script element for the Google jsapi loader.
var jsapi = document.createElement('script'); jsapi.type = 'text/javascript'; jsapi.async = true;
jsapi.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'www.google.com/jsapi?key=YOUR KEY';
// Fall back to <body> if no <head> element exists.
(document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(jsapi);
})();
End of Body:
<script type="text/javascript">
google.load("jquery", "version");
</script>
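One caveat worth adding (my note, not the original answer's): the loader in the head is injected with async = true, so google may not be defined yet when the end-of-body script runs. A defensive sketch (the callback option is an assumption about invoking load() after the initial page parse):
<script type="text/javascript">
(function waitForJsapi() {
// Poll until the asynchronously injected jsapi loader has arrived.
if (window.google && google.load) {
google.load("jquery", "1.4.3", { callback: function() {
// jQuery is available from this point on.
}});
} else {
setTimeout(waitForJsapi, 50);
}
})();
</script>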

I host it with my other JS files on my own server and, that's the point, combine and minify them (with django-compressor, here, but that's not the point) to be served as just one JS file, with everything the site needs put into it. You'll need to serve your own JS files anyway, so I see no reason not to add the extra jQuery bytes there too - a few more KB are much cheaper to transfer than additional requests. You are not dependent on anyone, and as soon as your minified JS is cached, you're super fast as well.
On first load, a CDN-based solution might win, because you must load the additional jQuery kilobytes from your own server (but without an additional request). I doubt the difference is noticeable, though. And on a first load with a cleared cache, your own hosted solution will probably always be much faster, because fetching the CDN jQuery needs more requests (and DNS lookups).
I wonder how this point is almost never mentioned, and how CDNs seem to take over the world :)

Related

javascript doesn't work in Cordova project

I have a web project which works fine on the web. I want to transfer it into a PhoneGap Windows Phone project.
Everything works fine except the search option: whenever I click the search option it shows nothing, just the message "We are having trouble to display this message". N.B.: this search option works properly on the web.
Here is my search code:
<script src="https://code.jquery.com/jquery-2.1.1.min.js"></script>
<script src="js/materialize.min.js"></script>
<script src="js/init.js"></script>
<script>
var c=getCatalogue();
var bestNew=getBestNew();
$("#recherche").click(function(){
var v=$("#search").val();
window.localStorage.setItem("search",v);
if(v!="") routePage("recherche.html?search="+v);
});
</script>
I think the problem is when I pass the value to another page, that is "search="+v.
When I use if(v!="") routePage("recherche.html"); instead of if(v!="") routePage("recherche.html?search="+v); then it works.
Try downloading your JavaScript imports and using them locally instead of fetching a remote version, unless that's strictly necessary.
See:
<script src="https://code.jquery.com/jquery-2.1.1.min.js"></script>
Also phonegap.js should be included:
<script type="text/javascript" src="phonegap.js"></script>
You will need to move the jQuery source to a local file and update your script tag, like you have with materialize.min.js. Loading library JS from the network is not a good idea, as it slows down your app's startup and will also cause the app to fail when started in a situation where there is no network access.
Additionally Cordova/PhoneGap's Content Security Policy may be blocking remote script loads for security reasons - you don't state which PhoneGap/Cordova version you are using, but this may be a problem for you in Cordova 5. There's a tutorial on dealing with that here. You can configure around this by adjusting the Content Security Policy meta tag to allow script-src from other than "self" but I wouldn't recommend this.
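For reference, the meta tag involved looks something like this (a sketch modeled on the Cordova 5 template; treat the exact directives as an assumption and adjust them to your app):
<meta http-equiv="Content-Security-Policy" content="default-src 'self' data: gap: https://ssl.gstatic.com; style-src 'self' 'unsafe-inline'; media-src *">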
When running in Cordova/PhoneGap you should also wait for the "deviceready" event before trying to do anything, to make sure that the framework is fully initialized and that you have access to call plugins.
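For example (startApp is a placeholder for your own initialization code):
<script type="text/javascript">
document.addEventListener("deviceready", function () {
// Cordova APIs and plugins are safe to call from this point on.
startApp();
}, false);
</script>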
Also instead of loading new pages you should architect your app so that it is a single page app and generates page fragments from templates as needed. Try looking at something like Handlebars for this unless you have another preferred solution. I have a complete demo app that uses this approach that you can look at the source for here.

How can I detect if javascript files are being blocked by IE11 security settings?

I have a site that renders nearly everything through javascript (not my design), and this has caused a lot of issues in Internet Explorer, of course. The recurring issue is that when the user has security set to High, the necessary javascript files get blocked, I believe because they are from another domain. This has something to do with the Drupal setup, I'm not entirely sure, but the important thing to know is that the files are served from a different domain and there's nothing I can do about that.
What my client wants is for an alert to pop up whenever these scripts are getting blocked that tells their users how to change their security settings.
1) If I add a javascript file on the same domain, it shouldn't get blocked, right?
2) Is there a way I can detect what the user's security settings are, or detect if scripts are being blocked using javascript?
There is another way to detect whether a JS script was loaded or not; so many things could block it: a network firewall, an OS-level firewall, browser security settings - the list of possibilities goes on.
You can have:
<script src="http://yourdomain.com/the_js_script.js"></script>
<script>if (typeof foo == "undefined") {alert ('error loading script');}</script>
Make sure that in the_js_script.js you have
var foo = 'Script loaded successfully';
You can do that for all the JS scripts, and alert distinct messages, so the user at least knows which scripts were blocked and which came through.
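One way to generalize that (a sketch; the sentinel and file names are made up):
<script>
// Map each sentinel global to the script that should have defined it.
var sentinels = { foo: "the_js_script.js", bar: "another_script.js" };
for (var name in sentinels) {
if (typeof window[name] == "undefined") {
alert("Blocked or failed to load: " + sentinels[name]);
}
}
</script>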
Setting your IE security to High disables all scripts from running in the browser.
The only workaround available is to place a warning message using the noscript HTML tag.
<noscript>Your browser does not support JavaScript!</noscript>
I would have liked to comment on @unixmiah's answer, but I don't have enough reputation, and I believe this is an answer as well. Unixmiah's solution won't work to alert blocked clients, I think, since the alert itself needs a script to run. However, I believe this would work (wouldn't it?):
<p id="jstrap">You seem to have switched Javascript off.</p>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script>
// If scripts are allowed to run, hide the warning; if they are blocked,
// the paragraph stays visible.
$(document).ready(function() {
$("p#jstrap").hide();
});
</script>

Hosting phono (jquery softphone plugin) dependencies locally?

This may be too obscure a question, but perhaps someone can spot what I'm doing wrong.
Phono (jquery plugin for javascript/flash-based softphone built on top of Tropo/Voxeo) loads a couple of dependencies from the phono.com servers. Namely,
flensed.js
checkplayer.js
swfobject.js
phono.audio.swf
I would very much like to avoid loading these dependencies from an external server (for obvious reasons) and going by this thread on their forums (which I can't register for because it appears every possible username has been "taken") , it should be possible to host them locally.
Here's a prettified source for the main jquery plugin. Maybe I'm just bad at looking, but I could not find a commented, un-minified version either in their full SDK or on github.
So after changing
base_path: "http://s.phono.com/deps/flensed/1.0/"
and
swf: "http://s.phono.com/releases/" + Phono.version + "/plugins/audio/phono.audio.swf"
... all dependencies seem to load just fine, phono successfully grabs a session ID and chats by SIP appear to be working. When I try to dial out or call the session id/SIP, however, I get a javascript error:
Uncaught TypeError: Cannot call method 'start' of null
referring to line 770 : h.start().
this.$flash.play(g, j); appears to return null or undefined. I suck at javascript and can't figure out why.
EDIT - if anyone would be so adventurous as to try this out, you can just grab their "kitchen sink" demo and slap it up on a server without much hassle.
Okay -- this is ridiculous and I'm an idiot for not catching it sooner.
Flash was trying to load the ringtones off my server at the URL that requires authentication. Unfortunately, flash is not a user with a valid session. Hence, flash was grabbing big handful of nothing. Sorry.
You can download the PhonoSDK and all of the samples (including the kitchen sink demo) and run them on your localhost. Here's the link: http://s.phono.com/releases/PhonoSDK-0.2.zip. It's open source, so you can also fork/contribute to the project as well - https://github.com/phono
I just tried it using Apache on my localhost and it worked without editing anything.

Javascript debugging difficult as browser doesn't refresh the scripts!

I'm trying to debug a JavaScript written in the MooTools framework. Right now I am developing a web application on top of Rails, and my web server is rails s, which boots WEBrick.
When I modify a particular tree.js file that's called within a MooTools init script,
require: {
css: [MUI.path.plugins + 'tree/css/style.css'],
js: [MUI.path.plugins + 'tree/scripts/tree.js'],
onload: function(){
if (buildTree) buildTree('tree1');
}
},
the changes are not loaded, as the headers being sent to the client say Last Modified: 10 July, 2010..., which is obviously not true since I just modified the file.
How do I get rid of this annoying caching? If I go directly to the script in my browser (Chrome) it doesn't show the changes until I hit refresh, but this doesn't fix my problem: when I go back to my application and hit refresh, it still loads the pre-modified script.
This has happened to me in Firefox as well; I think it is a cache header sent by the server or the browser itself.
Anyway, a simple way to avoid this problem while in development is adding a random param to the file name of the script:
instead of calling 'tree/scripts/tree.js' use 'tree/scripts/tree.js?' + random. That should invalidate all caches.
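Applied to the require block from the question, that would look something like this (an untested sketch):
require: {
css: [MUI.path.plugins + 'tree/css/style.css'],
// Development only: the random suffix makes every request a unique URL,
// so no cache layer can serve a stale copy.
js: [MUI.path.plugins + 'tree/scripts/tree.js?' + Math.random()],
onload: function(){
if (buildTree) buildTree('tree1');
}
},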
As frisco says, adding a random number in development does the trick, but you will likely find that the problem still affects you in production. You want to push new JavaScript changes to your users but can't until their browsers stop caching the file. In order to do this, just get the file's mtime and add that as the random string. This will only change when the file is modified, so the JavaScript will be loaded from cache if it has not been changed, or loaded from the server if it has.
PHP has the function filemtime, but as I'm not familiar with Ruby, I'm afraid I can't help you further in that direction (sorry!). However, this answer seems to accomplish what you want.
Try the Ctrl+F5 trick to avoid hitting the browser cache.
More info here:
What requests do browsers' "F5" and "Ctrl + F5" refreshes generate?

ajax based webpage - good way to do it?

I am building a website focusing on loading only data that has to be loaded.
I've built an example here and would like to know if this is a good way to build a webpage.
There are some problems when building a site like that, e.g.:
bookmarking
going back and forth in history
SEO (since the content is basically not really connected)
so here is the example:
index.html
<html>
<head>
<title>Somebodys Website</title>
<!-- JavaScript -->
<script type="text/javascript" src="jquery-1.3.2.min.js"></script>
<script type="text/javascript" src="pagecode.js"></script>
</head>
<body>
<div id="navigation">
<ul>
<li><a href="#" class="nav" id="link_Welcome">Welcome</a></li>
<li><a href="#" class="nav" id="link_Page1">Page1</a></li>
</ul>
</div>
<div id="content">
</div>
</body>
</html>
pagecode.js
var http = null;
$(document).ready(function()
{
// create XMLHttpRequest
try {
http = new XMLHttpRequest();
}
catch(e){
try{
http = new ActiveXObject("Msxml2.XMLHTTP");
}
catch(e){
http = new ActiveXObject("Microsoft.XMLHTTP");
}
}
// set navigation click events
$('.nav').click(function(e)
{
loadPage(e);
});
});
function loadPage(e){
// e.g. "link_Welcome" becomes "Welcome.html"
var page = e.currentTarget.id.slice(5) + ".html";
http.open("POST", page);
http.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
http.setRequestHeader("Connection", "close");
http.onreadystatechange = function(){changeContent(e);};
http.send(null);
}
function changeContent(e){
if(http.readyState == 4){
// load page
var response = http.responseText;
$('#content')[0].innerHTML = response;
}
}
Welcome.html
<b>Welcome</b>
<br />
To this website....
So as you can see, I'm loading the content based on the IDs of the links in the navigation section. So to make the "Page1" navigation item linkable, I would have to create a "Page1.html" file with some content in it.
Is this way of loading data for your web page very wrong? and if so, what is a better way to do it?
thanks for your time
EDIT:
this was just a very short example, and I'd like to say that for users with JavaScript disabled it is still possible to provide the whole page (additionally) in static form.
e.g.
<li><a href="Welcome.html" class="nav" id="link_Welcome">Welcome</a></li>
and this Welcome.html would contain all the overhead of the basic index.html file.
By doing so, the ajax using version of the page would be some kind of extra feature, wouldn't it?
No, it isn't a good way to do it.
Ajax is a tool best used with a light touch.
Reinventing frames using it simply recreates all the problems of frames, except that it replaces the issue of orphan pages with complete invisibility to search engines (and other user agents that don't support JS or have it disabled).
By doing so, the ajax using version of the page would be some kind of extra feature, wouldn't it?
No. Users won't notice, and you break bookmarking, linksharing, etc.
It's wrong to use AJAX (or any JavaScript, for that matter) only for the sake of using it (unless you're learning how to use AJAX, which is a different matter).
There are situations where the use of JavaScript is good (mostly when you're building a custom user interface inside your browser window) and where AJAX really shines. But loading static web pages with JavaScript is very wrong: first, you tie yourself to a browser that can run your JS; second, you increase the load on your server and on the client side.
More technical details:
The function loadPage should be rewritten using jQuery's $.post(). This is a random shot, not tested:
function loadPage(e){
// e.g. "link_Welcome" becomes "Welcome.html"
var page = e.currentTarget.id.slice(5) + ".html";
$.post( page, null, function(response){
$('#content')[0].innerHTML = response;
} );
}
Be warned, I did not test it, and I might have gotten this function a little wrong. But... dude, you are using jQuery already - now abuse it! :)
When considering implementing an AJAX pattern on a website you should first ask yourself the question: why? There are several good reasons to implement AJAX but also several bad reasons depending on what you're trying to achieve.
For example, if your website is like Facebook, where you want to offer end-users with a rich user interface where you can immediately see responses from friends in chat, notifications when users post something to your wall or tag you in a photo, without having to refresh the entire page, AJAX is GREAT!
However, if you are creating a website where the main content area changes for each of the top-level menu items using AJAX, this is a bad idea for several reasons. First, and what I consider to be very important, SEO (Search Engine Optimization) is NOT optimized: search engine crawlers do not follow AJAX requests unless they are loaded via the onclick event of an anchor tag. Ultimately, in this example, you are not getting the value out of the rich experience, and you are losing a lot of potential viewers.
Second, users will have trouble bookmarking pages unless you implement a smart way to parse URLs and map them to AJAX calls.
Third, users will have problems navigating with the back and forward buttons unless you implement a custom client-side mechanism to manage history (see the sketch after this list).
Lastly, each browser interprets JavaScript differently, and the more JavaScript you write, the more potential there is for losing cross-browser compatibility, unless you use a framework such as jQuery, Dojo, EXT, or MooTools that handles most of that for you.
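On the second and third points, a minimal hash-based approach shows the kind of mechanism needed (a sketch of mine; it assumes pages named after the hash, as in the question, and a browser that fires the hashchange event):
<script>
$(window).bind("hashchange", function() {
// "#Page1" -> load "Page1.html" into the content area. Back/forward
// and bookmarks now work because the state lives in the URL.
var page = location.hash.slice(1) || "Welcome";
$("#content").load(page + ".html");
});
</script>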
gabtub, you are not wrong: you can get AJAX-intensive web sites working SEO-compatibly, with bookmarking and Back/Forward buttons (history navigation in general), working with JavaScript disabled (avoiding site duplication), and accessible...
There is one catch: you must go back to being server-centric.
You can get some "howtos" here.
And take a look to ItsNat.
How about unobtrusiveness (or whatever I should call it)?
If the user has no JavaScript for some reason, they'll only see a list with Welcome and Page1.
Yes it's wrong. What about users without JavaScript? Why not do this sort of work server-side? Why pay the cost of multiple HTTP requests instead of including the files server-side so they can be downloaded in a single fetch? Why pay the cost of non-JavaScript enabled clients not being able to view your stuff (Google's spider being an important user who'll be alienated by this approach)? Why? Why?
