I am currently making a sort of web app, and part of it works by dynamically loading and adding JS scripts to the page. For this I am using jQuery's $.getScript() method, with caching enabled.
With my program, if a script already exists it is loaded again anyway, from what appears to be the cache. What I am wondering is how much this would affect the site and its performance. Does a newly loaded script that has the same src as an existing one overwrite the previous one, or is the new one added alongside the old one?
Furthermore, as my site is an AJAX site, it's possible for several scripts from different pages to be loaded up over time. Are there any browser restrictions on how many scripts one can have loaded?
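For reference, here is a minimal sketch of a cached $.getScript() setup like the one described; the file name js/module.js is made up for illustration:

// Tell jQuery not to append its cache-busting query string to GET requests,
// so the browser may serve the file from its cache on repeat loads.
$.ajaxSetup({ cache: true });

// The callback fires every time; the script is re-executed on each call,
// even when the file itself came straight from the cache.
$.getScript('js/module.js', function () {
    console.log('script loaded and executed');
});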
It will affect site performance. Even if your script is cached on the client with an expiration set, the browser still needs to parse and execute the newly included script. More than that, there's a very good chance you will run into JavaScript errors, because your scripts will override variables already set by the previous version. JavaScript parsing and execution is still a blocking operation in all browsers, so while your file is being processed your UI will lock up.
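As a tiny illustration of the double-execution hazard, assuming a script that binds an event handler (the selector is made up):

// If this script is fetched and executed a second time, the handler is
// bound a second time, so a single click now logs twice.
$(document).ready(function () {
    $('#save').click(function () {
        console.log('saved');
    });
});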
To answer the second part of the question: as far as I know there's no limit on the number of JavaScript files on a given page. I've seen pages with over 200 scripts that didn't throw any exceptions.
I think Ilya has provided some great points to consider, and that answers your specific question. If what you're actually trying to accomplish, however, is rerunning a $(document).ready() handler, you could do:
(function () {
    // Keep the ready handler in a named variable so it can be re-run later.
    var myreadyfunction = function () {
        $('#affected').toggleClass('blue');
    };

    // Run it once when the DOM is ready...
    $(document).ready(myreadyfunction);

    // ...and re-run it on every button click.
    $(document).ready(function () {
        $('button').click(myreadyfunction);
    });
})();
http://jsfiddle.net/Z97cm/
I've scoped it inside an anonymous (function(){})(); to keep it out of the global scope, so you might want to consider that if you need to access the function from outside that scope. But that gives you the general idea of what I was talking about.
I have a plugin that is currently installed on about 30,000 WordPress websites. In some cases, we find customers doing everything in their power to increase their page load speeds. This is certainly a good goal to have, as we all know it can affect SEO, user experience, and so on.
A common problem, however, is that when people move things around or defer them too much, it can create conflicts. In fact, in some cases, I'm actually seeing the DOM-loaded event fire before the scripts are loaded. I have a tiny snippet of code that is output via PHP because it is dynamic and contains variables from the server that will be passed to the functions in the JS file.
However, if the DOM-loaded event fires, meaning the files should already be loaded, but they aren't, then I get "call to undefined function" errors. I'm sure you've all seen those before.
So I'm contemplating using a setInterval to check for the existence of my JS file's functions before moving forward, and then immediately clearing the interval as soon as the function becomes available.
In 90% of cases, the setInterval will never fire more than once. In the small percentage of cases where something unusual is going on, it will defer running the functions until the code is available.
Here's a make-believe example that shows exactly what I'm referring to:
// Poll every 250 ms; once my_function exists, call it and stop polling.
var check_for_my_function = setInterval(function () {
    if ('function' === typeof my_function) {
        my_function();
        clearInterval(check_for_my_function);
    }
}, 250);
Is this a safe and efficient way to do this, or is there an actual function better suited to accomplishing this kind of delay? I always assumed I was safe listening for the DOM-loaded event, but even that turns out to be premature in some extreme use cases.
I am building a framework in which I have merged all the JavaScript files into one minified file.
Example:
function A() {} function B() {}
From the minified file I want to load a function asynchronously and remove it from the HTML when its work is done.
Example: load function A when it is required, but not function B.
I have looked at the framework Require.js, but it loads whole JavaScript files asynchronously on demand.
Is there any framework which loads individual JavaScript functions on demand when they are defined in the same JS file?
The downside to concatenation is that you get less fine-grained control over what you are including on a given page load. The solution is, rather than creating one concatenated file, to create layers of functionality that can be included a little more modularly. That way you don't need all your JS on a page that may only use a few specific functions. This can actually be a win for speed as well, since one single JS file might not take advantage of the browser's six concurrent connections. Moreover, once SPDY is fully adopted, one large file will actually be less performant than several smaller ones (since connections can be reused). Minification will still be important, however.
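As a rough illustration of the layering idea (the file names are invented):

<script src="/js/core.min.js"></script>      <!-- shared by every page -->
<script src="/js/forms.min.js"></script>     <!-- only on pages with forms -->
<script src="/js/gallery.min.js"></script>   <!-- only on gallery pages -->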
All that said, it seems you are asking for something a little difficult to pull off. When a browser loads a script, it is parsed and executed immediately. You can't load the file and then only load part of the file. By concatenating, you are committing yourself to that large payload.
It is possible to delay execution by wrapping a script in a block comment, then accessing it from the script node and eval()ing it... but that doesn't seem to be what you are asking. It can be a useful strategy, though, if you want to preload modules without locking up the UI.
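For the curious, a minimal sketch of that comment-wrapping trick; the id and function name are invented, and eval() comes with the usual caveats:

<script type="text/javascript" id="deferred-module">
/*
function heavyFeature() {
    console.log('heavy feature ran');
}
*/
</script>

<script type="text/javascript">
// The body above is one big block comment, so it parses almost for free.
// When the module is actually needed, read the source back out of the
// script node, strip the comment markers, and evaluate it.
function loadDeferred(id) {
    var node = document.getElementById(id);
    var src = node.text.replace(/^\s*\/\*/, '').replace(/\*\/\s*$/, '');
    (0, eval)(src); // indirect eval, so declarations land in the global scope
}

loadDeferred('deferred-module');
heavyFeature(); // now defined and callable
</script>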
That's not how JavaScript works. When the function's source file is loaded, the function is available in memory. Since the language is interpreted, the functions that are defined are created as soon as the browser reads the source file.
Your best bet is to use Require.js or something similar if you want to have explicit dependency chains.
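To illustrate, a rough RequireJS-style sketch of splitting A and B into separate modules so each is fetched only on demand (module and file names are invented):

// featureA.js - defines function A as its own module
define(function () {
    return function A() {
        console.log('A ran');
    };
});

// main.js - pull in A only when it is actually needed
require(['featureA'], function (A) {
    A(); // featureB.js is never downloaded
});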
I have some WordPress pages with JavaScript code that requires JavaScript file references. For pages that don't call any functions within these referenced JS files, there should be no performance impact from including them (apart from the file request itself), right?
-- EDIT in response to @cdhowie --
If only certain pages require these JavaScript files, is it possible to move them out of the head section and into the page? I've read this is bad practice.
But in theory, this would prevent the entire site from taking a performance hit for files that are not being used?
The referenced JavaScript files will be downloaded (or fetched from the cache) and then be executed by the browser's JavaScript interpreter in both cases. The "JavaScript file references" need to be executed in order to create the variables and functions that you might use, and the browser has no way of knowing ahead of time if you will use them. Further, the included files might actually manipulate the document, and the browser doesn't know this either until it has executed them.
So yes, there will be a performance impact whether or not you call the functions. Whether or not it's significant enough for you to worry about is something you will have to determine. (Always profile your page's loading time before making decisions like this!)
JavaScript functions are only executed when you explicitly call them (or implicitly via callbacks and the like). The code will, however, still be interpreted by the browser on each page load, regardless of whether the functions are called.
Edit:
I was wrong to say the performance hit is irrelevant. It really all depends on your exact situation (where the code is coming from, how much code, etc.) and also how much you care about performance in terms of milliseconds.
One possible "performance" issue is that those extra .js files are served from your server. If you are loading them when they are not needed, you are causing unneeded traffic and bandwidth usage on your server.
This will execute, but take up very little CPU time:
<script type="text/javascript">
// just a comment
</script>
No functions, just a comment... but it's still "code": it still has to be parsed, still has to be checked for syntax errors, and so on.
So on my page I have some little scripts which don't really need to load as soon as you visit the site; in fact the user might not need them at all during their entire session.
Also, according to http://code.google.com/speed/page-speed/docs/payload.html#DeferLoadingJS, it's not good practice either.
So for example, currently I have everything in 'when DOM ready':
$(function() {
    // most of the code, most of which is not needed right away
});
If I don't place the code inside the DOM-ready handler, most of the time it isn't executable (the elements it works on don't exist yet). So I thought of writing separate functions for each snippet.
For example:
function snippet1() {
    // code here
}
and then when I need that snippet, I load it on demand with a mouse click (not always a mouse click; it depends on what I need to load).
For example:
$('#button1').click(function() {
    snippet1();
});
So my question is: is this the way to load functions asynchronously to reduce page load time, or is there a better way? I haven't read about this anywhere; I just thought of it.
Note that I am aware of async script loading, but that is not an option here, since I can combine all the functions into just one JS file which will be cached, so the page load time will be less than loading separate JS files asynchronously each time.
You're mixing several things:
Page load time - the time it takes to download the script
JavaScript parsing time - after the script is loaded, it has to be parsed (error checking, compiling to byte code, etc.)
Function execution time - the time spent when the functions are actually called
You can't do much about the page load time, since you presumably don't want to split the script. You may consider splitting it into two parts: one which you always need, and an "optional" part which is rarely needed. Load the rare functions in the background, as sketched below.
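A minimal sketch of that split, assuming jQuery is on the page and rare.js is an invented file name for the optional part:

// The always-needed code ships with the page as usual; fetch the rarely
// needed part in the background once the initial render has settled.
$(function () {
    setTimeout(function () {
        $.getScript('rare.js');
    }, 0);
});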
Note that page load times become pretty moot after the site has been visited once and you've made sure the browser can cache the files.
If you want to reduce parse times, you have two options:
Don't load parts that you don't need.
Compress the scripts. Google has a great tool for that: the Closure Compiler. Besides making your scripts faster, it will also check for many common mistakes.
The last part is the execution time. This is only relevant if the functions are called at all, and if they do a lot of work when called. In your case, I guess you can ignore this point.
Yes, as much as possible you should define objects, functions, etc. outside of the document.ready wrapper. Some devs will define absolutely everything outside the wrapper and then just call an init() function inside the wrapper to load everything else. I am one such dev.
As for async, this doesn't do true async loading, but it speeds up your page since there is much less work to do on page load.
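For instance, a small sketch of the init() pattern mentioned above (the helper names are made up):

// Everything is defined up front, outside the ready wrapper...
function bindMenu() {
    $('#menu a').click(function () { console.log('menu'); });
}

function bindForms() {
    $('form').submit(function () { console.log('submit'); });
}

function init() {
    bindMenu();
    bindForms();
}

// ...and only init() runs inside it.
$(document).ready(init);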
In general, if you're not using a script loader like RequireJS or yepnope, it's a good idea to put all your script references – or at least those that don't need to run instantly – at the end of your body, so the page renders before the resources that aren't going to run until after page load anyway.
I would load all additional resources using RequireJS (http://requirejs.org/) or a similar library.
Put everything that you don't need immediately into a separate script and load it after the main content has loaded.
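Something along these lines, where 'extras' is an invented module name with a hypothetical init() entry point:

// Wait for the main content, then pull in the optional module.
$(window).on('load', function () {
    require(['extras'], function (extras) {
        extras.init();
    });
});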
Although I am almost certain the answer to this question will be browser specific, do any of the browsers define behavior for when multiple <script> tags are used and have the same src attribute?
For instance...
<script src="../js/foo.js"></script>
...
<!-- what happens here? -->
<script src="../js/foo.js"></script>
The reason I ask this question in the first place is that in my particular case I am using partial views in an ASP.NET MVC application which make use of jQuery. The jQuery JS file(s) are all included in the master template file via script tags. I would prefer to add script tags to the partial view files so that, in case they are used outside the context of the master template, they will automatically include all the necessary JS files and not rely on another view or template to include them. However, I certainly don't want to cause JS files to be transferred to the client multiple times, or any other side effects that could negatively impact the user experience.
My thinking right now is that most, if not all, of the major browsers (FF, Safari, IE, Opera) will cache a JS file the first time it is used, and then on subsequent script tags the browser will use the cached copy if available and if it hasn't expired. However, caching behavior can usually be altered through browser configuration, so it doesn't seem too "safe" to rely on any kind of caching behavior.
Will I just have to accept the fact that my partial views are going to have be dependent on other templates or views including the appropriate JS files?
Even if they're cached, you may have problems since the same code will be executed twice. At the very least, this will cause the browser to take more time than necessary. And it may cause errors, since most JavaScript code isn't written to be executed twice. For example, it may attach the same event handlers twice.
Don't output script tags directly in your partials. Create a mechanism to register script files for later inclusion. That mechanism can be responsible for only including files once.
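On the client side, the "only once" part of such a mechanism could look something like the sketch below; loadScriptOnce is an invented helper, not a jQuery or ASP.NET API:

var loadedScripts = {};

function loadScriptOnce(src) {
    if (loadedScripts[src]) {
        return; // already requested once; skip the duplicate
    }
    loadedScripts[src] = true;
    var script = document.createElement('script');
    script.src = src;
    document.getElementsByTagName('head')[0].appendChild(script);
}

In ASP.NET MVC the same idea is more often implemented server-side: partials register the files they need, and the layout emits each registered file exactly once.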
What happens is that JavaScript is fed into the interpreter the moment it is downloaded. In the event of a name collision, only the last definition in a given scope survives to execution. Normally this last-one-wins process prevents problems, by simply overwriting functions fed into the interpreter earlier. The complication is that a function defines a variable scope, and those variables may themselves be functions that introduce further nested scopes. If two functions share the same name but contain different variable definitions, there can be leakage, where values created by the earlier function survive even after the function itself is overwritten, which can then cause unexpected name collisions.
If the exact same file is included twice, there should be no problem. Problems occur when different versions of the same file are included, or when different files with the same function names are included. Including the same file twice can also mean multiple transmissions, which is a waste of bandwidth.
FF 3.5x and Chrome 4x include it only once.
IE 8, however, keeps two copies (in Developer Tools, the Scripts tab shows two jquery-1.3.2.min.js entries). :)