Storing document.ready code for multiple pages in external files - javascript

Each of my 10-15 pages has 100-200 lines of JS inside $(document).ready().
Would it be wise to combine them all into one external file? I don't understand how that would work: wouldn't the browser have to check everything all at once, even the functions that aren't used on the current page? The second issue would probably be function conflicts.
Please give me some tips on how to handle this.

You can split those lines into several files if your JS file becomes too big.
Just note that 100-200 lines is very small. You should minify your code if size really matters to you.
"We should forget about small efficiencies, say about 97% of the time: premature
optimization is the root of all evil"
Regarding function conflicts: use namespaces and closures, and keep your global object as clean as possible.
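As a rough sketch of the namespace-plus-closure idea (the MyApp name and the selectors below are invented for illustration), everything lives inside an immediately invoked function, and only one object is exposed globally:

var MyApp = MyApp || {};

MyApp.contact = (function () {
    // everything in here stays out of the global scope
    function validate(form) {
        return $(form).find('input[required]').filter(function () {
            return !this.value;
        }).length === 0;
    }

    return {
        init: function () {
            $('#contact-form').on('submit', function (e) {
                if (!validate(this)) { e.preventDefault(); }
            });
        }
    };
}());

$(function () {
    if ($('#contact-form').length) {
        MyApp.contact.init();
    }
});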

Would it be wise? Yes.
You give the browser one big cacheable chunk that it doesn't have to worry about anymore, which improves page loading speed.
Put all your onload code in differently named functions and call them from the relevant pages, or reuse functions across pages if they need to do the same thing.
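A minimal sketch of that pattern (the file and function names are made up): one shared, cached site.js defines an onload function per page, and each page's template calls only its own.

// site.js (shared and cached by every page)
function initHomePage() {
    $('#news-ticker').fadeIn();
}

function initProductPage() {
    $('#buy-button').on('click', function () {
        $('#cart-count').text(function (i, old) { return Number(old) + 1; });
    });
}

Then each page only needs a one-line ready handler calling its own function:

<script>$(function () { initProductPage(); });</script>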

There are various things you should look into.
First, serve libraries from a public CDN where possible, for example jQuery.
Secondly, always minify and combine all your JS into one file (the same applies to CSS).
The advantage here is that with 10 JS files the browser has to make ten requests and receive each file separately, not to mention the limit on concurrent requests per domain on mobile devices.
If, on the other hand, everything is sent as one file, there is only one request. You are right that it will take some time to execute (or at least to check all the onready stuff), but that processing time is far lower than the extra request time you would pay for separate files.
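A typical setup might look like this (the CDN URL, version number and file names are only examples): jQuery comes from a public CDN with a local fallback, and all of your own code is one minified file.

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');</script>
<script src="/js/site.min.js"></script>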

Related

In JavaScript, does a function add loading time even if it's not called?

I was wondering about this because I am planning to load a JS file containing a lot of functions so I can call them whenever I need them, instead of recreating the same functions on other pages.
I am wondering whether the uncalled functions will increase the page's loading time.
If gathering those functions into one JS file helps you more than the extra loading time frustrates you, that's fine.
Even if the file grows to 2 MB, with tens of thousands of lines of code, you will hardly notice the difference in load time on an average connection.
It's safe as long as you are not calling all of the functions on every load; that would make a real difference.
They do add load time, but not much if we're talking about individual functions.
My question is: why do you want to include functions you only might use?
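To make the point above concrete: a function that is defined but never called still has to be downloaded and parsed, but its body never executes, so the cost is load-time only (the names here are invented):

// parsed on every page that includes the file...
function buildHugeReport() {
    // ...but this loop only runs on the one page that actually calls buildHugeReport()
    var total = 0;
    for (var i = 0; i < 1000000; i++) { total += i; }
    return total;
}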

Does the number of scripts included in a site "significantly" decrease performance?

Case 1:
On the homepage:
script(src="angular.js")
script(src="ember.js")
script(src="react.js")
script(src="knockout.js")
script(src="backbone.js")
...
script(src="jQuery.js")
script(src="my-project.js")
Case 2:
script(src="jQuery.js")
script(src="my-project.js")
In both cases, my-project.js only uses jQuery functions.
Initial load time aside, will jQuery functions take longer to execute considering that there are more scripts to look through?
If they do, is this time more than a couple ms?
The more HTTP requests your page makes before it is functional, in general, the slower the perceived load time. That's why the first recommendation in the YUI Best Practices for Speeding Up Your Web Site is "minimize HTTP requests." There's a lot of nuance there, but in general use script and CSS combining tools to group your script and styling into a single file each.
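As a rough illustration of the combining step (file names are placeholders; real projects usually use a proper build tool or minifier for this), a few lines of Node can concatenate the scripts into one bundle:

// combine.js - run with: node combine.js
var fs = require('fs');

var files = ['js/jquery.js', 'js/plugins.js', 'js/my-project.js']; // placeholder paths

var bundle = files.map(function (f) {
    return fs.readFileSync(f, 'utf8');
}).join('\n;\n'); // the stray semicolon guards against files that end mid-expression

fs.writeFileSync('js/bundle.js', bundle);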
Initial load time aside, will jQuery functions take longer to execute considering that there are more scripts to look through?
No, not at all. Once the scripts have run and the functions have been created, there's no difference whatsoever in the cost of calling them.

Asynchronously loading JavaScript functions

I am building a framework in which I have merged all JavaScript files into one (minified) file.
Example:
function A() {} function B() {}
From this minified file I want to load functions asynchronously and remove them from the HTML when their work is done.
Example: load function A when it is required, but not function B.
I have looked at the framework Require.js, but it loads JavaScript files asynchronously based on requirements.
Is there any framework which loads, on demand, JavaScript functions that are defined in the same JS file?
The downside to concatenation is that you get less fine-grained control over what you are including on a given page load. The solution is, rather than creating one concatenated file, to create layers of functionality that can be included a little more modularly. That way you don't need all your JS on a page that may only use a few specific functions. This can actually be a win in speed as well, since a single JS file might not take advantage of the browser's six concurrent connections. Moreover, once SPDY is fully adopted, one large file will actually be less performant than several smaller ones (since connections can be reused). Minification will still be important, however.
All that said, it seems you are asking for something a little difficult to pull off. When a browser loads a script, it gets parsed and executed immediately. You can't load the file then... only load part of the file. By concatenating, you are restricting yourself to that large payload.
It is possible to delay execution by wrapping a script in a block comment, then accessing it from the script node and eval()ing it... but that doesn't seem like what you are asking. It can be a useful strategy, though, if you want to preload modules without locking the UI.
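A rough sketch of that comment-wrapping trick (the element id and function name are invented for the example): the deferred code sits inside a block comment, so the browser skips it while parsing the page, and it is only evaluated, in global scope, when you explicitly ask for it.

<script id="lazy-module" type="text/javascript">
/*
function snippetA() {
    console.log('snippet A is now live');
}
*/
</script>

<script type="text/javascript">
function activateLazyModule() {
    var node = document.getElementById('lazy-module');
    var source = node.text.replace(/^\s*\/\*/, '').replace(/\*\/\s*$/, '');
    (0, eval)(source); // indirect eval runs the code in the global scope
}

// later, on demand:
activateLazyModule();
snippetA(); // now defined
</script>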
That's not how javascript works. When the function's source file is loaded, the function is available in memory. Since the language is interpreted, the functions that are defined would be loaded as soon as the source file was read by the browser.
Your best bet is to use Require.js or something similar if you want to have explicit dependency chains.
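For reference, a minimal Require.js-style sketch (module names and paths are placeholders): each module lives in its own file and is only fetched when some page actually requires it.

// js/reports.js
define(function () {
    return {
        build: function () { console.log('building the report'); }
    };
});

// page code
require(['js/reports'], function (reports) {
    reports.build(); // the file is loaded on demand, only here
});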

Does javascript execute if no function is called?

I have some WordPress pages with JavaScript code that requires JavaScript file references. For pages that don't call any functions from these referenced files, there should be no performance impact from including them (other than fetching the file), right?
-- EDIT in response to #cdhowie --
If only certain pages require these JavaScript files, is it possible to move them out of the head section and into the page? I've read this is bad practice.
But in theory, this would prevent the entire site from taking a performance hit for files that are not being used?
The referenced JavaScript files will be downloaded (or fetched from the cache) and then be executed by the browser's JavaScript interpreter in both cases. The "JavaScript file references" need to be executed in order to create the variables and functions that you might use, and the browser has no way of knowing ahead of time if you will use them. Further, the included files might actually manipulate the document, and the browser doesn't know this either until it has executed them.
So yes, there will be a performance impact whether or not you call the functions. Whether or not it's significant enough for you to worry about is something you will have to determine. (Always profile your page's loading time before making decisions like this!)
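If you do decide to include a file only on the pages that actually use it, one common pattern is to inject the script element on demand (the file and function names below are placeholders):

function loadScript(src, onLoad) {
    var s = document.createElement('script');
    s.src = src;
    s.async = true;
    if (onLoad) { s.onload = onLoad; }
    document.head.appendChild(s);
}

// only on pages that actually need the gallery:
loadScript('/js/gallery-widget.js', function () {
    initGallery(); // assumed to be defined by gallery-widget.js
});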
Javascript functions are only executed when you explicitly call them (or implicitly in callbacks and whatnot). The code will however still be interpreted by the browser on each page regardless of functions being called or not.
Edit:
I was wrong to say the performance hit is irrelevant. It really all depends on your exact situation (where the code is coming from, how much code, etc.) and also how much you care about performance in terms of milliseconds.
One possible "performance" issue is if those extra .js files are hosted on your own server. If so, and you are loading them when they are not needed, you are causing unneeded traffic and bandwidth usage on your server.
This will execute, but take up very little CPU time:
<script type="text/javascript">
// just a comment
</script>
no functions, just a comment... but it's still "code", still has to be parsed, still has to be checked for syntax errors, etc...

Calling functions when needed

So on my page I have some little scripts which I don't really need to load as soon as you visit the site, and in fact the user might not need them at all during their entire session.
Also, according to this: http://code.google.com/speed/page-speed/docs/payload.html#DeferLoadingJS it's not good practice either.
So for example, currently I have everything inside the DOM ready handler:
$(function() {
// most of the code of which is not needed
});
If I don't place the code inside the DOM ready handler, most of the time it isn't executable. So I thought of making separate functions for each snippet.
For example:
function snippet1() {
// Code here
}
and then when I need that snippet, I call it on demand, for example on a mouse click (not always a mouse click; it depends on what I need to load).
For example:
$('#button1').click(function() {
snippet1();
});
So my question is: is this the right way of loading functions on demand so you reduce the page load time, or is there a better way? I haven't read these examples anywhere; I just thought of them.
Note that I am aware of async script loading, but that is not an option here, since I can combine all the functions into one JS file which will be cached, so the page load time will be lower than loading separate JS files asynchronously every time.
You're mixing several things:
Page load time
JavaScript parsing time - After the script is loaded, it has to be parsed (error checking, compiling to byte code, etc)
Function execution time
You can't do much about the page load time since you don't want to split the script. You may consider splitting it into two parts: one which you always need and an "optional" part which is rarely needed. Load the rare functions in the background.
Note that page load times become pretty moot after the site has been visited once and you've made sure the browser can cache the files.
If you want to reduce parse times, you have two options:
Don't load parts that you don't need.
Compress the scripts. Google has a great tool for that: the Closure Compiler. Besides making your scripts faster, it will also check for many common mistakes.
The last part is the execution time. This is only relevant if the functions are called at all and do a lot of work. In your case, I guess you can ignore this point.
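For the "optional" part, one way to load it in the background or on first use is jQuery's $.getScript (the file and function names here are invented):

// fetch the rarely-needed code the first time the user asks for it
$('#report-button').one('click', function () {
    $.getScript('/js/reports.js', function () {
        buildReport(); // assumed to be defined in reports.js
    });
});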
Yes, as much as possible you should define objects, functions, etc. outside of the document.ready wrapper. Some devs will define absolutely everything outside the wrapper and then just call an init() function inside the wrapper to load everything else. I am one such dev.
As for async, this doesn't do true async loading, but it speeds up your page since there is much less work to do on page load.
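A minimal sketch of that init() pattern (the function names are just examples): everything is defined outside the ready wrapper, and the wrapper only kicks things off.

// defined outside the wrapper, so they are reusable and testable
function initNavigation() {
    $('#menu a').on('click', function () { $(this).toggleClass('active'); });
}

function initForms() {
    $('form').on('submit', function () { $(this).find('button').prop('disabled', true); });
}

function init() {
    initNavigation();
    initForms();
}

// the ready handler stays tiny
$(function () {
    init();
});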
In general, if you're not using a script loader like requireJS or yepnope, it's a good idea to put all your script references – or at least those that don't need to be run instantly – at the end of your body so the page renders before the resources that aren't going to be run until after page load anyway.
I would load all additional resources using RequireJS ( http://requirejs.org/ ) or a similar library.
Put everything that you don't need immediately into a separate script and load it after the main content has loaded.
