How many JavaScript files can I include on Facebook? - javascript

Is there a limit on how many JS files I can include (via script tags) on Facebook? I included 5 files with no problem, but the 6th one was not loaded. When I moved the code from the 6th file into the 5th, it worked. So is the maximum 5 files?
By the way, I'm still developing the app, not running it in production, so I'm not at the stage of compressing/minifying the JS yet. :)
It is quite annoying to get missing or unloaded JS files, etc.
So, what is the limit from Facebook? What is the file size limit? I know that the JSON request callback data limit is 5000, but I'm not sure about JS includes.

Facebook should not be doing anything to limit scripts within an iframe app. Even for an FBML app, though, there are some things that might cause the issue:
script tags must be closed with a separate closing tag; they are not self-closing like link tags.
Some browsers, IE for instance, limit the total number of link tags included on a page (IE's link tag limit seems to be 31). I am not sure whether there is a similar hard limit on the number of script tags, but I would check that too; searching the web did not yield anything.
While a JavaScript file is being loaded and run, every other parallel download from that domain is blocked, so make sure that none of the JavaScript files you have included is blocking the others from being downloaded.
Also, only two to six files are downloaded in parallel from a domain, and as an extra precaution only one JavaScript file is downloaded at any given time. So make sure you do not have an excessively big or unreachable JS file that is hogging the pipeline.
To circumvent the previous point you can load JS files dynamically from script, because the hard limit of 2-6 parallel downloads only applies to files referenced in the HTML. Using JavaScript to load additional files will increase the number of parallel downloads that can happen; a sketch follows below.
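As a minimal sketch of that idea (the file names are hypothetical), a loader can inject script elements and chain on the onload event so files still execute in dependency order:

// Minimal sketch: dynamically load scripts one after another so
// dependencies (e.g. a plugin that needs jQuery) stay in order.
// Older IE needs onreadystatechange instead of onload; omitted for brevity.
function loadScriptsInOrder(urls) {
    if (urls.length === 0) return;
    var script = document.createElement("script");
    script.type = "text/javascript";
    script.onload = function () {
        loadScriptsInOrder(urls.slice(1)); // start the next file
    };
    script.src = urls[0];
    document.getElementsByTagName("head")[0].appendChild(script);
}

loadScriptsInOrder(["/js/jquery.js", "/js/plugin.js", "/js/app.js"]);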
From my experience: I just concatenated all 30-odd files into 3 files and included those 3 files in production environments, while continuing to use the individual files for development.
It seemed to me that Facebook was somehow timing out and trying to reload the page when the network latency while downloading some of the JS files was high (I cannot prove this theory, though).

From a user's perspective, you should be loading as few files as possible, not trying to see how many you can shove in there. More independent files means more HTTP requests for the client, which means a slower page load.
Also, I'd minify your scripts before pushing to production; fewer bytes means a faster page as well.

I have 29 scripts loaded in my development environment, and only recently have I started seeing issues with files not being loaded.
But if you are writing an iframe-based app, I cannot see why or how Facebook would limit the number of JS files loaded. Do check network latency, though: on a very slow connection Facebook might try reloading the whole page, refetching all of your static assets unless they are cacheable.
Browser caches can also be a factor. Firefox 3.6 caches very aggressively and, frankly speaking, has more glitches than anything else, so check whether your incomplete JS files have already been cached by the browser.

Related

Using a separate JavaScript file for each HTML page?

Usually, the JavaScript of the main page is heavier than that of other pages. For example, we put a jQuery slideshow on the main page which is not used on other pages. Is it necessary to create different script includes for the main page and the individual pages, so that each loads only the JavaScript files actually in use?
Or will all JavaScript files read on the first page be cached for the rest of the visit, so that when loading an individual page the browser will not re-read the slideshow JavaScript?
Another form of this question: if I put the slideshow on each individual page, will the browser load the slideshow JavaScript file each time, or will it read it from its cache (saved on the visitor's computer)?
As florian h says, most browsers will cache the content (unless development tools are being used).
If you only use the slideshow JavaScript on one page, I would recommend putting it in a separate file. There is a downside to this: most often it is the HTTP requests that take the longest when loading a page.
So if, for example, you have one JavaScript file of 1 MB and you need all of that JavaScript on most pages, that is better than using four smaller files of 250 KB each, because your browser would need to make four separate requests.
Of course, this may only be a couple of milliseconds of performance profit, so you might choose to use separate files anyway to improve maintainability.
Almost all browsers will cache the JavaScript files, so you shouldn't create different versions for sub-pages.
But if you have very large JS files it's of course reasonable to only include those that you actually need.
All files are cached in the browser based on the path to the file.
If you include a JavaScript file on one page, it will be cached and won't be downloaded again when you surf to other pages.
Unless you want it to ;)
Yes, JS files will be cached (unless specified otherwise).
But JS files must be processed and may include initialization logic that you do not need. Also, every script tag that loads external JS blocks the other HTTP "threads": images, CSS files and so on stop loading until the JS file has loaded, whereas normally several resources load in parallel (at the same time).
I would have different scripts for different pages.
In your case it might or might not be an issue. Run a few tests and see whether you have performance problems; if not, the convenience of not having different scripts for different pages might be better.
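One hedged way to get the best of both worlds (the element id and file path here are hypothetical) is to keep the slideshow code in its own file but only inject it on pages that actually contain a slideshow:

// Hypothetical example: fetch slideshow.js only where a #slideshow
// element exists; other pages skip the download entirely.
// Run this near the bottom of the page so the element already exists.
if (document.getElementById("slideshow")) {
    var s = document.createElement("script");
    s.type = "text/javascript";
    s.src = "/js/slideshow.js";
    document.getElementsByTagName("head")[0].appendChild(s);
}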

Speed optimizing a JavaScript function

I have a number of JavaScript functions like the following on my page:
function fun1(){...}
function fun2(){...}
function fun3(){...}
function fun4(){...}
I may use fun1 on one or two pages, but the other functions only on specific pages.
My question is: should I include all the functions in one file like script.js, or include specific functions on specific pages? Which one is better for speed optimization?
I guess your question is about optimizing page loading speed.
I would suggest grouping them as much as possible in a single JS file.
Otherwise, you would have to load a lot of small js files, increasing the page loading time.
Consider minifying your JS files too.
It depends on the size of the functions, your visitors' access patterns, your cache settings and other circumstances. The speed of downloading a file depends on how many TCP packets the server has to send (packet sizes tend to be around 1.5 KB). Increasing the file size only matters if it means the file must be broken into more packets; the client-side delay of processing a script that doesn't need to run is negligible. So if your scripts are short (you should of course minify them first), it's best to always bundle them. If you expect the average visitor to need all the scripts eventually, it's again best to send them in one file. If, however, the average visitor won't need some of the larger scripts (for example, one part is only needed at upload time, and only 0.1% of visitors ever upload anything), it might be better to send those separately.
The .js files are cached by your browser, so you can include as many functions as you like in a single file. If you split them into separate files, that many additional requests are made by the browser, which slows down page loading. You can also compress the JS files if you are concerned about the size of the .js file: http://javascriptcompressor.com/
It depends a lot on how your server is sending out these files. If you have Firebug, open up the Net tab and inspect your JS files. If you see a Last-Modified entry in the Headers tab, it means that you are better off putting all your JS into one file. If you don't see it, it's best to split things up into page-specific files.
In my opinion, there are four main methods of speeding up your page-load times:
server headers -- this one is more complex to set up, but if you control your server settings, or if you are willing to serve your JS via a dynamic page (PHP or ASP), you can send extra instructions to the browser to cache specific content for specific periods. Since your JS files are likely to change quite infrequently, it's usually pretty safe to do this for them. You basically just need to set the Expires header to some point well into the future. This means the browser will not need to request the file at all if it has it in the cache, which makes the most sense if you have visitors who come back again and again; if you get a lot of one-hit visitors, it won't make a difference. It does mean that if you change these files, many browsers won't pick up the change; you should therefore either change the file name or append something to the query string, like this: <script type="text/javascript" src="/sitescript.js?v=1.1"></script>. This can be a maintenance problem if you have more than a few static HTML pages (a small helper for it is sketched after this list).
numbers of files -- in my opinion, this is where you get the biggest bang-for-buck savings. I'm nearly certain that most browsers still support only four active requests at a time. That means that if your web page has five images, the last image won't be requested until one of the previous images finishes loading. And if your site has 50 images, 3 CSS files and 10 JS files, it's going to take a while to clear all those requests. Remember, even if you are sending Last-Modified headers, the browser still needs to check whether the content has changed, so that takes one of those request slots. If you can combine all your images into a single image (using CSS sprites) and all your JS into a single file, your pages will load significantly faster.
file size -- as the web speeds up, this gets less and less important. If your server does not support content compression, it's a pretty good idea to minify your JS, though the time savings are overrated in my opinion. This does make maintenance somewhat more time-consuming and live debugging nearly impossible, but it definitely brings file size down quite a bit. If you have a LOT of JavaScript (maybe ~150KB+?) or if you know your visitors are coming from slower networks (for example, people on a corporate network), I would recommend doing it. If your server DOES support compression, the savings are actually negligible.
script placement -- when the browser hits a <script src="..."> tag, it halts all rendering until the script has loaded and executed, which means an inevitable delay. If you put your scripts in the middle of your page, you'll note that half the page loads and then pauses. To speed up rendering, place as many of your <script> references as you can at the dead bottom of the page. Scripts that you need at the top of the page can go there, but the more <script> clutter you have up there, the slower the page will render. Any code that gets executed by onLoad or DOMReady can safely go at the bottom of the page.
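A sketch of the query-string trick mentioned above (the helper name and version value are hypothetical): keep the version string in one place and emit versioned script tags from it, so bumping one constant busts the long-lived cache on every page that uses the helper:

// Hypothetical cache-busting helper, inlined or served from a tiny
// loader file; must run during page parse since it uses document.write.
// Bump ASSET_VERSION on each release to force browsers past a
// far-future Expires header.
var ASSET_VERSION = "1.1";
function writeScript(src) {
    document.write('<script type="text/javascript" src="' + src +
                   '?v=' + ASSET_VERSION + '"><\/script>');
}
writeScript("/sitescript.js");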
Yahoo has a really quite amazing list of optimization tips at their Best Practices page.

Basic caching of JS assets in Ruby on Rails

I am at the point where I have a bunch of javascript files and I'm not sure how to approach caching them all in one file. I have come across using:
javascript_include_tag ... :cache => true
but I have a number of JavaScript files that are particular to a specific page. Does it make sense to include all of them in my layout even though some pages do not need much of the JavaScript in there? Some of my pages do not require any JavaScript at all; is the browser going to download this concatenated JS for every page?
Some people will dump all their JavaScript into one file, but I don't think that makes a lot of sense unless the routines are used in every page.
Think about how your scripts are used. Put ones that are used most often in the most pages in one file. Then, if there are scripts used occasionally, put them in separate files. Then use multiple <script> statements in your HTML file to pull in the ones you need.
If a user's browser is configured normally, it will download the scripts once and then reference them from its local cache. The first time they request the page it'll take a bit longer because the cache has to be populated, but from then on it'll be faster. The browser will use the cached version for all references to the script.
The :cache => true flag can help if you have a bajillion scripts, because they can be compressed during the first download, but I don't think it speeds up loading afterwards, when the browser is pulling them from its cache.
The question Caching multiple javascripts into one talks about this.
n include tags = n GET requests to the server. This does not perform well, and the web page gets slower.
I would not mind minifying everything into one file. It's a one-time download anyway, and then it gets cached in the browser.
Each situation is different, so analyze yours using YSlow and see whether minifying into one file is going to help or not. Also look at https://github.com/thumblemonks/smurf for minifying your JS & CSS into 2 files.

Which is better for JavaScript load-time: Compress all in one big file or load all asynchronously?

A simple question that I'm not sure has a short answer!
Description
I have a set of JavaScript files to be loaded on a website; here are some notes about them:
They all come from the same domain (no cross-domain loading needed).
They are identical across the website.
There are several files: jQuery and 5 other plugins, plus my own application script that is based on them.
Their total compressed size is 224 KB (I combine all the files into one file, then compress them at once using YUI Compressor 2).
Problem
I've heard that 224 KB is not ideal for a single file and that it should be split into several files with a maximum of 44 KB each. I can't recall where I heard this, and I'm not sure splitting is effective, but it's true that 224 KB takes a long time to load for the first time, considering that the website is also loaded with images and CSS.
I've minimized the need for early loading of the JavaScript file and put it at the bottom; so far this is good progress, but I need to load it asynchronously alongside the HTML to gain time (source), and the decision to make is:
Yes or no?
Keep it in one big compressed file, or split it into many compressed files loaded asynchronously (I'm aware of handling the dependency-related problems)?
It depends on what the site is and how important first load time is for it.
Regardless of that, though, I'd probably load jQuery and the like from a public CDN. One big benefit is that it might already be cached even if the user has never been to your site.
http://encosia.com/2008/12/10/3-reasons-why-you-should-let-google-host-jquery-for-you/
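A common sketch of this (the jQuery version and local path are illustrative): load from Google's CDN, then fall back to a local copy if the CDN request failed:

<script type="text/javascript"
        src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
// If the CDN failed, window.jQuery is undefined; fall back to a
// local copy (the path is hypothetical).
window.jQuery || document.write(
    '<script type="text/javascript" src="/js/jquery.min.js"><\/script>');
</script>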
The Cappuccino team is a big proponent of one file -- they make a javascript framework. Apps made with their tool are expected to have some load time.
http://cappuccino.org/discuss/2009/11/11/just-one-file-with-cappuccino-0-8/
Another benefit of loading jQuery and related files from a public CDN is spreading requests across destinations. I believe the client is restricted to 2 concurrent requests per domain, so by loading jQuery from Google, a plugin from jQuery's CDN, and your custom app code from your own domain, the browser can execute these requests concurrently rather than waiting for the first two to finish before issuing a third.
I guess this adds another performance improvement over one large file as well. Even if you just split that one file into two, it could be retrieved with two concurrent requests from the browser, potentially improving load time.
Here's what we did to make our web app fast.
The main JS and CSS files are compressed and put inline with the HTML markup.
The white spaces of the HTML are removed and the images are converted to data:image/png by a shell script.
The size is ~400kb but cached and gzipped.
The mobile version of the web app is the same but at ~250kb.
It means the whole app is ready to run, like an executable, in a single http call.
Then a second HTTP call gets the data (JSON), and we use PURE to render it in HTML using the existing markup in the page as templates.
The app is divided into modules; only the common modules are preloaded this way. The others come in when requested by the user.
There is no exact answer to this question. It pretty much depends on how and when you are making use of those files.
Typically, you only want to download on page load the JS files that are universally required by the web app. Module-specific or page-specific JS files shouldn't be bundled into the main JS download; ideally they are loaded on demand, as in the sketch below.
Also, this question is only relevant if you are concerned about the experience of first-time users; the JS files will be cached anyway for every other visit.
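As a rough sketch of on-demand loading (the element id, file path and initReports entry point are all hypothetical): fetch a module's JS the first time the user opens the feature, then run a callback once it has arrived:

// Hypothetical on-demand module loading, triggered by user action.
function loadModule(src, onReady) {
    var script = document.createElement("script");
    script.type = "text/javascript";
    script.onload = onReady; // older IE would need onreadystatechange
    script.src = src;
    document.getElementsByTagName("head")[0].appendChild(script);
}

document.getElementById("reports-tab").onclick = function () {
    loadModule("/js/reports.js", function () {
        initReports(); // assumed entry point defined by the module
    });
};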

JavaScript file inclusion in HTML pages - what happens underneath in the browser?

I think this may be a browser-dependent question. Suppose I have 10 JavaScript files and several HTML pages. Suppose HTML pageA needs only JS1.js and JS3.js; similarly, HTML pageB needs JS4.js and JS1.js. I want to know what the effect of including all 10 JavaScript files in all HTML pages would be. Would it directly affect the browser's memory consumption?
I am facing this problem particularly with the YUI JavaScript library. There are several components, like datatable, event, container, calendar, dom-event, etc. The order in which they are included also seems to matter a lot; for example, the dom-event JS should be included before the rest for them to work. So to avoid all this confusion, I thought of including all these JS files in a header file that gets included in all HTML pages.
The thing I am worried about is the memory bloat and performance problems that this may cause. Please provide your suggestions.
Thanks,
-Keshav
Any script you load into your page, even once downloaded and cached, must still be parsed before the rest of the page can load. So in that sense there is a memory penalty, and there's still the potential for something in the script to significantly delay rendering.
However, in the case of a conscientiously designed library such as YUI I would expect the parsing time to be minimised.
If you can load all your scripts in at the end of the page, that can vastly improve performance as the entire page can render before being blocked by javascript execution, and your site will feel a lot snappier.
I would suggest investigating the Firebug Net panel and the YSlow extension to get specific performance stats for your website.
External scripts delay the display of the following html until they have loaded and executed. The impact is much less after the first page load, since they're already cached, though browsers will occasionally check for new versions, which still carries a delay. I try to limit the number of scripts and move the script tags to the bottom of the page when possible. Users won't notice the script loading delay if the page has already fully displayed.
If a given script does nothing, it will not affect performance.
Obviously the first page will load slowly, but the rest will not need to load all the scripts, because they will be cached. So subsequent pages will load faster.
Tips:
1) Load the scripts at the bottom of the page (just before the closing BODY tag).
2) Use a non-blocking way of loading the scripts. This is the one I'm using:
<script type="text/javascript">
// Create a script element and append it to the body: the file is
// fetched without blocking the rest of the page from rendering.
function AttachScript(src) {
    var script = document.createElement("script");
    script.type = "text/javascript";
    document.getElementsByTagName("body")[0].appendChild(script);
    script.src = src; // setting src after insertion starts the download
}
// Note: scripts injected this way are not guaranteed to execute in
// order in every browser, so files with dependencies may need
// explicit sequencing.
AttachScript("/js/jquery.min.js");
AttachScript("/js/ndr.js");
AttachScript("/js/shadowbox.js");
AttachScript("/js/libraries/sizzle/sizzle.js");
AttachScript("/js/languages/shadowbox-es.js");
AttachScript("/js/players/shadowbox-img.js");
AttachScript("/js/adapters/shadowbox-jquery.js");
</script>
Can't find the source web page though :-(
Memory Consumption:
Assuming the scripts are well written, memory consumption and performance issues should be nominal. Your biggest problem with including all scripts at once will be the latency in the user experience the first time through, or whenever you make changes, because everything has to be downloaded in one hit. I think you should only include the scripts you need per page, not all scripts at once.
You can assess the impact yourself using simple tools like Task Manager/Processes in Windows to monitor memory and processor usage, or plug-ins like Firebug for Firefox.
You can also look into something called minification to help make your script files as small as possible.
Dependencies:
The order in which you include the scripts is important, as some scripts may depend on functionality in other scripts. If the code in one script attempts to run and it requires code in another script that has not been downloaded yet, it will fail. My advice would be to actually understand the dependencies in your script files rather than just downloading everything at once because it seems easier.
Use the YUI Configurator to help determine the required file includes and order, as well as how to use the Yahoo! CDN combo service to combine all YUI files into a single script tag.
http://developer.yahoo.com/yui/articles/hosting/
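For illustration, a combo-service script tag looks roughly like this (the version and module list are just an example, not a verified working dependency set):

<script type="text/javascript"
        src="http://yui.yahooapis.com/combo?2.8.1/build/yahoo-dom-event/yahoo-dom-event.js&2.8.1/build/element/element-min.js&2.8.1/build/datatable/datatable-min.js"></script>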
External assets of an HTML page are typically cached by the browser. External assets are anything requested by the HTML, such as images, CSS, JavaScript, and so on. So if you load all 10 script files up front, you are forcing a one-time massive download hit on your user. After that, the user does not need to download the scripts again unless the modification timestamps on the files change.
Your page will only use what it requires. If a particular page requests js4.js and js5.js, then all the functions in those files are loaded into the interpreter, ordered first by when they are requested from the HTML and second by where they appear within each file. If there are any namespace conflicts, whatever is loaded into the interpreter last wins. The interpreter clears out the functions once the page is unloaded from the browser.
For efficiency, I would suggest using a server-side inclusion process to read each of the JS files and combine their contents into a new, single JS file. This reduces the number of HTTP requests to the server and saves your users a large amount of bandwidth spent on HTTP headers and GET requests. Also, place the request for this new single script file directly before the closing body tag of your HTML: downloading scripts blocks parallel downloads in IE, so you want to load scripts at the lowest possible point in the page.
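A rough sketch of such a combining step (the file list is hypothetical; shown as a Node.js script, though any server-side language works):

// Rough build-step sketch: concatenate individual JS files into one,
// cutting n HTTP requests down to a single request.
var fs = require("fs");

var files = ["js/JS1.js", "js/JS3.js", "js/JS4.js"]; // order matters for dependencies
var combined = files.map(function (name) {
    return fs.readFileSync(name, "utf8");
}).join(";\n"); // the semicolon guards against files missing their own

fs.writeFileSync("js/combined.js", combined);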
Scriptaculous implements a nice way to handle js dependencies. Guess you could check it out and "re-implement" it. ;D
As for memory bloat and performance issues: as long as your JS doesn't leak a lot (YUI probably doesn't), memory won't be much of a problem, although the scripts will make your pages load more slowly, especially if loaded in the header.
You can read up on caching methods that use PHP to serve several JavaScript files as one big JS file containing everything you need. For additional performance gains, you can have the browser cache the file locally, in addition to sending it gzipped (if the browser supports the encoding, using something like ob_start("ob_gzhandler");). Gzip encoding severely reduces the size of the main JS file containing all your JS code, since plain text compresses very well. I recently had to do this on my own website and it worked like a charm for both JS and CSS files.
http://www.ejeliot.com/blog/72
Note that by following the instructions in that tutorial, your JS file will only be sent once, and the client's browser will keep a local copy, which also improves the performance of every visit thereafter.
Also, consider googling "Minify" which should be hosted on Google Code.
