Here's the scenario; I'm not sure what I'm missing.
Page A.htm makes an AJAX request for page B.htm and inserts the response into the page.
Page B.htm contains links to several other JS files, many of which contain a $(document).ready() function to initialize them.
This works fine when A.htm and B.htm are on the same server but not when they are on different servers.
What I think I'm seeing is that when pages A and B are on different servers (cross-domain AJAX), the external resources are returned asynchronously, or at least out of order, so scripts that expect jQuery UI to be loaded already are executing before it is.
Appreciate any pointers or advice. Apologies for the poor explanation.
You are injecting HTML + script tags via jQuery. In this case *:
HTML content, except for scripts, is injected into the document.
Then all scripts are executed one by one.
If a script is external, it is downloaded and executed asynchronously.
Therefore an external or inline script that depends on jQuery UI might execute before jQuery UI.
One possible solution is to change the way your pages work:
Get rid of external scripts in pageb.html but keep inline scripts
Load the required scripts in pagea.html
Load pageb.html
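A minimal sketch of that approach, assuming jQuery 1.8+; the file names and the #content target are placeholders:
// In pagea.html: load the dependency first, then fetch the fragment.
$.getScript('jquery-ui.min.js')      // a script pageb.html used to include
    .then(function () {
        return $.get('pageb.html');  // fragment now holds only inline scripts
    })
    .then(function (html) {
        $('#content').html(html);    // inline scripts run with jQuery UI ready
    });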
Another solution is to roll your own jQuery function that will:
Strip all <script src> elements from HTML
Download and execute those scripts in order
Inject the remaining HTML
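A minimal sketch of such a function, assuming jQuery 1.8+; injectWithOrderedScripts is a hypothetical name:
function injectWithOrderedScripts(html, $target) {
    // Parse without executing; the third argument keeps <script> nodes.
    var $wrapper = $('<div>').append($.parseHTML(html, document, true));

    // 1. Strip all external <script src> elements, remembering their URLs.
    var urls = [];
    $wrapper.find('script[src]').each(function () {
        urls.push(this.src);
        $(this).remove();
    });

    // 2. Download and execute the scripts one by one, in document order,
    //    by chaining the promises that $.getScript returns.
    var chain = $.Deferred().resolve().promise();
    $.each(urls, function (_, url) {
        chain = chain.then(function () { return $.getScript(url); });
    });

    // 3. Inject the remaining HTML (inline scripts included) once every
    //    external script has finished running.
    chain.then(function () {
        $target.empty().append($wrapper.contents());
    });
}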
* The exact behavior is not documented. I had to look into the source code to infer the details.
You are correct in your impression that the issue is a difference in how the requests are handled cross-domain.
Here is a link to get you on the right track: How to make synchronous JSONP crossdomain call
However, you will have to actually re-architect your solution somewhat to check whether the resource has been loaded before moving on. There are many solutions (see the link).
You can set a timer interval and check for something in the DOM, or another reasonable solution (despite its lack of efficiency) is to create a "proxy" server-side file (e.g. in PHP) on your own server and have that file do the cross-domain request, then spit out the result.
Note that since jQuery UI is a rather large file, it's conceivable that the cross-domain request finishes first and executes immediately, even though jQuery UI is not loaded yet. In any case, you're going to have to start thinking about having your app react rather than follow a fixed sequence.
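A minimal sketch of the timer-interval idea, assuming jQuery UI exposes itself as jQuery.ui once loaded; initWidgets is a hypothetical initialization function:
// Poll until jQuery UI has attached itself to jQuery, then initialize.
var poll = setInterval(function () {
    if (window.jQuery && jQuery.ui) {
        clearInterval(poll);
        initWidgets(); // hypothetical: whatever setup depends on jQuery UI
    }
}, 50);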
Related
I have a Java web application, and I'm wondering whether the JavaScript files are downloaded with the HTML body, or whether the HTML body is loaded first and the browser then requests all the JavaScript files.
The reason for this question is that I want to know whether importing files with jQuery.getScript() would result in poorer performance. I want to import all files using that jQuery function to avoid duplicate JavaScript imports.
The body of the HTML document is retrieved first. After it's been downloaded, the browser checks what resources need to be retrieved and gets those.
You can actually see this happen if you open the Chrome dev console, go to the Network tab (make sure caching is disabled and logs are preserved), and refresh a page.
The first green bar is the page loading, and the second chunk is the scripts, a stylesheet, and some image resources.
The HTML document is downloaded first, and only when the browser has finished downloading it can it find out which scripts to fetch.
That said, heavy scripts that don't directly influence the appearance of the HTML body should be loaded at the end of the body, not in the head, so that they don't block rendering unless necessary.
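A minimal skeleton of that layout; the file names are hypothetical:
<!DOCTYPE html>
<html>
<head>
  <!-- CSS stays in the head: it is needed to render the body -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <p>Visible content renders without waiting for the scripts below.</p>
  <!-- Heavy scripts go at the end of the body so they don't block rendering -->
  <script src="jquery.js"></script>
  <script src="app.js"></script>
</body>
</html>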
I'm wondering if the javascript are downloaded with the html body during a request
If it's part of that body then yes. If it's in a separate resource then no.
For example, suppose your HTML file has this:
<script type="text/javascript">
$(function () {
// some code here
});
</script>
That code, being part of the HTML file, is included in the HTML response. The web server doesn't care what kind of code is in the file; it serves the response regardless of what's there.
On the other hand, if you have this:
<script type="text/javascript" src="someFile.js"></script>
In that case the code isn't in the same file. The HTML just references a separate resource (someFile.js) that contains the code, so the browser makes a separate request for that resource, resulting in two requests in total.
The HTML document is downloaded first, or at least it starts to download first. While it is parsed, any script includes that the browser finds are downloaded. That means that some scripts may finish loading before the document is completely loaded.
While the document is being downloaded, the browser parses it and displays as much as it can. When the parsing reaches a script include, parsing stops and the browser suspends it until the script has been loaded and executed; then the parsing continues. That means that regular script includes always execute in document order, and that they block the rendering of the rest of the page while they load.
If you put a call to getScript instead of a script include, the behaviour will change. The method makes an asynchronous request, so the browser will continue parsing the rest of the page while the script loads.
This has some important effects:
The parsing of the page will be completed earlier.
Scripts will no longer run in a specific order; they run in the order in which their loading completes.
If one script depends on another, you have to check yourself that the first script has actually loaded before using it in the other script.
You can use a combination of script includes and getScript calls to get the best of both: regular script includes for scripts that other scripts depend on, and getScript for scripts that are not affected by the effects of that method.
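For example, a sketch of the mixed approach, assuming jQuery itself came from a regular script include; the file names are hypothetical:
// common.js must run before page.js, so chain the loads explicitly.
$.getScript('common.js', function () {
    // common.js has finished loading and executing at this point.
    $.getScript('page.js'); // safe: its dependency is guaranteed to exist
});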
I have a few scripts that are common to all HTML pages in my application. Call this file commonfunctions.js. Each HTML page loads it as you move around the application, appending the last modification date for this JS file (that's gotten from the server). Firebug adds the file to the list of loaded scripts every time, as well as an eval/seq/# entry (where # is the number of times this file has been loaded, starting at 7 for some reason).
For example, if I have 3 pages called one.html, two.html, and three.html, each with this line of code:
<script type="text/javascript" src="commonfunctions.js?mod=11/33/2012"></script>
If I were to go from one.html->two.html->one.html->three.html, Firebug would list the scripts loaded as:
commonfunctions.js?mod=11/33/2012
commonfunctions.js?mod=11/33/2012/eval/seq/7
commonfunctions.js?mod=11/33/2012/eval/seq/8
commonfunctions.js?mod=11/33/2012/eval/seq/9
and so on as I visit the three pages more.
Why is this happening, and is there a way to stop it? I read that Firebug will make up its own URL if it doesn't know the URL due to an eval() call or an event attribute; however, these scripts are being loaded via regular <script> tags.
I'm concerned because I'm not sure whether this means the browser has compiled and is now executing or storing multiple copies of the same script, which would be wasteful either way.
The script may have been loaded via a script tag, but somewhere within commonfunctions.js a call to eval() has been made. Or three, obviously.
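For illustration, constructs like these anywhere in commonfunctions.js compile code from strings at runtime, and Firebug gives such code a synthetic URL of that eval/seq form (hypothetical examples):
// Each of these compiles code from a string, which Firebug lists
// under a made-up "eval/seq/#" URL:
eval('initSomething();');              // direct eval call
setTimeout('refreshStatus();', 1000);  // a string argument is eval'd, too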
I wonder if anyone has found a way to send the head tag mid-render, so CSS and JavaScript are loaded before the page has finished rendering. Our page takes about 523 ms to render, and resources aren't loaded until the page is received.
I've done a lot of PHP, where it is possible to flush the buffer before the end of the script. I've tried adding a Response.Flush() at the end of the master page's Page_Load, but the page layout is horribly broken afterward. I've seen a lot of people using an UpdatePanel to send the content via AJAX afterward, but I don't quite know what impact that would have on SEO.
If I don't find a solution, I guess I'll have to go the reverse-proxy route and find a way to invalidate the proxy cache when the page content changes.
Do not place the Flush in code-behind but in your HTML page, like this:
</head>
<%Response.Flush();%>
<body>
This can cause something like a flickering effect on the page, so you can try moving the flush even a little lower in the page.
See also "Flush the Buffer Early" on the Yahoo performance rules page:
http://developer.yahoo.com/performance/rules.html
Cache static content
Additionally, you can add client-side caching on static content like the CSS and JavaScript files. This page lists the ways to do it for every IIS version:
http://www.iis.net/ConfigReference/system.webServer/staticContent/clientCache
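For example, on IIS 7+ a web.config entry along these lines sets a far-future max-age on static files; a sketch, so tune the max-age to how often your static files actually change:
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Cache static files (CSS, JS, images) client-side for one year -->
      <clientCache cacheControlMode="UseMaxAge"
                   cacheControlMaxAge="365.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>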
Follow-up
One more thing I suggest, after seeing your pages, is to combine all CSS into one file and all JavaScript into another, and to minify them.
I use this minifier, with very good results and real-time minification: http://www.asp.net/ajaxlibrary/Download.ashx
Consider using a content delivery network (CDN) to host your images, CSS and JS files. Browsers have a limit of roughly four to eight connections per domain, so once you use those up the browser has to wait for connections to be freed.
By hosting some files on the CDN you get another set of connections to use concurrently, allowing everything to load faster.
Also consider enabling GZIP on your server if you haven't already. This compresses files on the fly, resulting in smaller transfers.
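On IIS 7+, for instance, compression can be switched on in web.config; a sketch, noting that static compression is usually cheap while dynamic compression costs some CPU:
<configuration>
  <system.webServer>
    <!-- Enable on-the-fly GZIP compression for static and dynamic responses -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>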
You could use jQuery to execute your JS as soon as the document is ready.
$(document).ready(function () {
    // Your code here
});
Or you could just take a standalone ready function: a $(document).ready equivalent without jQuery.
You could do a fade-in or show once the document has loaded. Just set the body to display: none; initially.
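A minimal sketch of that idea, assuming jQuery and a body { display: none; } rule in your CSS:
// Reveal the page only once everything (images, scripts) has loaded.
$(window).on('load', function () {
    $('body').fadeIn();
});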
I have a page that, when loaded, I would like to run a Perl script. Is this possible to do with JavaScript? I have never run a Perl script on the web before, and the only way I have seen to do it is to link to it.
There are 3 ways:
If it's a dynamic page (CGI or other), as long as your script backing the page is an executable Perl script returning valid HTTP response, you're good. If you need to execute a separate Perl script, you can do it using the usual system() call (ideally, the functionality should be a Perl library call so your script can then execute it without spawning a system call). This approach of course works with ANY language that the back-end script is written in, e.g. if it's a Java Servlet, or any other code, it can also execute a system call to your Perl script.
If it's a static HTML page, you can have an "onload" JavaScript event (or just a JS function call inside a <script> tag) which executes an AJAX call to a different dynamic page (say http://yourserver/scripts/run_script_X) running the script as per the previous bullet point.
Just to be clear, the script will be running on the web server, not in the browser/client system, so you need some sort of mechanism for allowing the results of the script to affect your web page if you wish to have such an effect.
$.ajax({url: 'http://yourserver/scripts/run_script_X'}); // jQuery example
As an old-school variation on the last approach, you can also have your page contain an <iframe> whose URL points to http://yourserver/scripts/run_script_X (make the iframe small/invisible if you don't care about the results of running the script, as per your comment).
<!-- The rest of your HTML code -->
<iframe src="http://yourserver/scripts/run_script_X"
        style="display: none;">
</iframe>
Irrelevant comment made prior to the comments on this answer:
Without a lot more context on what you want the page to be (CGI script, static HTML, etc...) and what you want the script to do and how you want the results of running the script to affect your page, it's hard to suggest anything more precise.
I have a section of a webpage that loads a JavaScript file from an external source and then kicks off an Ajax query.
When I load the page, I see the browser saying "waiting for example.com" a lot, so I think the dependency on this external JavaScript is slowing my initial page load.
Is there a way I can load this external JavaScript asynchronously so it doesn't slow the loading of the rest of my page at all?
It's good practice to put JS at the bottom, right above the closing body tag. In addition, use load events such as window.onload or $(document).ready() to fire your JavaScript after the page has loaded.
As far as loading the JavaScript files themselves asynchronously or on demand, you could inject them from another JavaScript function or event, but really you are achieving the same thing as placing them at the bottom.
Check out the YSlow Guidelines for front-end optimizations.
You could use jQuery's .getScript() method, which is simply a wrapper for an AJAX call.
http://api.jquery.com/jquery.getscript/
This makes the request asynchronous, and gives you a callback that runs after the script has loaded.
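For instance; the URL and the init function are placeholders for whatever the external script actually provides:
$.getScript('http://example.com/external.js', function () {
    // The external script has loaded and run, so it is now safe to
    // kick off the AJAX query that depends on it.
    startAjaxQuery(); // hypothetical function defined by the external script
});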
You can see my answer here: Dynamic (2 levels) Javascript/CSS Loading
And grab the script from here (see the source). Use it at the bottom, and your scripts will not block other resources (and if you have more than one, they will be downloaded in parallel cross-browser).
I wrote a library to asynchronously load JavaScript files, with callbacks for when they load:
https://github.com/ssoroka/sigma
Sigma.async_script_load('http://example.com/underscore/underscore-min.js', '_', function() {
_([1,2,3,2,3,1]).uniq();
});