External Javascript Timeout - javascript

I have a number of tracking scripts and web services installed on my website, and I noticed that when one of the services goes down, the page still tries to call the external JavaScript file hosted on a different server. In Firefox, Chrome and other newer browsers, there don't seem to be any issues when one of the services goes down. However, in IE7 and IE8, my pages don't load all the way and time out before everything is displayed. Is there any way to add a timeout on these JavaScript calls to prevent them from breaking my pages when they go down?

You can load them dynamically after page load with JS. If the JS files are on a different server, the browser will still show a "browser busy" indicator when you do that, but the original page will load.
If you can fetch the JS from your own site, you can load it with XMLHttpRequest after page load (or with your favorite JS library's helpers, e.g. jQuery's $.ajax(...)) and then eval it. This way the fetching itself won't show the browser-busy indicator.
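For example, a minimal sketch of the fetch-and-eval approach, assuming you have a same-origin copy of the tracker at a hypothetical /js/tracker.js path:
function loadTrackingScript(url) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true); // asynchronous, so it doesn't block rendering
    xhr.onreadystatechange = function() {
        if (xhr.readyState === 4 && xhr.status === 200) {
            eval(xhr.responseText); // run the tracking code once it has arrived
        }
    };
    xhr.send(null);
}
window.onload = function() {
    loadTrackingScript('/js/tracker.js'); // hypothetical copy hosted on your own site
};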
To fetch the JS from your own site, you can download it from your tracking provider (which won't be officially supported but usually works) - just remember to refetch new versions every once in a while - or you can create a "forwarding" service on your own site that fetches it from the tracking provider and caches it locally for a while. This way your JS won't be in danger of staleness.
Steve Souders has more information about deferred loading of scripts and browser-busy indicators.

Try adding defer="defer"
The defer attribute gives a hint to the browser that the script does not create any content, so the browser can optionally defer interpreting the script. This can improve performance by delaying execution of scripts until after the body content is parsed and rendered.
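For example, on the external script tag (the URL is just a placeholder):
<script defer src="http://tracking.example.com/track.js" type="text/javascript"></script>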
Edit
This will prevent those scripts from running until the page loads:
function loadJs(filename) {
    var fileref = document.createElement('script');
    fileref.setAttribute("type", "text/javascript");
    fileref.setAttribute("src", filename);
    document.body.appendChild(fileref); // the element must be added to the DOM or the script never loads
}
window.onload = function() {
    loadJs("http://path.to.js");
    loadJs("http://path.to2.js");
    ...
}

If you need to load external scripts and want to enforce a timeout limit, to avoid having a busy indicator running for too long, you can use setTimeout() with window.stop() and the IE equivalent:
http://forums.devshed.com/html-programming-1/does-window-stop-work-in-ie-1311.html
var abort_load = function() {
    if (navigator.appName == "Microsoft Internet Explorer") {
        window.document.execCommand('Stop');
    } else {
        window.stop();
    }
};
/**
 * Ensure browser gives up trying to load JS after 3 seconds.
 */
setTimeout(abort_load, 3000);
Note that window.stop() is the equivalent of the user clicking the stop button on their browser. So typically you'd only want to call setTimeout() after page load, to ensure you don't interrupt the browser while it's still downloading images, css and so on.
This should be combined with the suggestions made by orip, namely to load the scripts dynamically, in order to avoid the worst case of a server that never responds, resulting in a "browser busy" indicator that's active until the browser's timeout (which is often over a minute). With window.stop() in a timer, you effectively specify how long the browser can try to load the script.
Also note that setTimeout()'s interval is not interpreted very precisely by browsers, so round up the amount of time you want to allow for the script to load.
Also, one counter-indication to using window.stop() is if your page does things like scrolling to a certain position via JS. You might be willing to live with that, but in any case you can make the stop() conditional on NOT having already loaded the content you expected. For example, if your external JS will define a variable foo, you could do:
var abort_load = function() {
    if (typeof(foo) == "undefined") {
        if (navigator.appName == "Microsoft Internet Explorer") {
            window.document.execCommand('Stop');
        } else {
            window.stop();
        }
    }
};
This way, in the happy path case (scripts do load within timeout interval), you don't actually invoke window.stop().
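Putting the two answers together might look roughly like this (loadJs and abort_load are the functions defined above; the URL is a placeholder):
window.onload = function() {
    loadJs("http://tracking.example.com/track.js"); // slow third-party script
    setTimeout(abort_load, 3000);                   // stop waiting for it about 3 seconds after page load
};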

Related

How to reload javascript without refreshing the page

We have some software which re-creates javascript files. We need to be able to re-load them into the DOM without refreshing the page. This means that the original javascript file (already loaded by the browser) has changed. Usually a refresh of the browser gets around this, but we can't do that due to losing state in other controls.
My searches for this have only returned results about refreshing the browser with javascript, which is not what I want.
Is it possible to reload the JavaScript file without refreshing the browser, using only JavaScript (no jQuery/Angular or any other third-party library/framework, etc.)?
If your intention is to provide data or new widgets:
Vanilla AJAX is simple and ubiquitous. Do that.
If you're averse to that, for whatever reason, and if you're pulling new data exclusively from boxes you control, you could also give JSONP a try.
var s = document.createElement('script');
s.src = 'path/to/script-that-injects-data-or-NEW-widgets.js';
document.body.appendChild(s);
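A bare-bones JSONP sketch along the same lines, assuming an endpoint you control that wraps its JSON in whatever callback name it receives (the URL and names are illustrative):
function handleFreshData(data) {
    console.log('new data', data); // update the page with the payload
}
var jsonp = document.createElement('script');
jsonp.src = '/api/data?callback=handleFreshData'; // hypothetical endpoint on a box you control
document.body.appendChild(jsonp);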
If your intention is to "upgrade" the page or replace functionality:
Just don't do this. While you can easily add new scripts, you can't reliably "cleanse" the page of old scripts or variables without a lot of bookkeeping. Unless your scripts are very very very simple, you can expect no small amount of trouble.
A better solution would be to notify the user of upgrades, ask them to refresh the page, and make sure your app can quickly reinitialize to its previous state.
Ultimately, if you want to "upgrade" the application in-place, you'll want to refresh everything behind the scenes anyway. And that means you need to know how to rebuild everything to match the existing/prior state anyway. And, you could do this. But, it'll be easier to just refresh the page, which better ensures a clean, compatible state without a lot of fuss.
It is possible.
Have a look at lite-server:
Lightweight development only node server that serves a web app, opens it in the browser, refreshes when html or javascript change, injects CSS changes using sockets, and has a fallback page when a route is not found.
It uses BrowserSync internally, which:
... injects a small script into every page which communicates with the server via WebSockets. When an event occurs — such as a file modification or scroll action — the server sends an update notification to all connected devices.
If you cannot use 3rd-party code you can reimplement something like that yourself.
function loadScript(src, callback) {
    var script = document.createElement('script');
    script.type = 'text/javascript';
    script.src = src;
    script.onload = script.onreadystatechange = function() {
        if (!this.readyState || this.readyState == 'complete') {
            callback();
        }
    };
    var scriptTag = document.getElementsByTagName('script')[0];
    scriptTag.parentNode.insertBefore(script, scriptTag);
}
Add another <script src="xxx"></script> tag to the bottom of your page. It should load the JS and 'override' the objects defined in your previous script.

Application.js script blocked in "queuing" state until images have loaded

Our site has an asynchronously loaded application.js:
<script async="async" src="//...application-123456.js"></script>
Additionally, we have a lot of third-party scripts that (1) are asynchronously loaded, and (2) in turn create an async <script> tag that loads a bigger script.
Just to give an example, one of these third party scripts is Google's gpt.js (you can have a quick look to understand how it works).
Our problem is that, while all the third-party scripts load asynchronously as expected, the application.js one gets stuck in "queuing" status for more than 4 seconds.
I tried to change the script and make it load like the third party ones: create a <script> element, set the "src" attribute and load it:
<script async>
    (function() {
        var elem = document.createElement('script');
        elem.src = 'http://...application-123456.js';
        elem.async = true;
        elem.type = 'text/javascript';
        var scpt = document.getElementsByTagName('script')[0];
        scpt.parentNode.insertBefore(elem, scpt);
    })();
</script>
but nothing changed.
Then I studied the network cascade in a page of our site that almost doesn't contain images, and I saw that the queuing time was almost zero. I tried the same experiment in pages with different amounts of images, and saw that the queuing time proportionally increases in pages with more images.
I read this in Chrome's network cascade documentation:
QUEUING TIME: The request was postponed by the rendering engine because it's considered lower priority than critical resources (such as scripts/styles). This often happens with images.
Is it possible that for some reason the browser is marking our application.js as "lower priority"? I looked on the web and it seems that nobody has experienced problems with the queuing time. Anybody has an idea?
Thank you very much.
Browsers use a pre-loader to improve network utilisation. This article explains the concept.
In the Chrome Documentation you linked to above, it says the following about queuing:
If a request is queued it indicates that:
The request was postponed by the rendering engine because it's considered lower priority than critical resources (such as scripts/styles). This often happens with images.
The request was put on hold to wait for an unavailable TCP socket that's about to free up.
The request was put on hold because the browser only allows six TCP connections per origin on HTTP 1.
Time was spent making disk cache entries (typically very quick).
The pre-loader would have retrieved the lightweight resources quickly, such as the styles and scripts, and then queued up the images because, as the criteria above suggests, only 6 TCP connections are permitted per origin. Therefore, this would explain the delay in the total response time.

Force (or ask nicely) to refresh the browser

So I run a site that uses a lot of javascript and ajax. I understand how to make users refresh their browser when the browser loads. But what happens if I need them to refresh their browser after they have loaded the site?
I want to change the ajax that is served to the client to speed up things up, but this is going to cause errors for the users who have not yet refreshed their browser.
The only solution I can come up with is that when a new version of the JavaScript file is required, the site uses a popup that asks the users to force refresh their browsers. (This won't really fix the current version, but would prevent future issues.)
I hate to use a popup for something that I could do automatically. Is there a better way to force updates for the client?
window.location.href = "http://example.com"
replaces the current page with the one pointed to by http://example.com.
You sound like you are having trouble with your JavaScript getting an updated version of the data it loads through Ajax methods, is that correct? For instance, if two Ajax calls try to load 'data.txt', then the second call merely uses the cached version.
You also may be having trouble with loading new versions of your script itself.
The way around both of these problems is to add a randomly-generated query string to your script source and your Ajax source.
For example, make one script that loads your main script, like this:
/* loader1.js */
document.write('<script src="mainjavascript.js?.rand=', Math.random(), '"></script>');
And in your HTML, just do
<script src="loader1.js"></script>
The same method works for JavaScript Ajax requests as well. Assuming that "client" is a new XMLHttpRequest() object, and has been properly set up with a readystatechange function and so on, then you simply append the same query string, like this:
client.open('GET', 'data.txt?.rand=' + Math.random(), true);
client.send();
You may be using a library to do your Ajax requests, and so it's even easier then. Just specify the data URL as 'data.txt?.rand=' + Math.random() instead of merely 'data.txt'
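If you make many such requests, a small helper keeps the cache-busting in one place (the function name is just illustrative):
function bust(url) {
    // append a throwaway query parameter so the browser won't serve a cached copy
    return url + (url.indexOf('?') === -1 ? '?' : '&') + '.rand=' + Math.random();
}
var client = new XMLHttpRequest();
client.open('GET', bust('data.txt'), true);
client.send();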

forcing page load on an AJAX-loaded page

I have an issue with a third-party integration on an iPad-specific website, which has a number of pages loaded via AJAX.
When I go to the page for the first time, the functionality that is expected to be available is not there, and only when I do a page refresh in Safari do I see the feature.
In the 3rd party JavaScript there is this sort of code peppered throughout:
script.onload = script.onreadystatechange = function () { // do something }
Here is the full JavaScript included file.
Is there a way that I can either force a page load on the iPad, or build in some workaround so that the included JavaScript actually fires when I switch to that page?
As I mentioned, this is only apparent on an iPad-specific website and the same feature has no problem on a desktop browser where the page is not loaded via AJAX.
I believe web servers allow you to add content dynamically to all pages they render, which lets you insert a code snippet that can check whether it's the iPad website and trigger a page load as you requested.
Follow the thread below:
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/e27f918e-89a9-45a8-8604-2ad2ded09d64.mspx?mfr=true
I have no idea what your code looks like, but, having experienced the same issues repeatedly with jQuery, I would suggest you manually call the initialize function of the 3rd party script within a window ready state function:
$(window).ready(function(e) {
    // function that initializes the 3rd party script gets called here.
});
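If the fragment arrives via AJAX after the ready event has already fired, that callback alone won't help; in that case you can inject the vendor script yourself and call its initializer once it has loaded. A rough sketch (the URL and thirdPartyInit are stand-ins for whatever the vendor actually exposes):
var vendor = document.createElement('script');
vendor.src = 'http://thirdparty.example.com/feature.js'; // placeholder URL
vendor.onload = function() {
    if (typeof thirdPartyInit === 'function') { // stand-in for the vendor's init function
        thirdPartyInit();
    }
};
document.body.appendChild(vendor);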

How can I monitor the rendering time in a browser?

I work on an internal corporate system that has a web front-end using Tomcat.
How can I monitor the rendering time of specific pages in a browser (IE6)?
I would like to be able to record the results in a log file (separate log file or the Tomcat access log).
EDIT: Ideally, I need to monitor the rendering on the clients accessing the pages.
The Navigation Timing API is available in modern browsers (IE9+) except Safari:
function onLoad() {
    var now = new Date().getTime();
    var page_load_time = now - performance.timing.navigationStart;
    console.log("User-perceived page loading time: " + page_load_time);
}
In case the browser has JavaScript enabled, one of the things you could do is write an inline script and send it first thing in your HTML. The script would do two things:
Record current system time in a JS variable (if you're lucky the time could roughly correspond to the page rendering start time).
Attach JS function to the page onLoad event. This function will then query the current system time once again, subtract the start time from step 1 and send it to the server along with the page location (or some unique ID you could insert into the inline script dynamically on your server).
<script language="JavaScript">
    var renderStart = new Date().getTime();
    window.onload = function() {
        var elapsed = new Date().getTime() - renderStart;
        // send the info to the server
        alert('Rendered in ' + elapsed + 'ms');
    }
</script>
... usual HTML starts here ...
You'd need to make sure that the page doesn’t override onload later in the code, but adds to the event handlers list instead.
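To get the number into a server-side log rather than an alert, one low-tech approach that works even in IE6 is to request a tiny image with the timing in the query string; the hit then shows up in the web server's access log. A sketch (the /timing.gif path is hypothetical):
function reportRenderTime(elapsed) {
    var beacon = new Image();
    // the query string can later be parsed out of the access log
    beacon.src = '/timing.gif?page=' + encodeURIComponent(location.pathname) + '&ms=' + elapsed;
}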
As far as non-invasive techniques are concerned, Hammerhead measures complete load time (including JavaScript execution), albeit in Firefox only.
I've seen usable results when a JavaScript snippet could be added globally to measure the start and end of each page load operation.
Have a look at Selenium - they offer a remote control that can automatically start different browsers (e.g. IE6), load pages, test for specific content on the page. At the end reports are generated that also show the rendering times.
Since others are posting answers that use other browsers, I guess I will too. Chrome has a very detailed profiling system that breaks down the rendering time of the page and shows the time it took for each step along the way.
As for IE, you might want to consider writing a plugin. There seems to be few tools like this on the market. Maybe you could sell it.
On Firefox you can use Firebug to monitor load time. With the YSlow plugin you can even get recommendations how to improve the performance.
