I am trying to add my own error handling to the JavaScript setTimeout function. The following code works fine in Chrome:
var oldSetTimeout = window.setTimeout;
window.setTimeout = function setTimeout(func, delay) {
    var args = Array.prototype.slice.call(arguments, 0);
    args[0] = function timeoutFunction() {
        var timeoutArgs = Array.prototype.slice.call(arguments, 0);
        try {
            func.apply(this, timeoutArgs);
        }
        catch (exception) {
            //Do Error Handling
        }
    }
    return oldSetTimeout.apply(this, args);
}
But in IE7 it turns into a recursive function. For some reason oldSetTimeout gets set to the new function.
Any suggestions?
side note: Yes, I need to do it this way. I am using a pile of 3rd party libraries all of which don't deal with setTimeout well, so I can't just change the calls to setTimeout.
This is because you're using named function expressions, which are incorrectly implemented in IE. Removing the function names will fix the immediate problem. See kangax's excellent article on this subject. However, there's another problem that isn't so easily fixed.
In general, it's not a good idea to attempt to override properties of host objects (such as window, document or any DOM element), because there's no guarantee the environment will allow it. Host objects are not bound by the same rules as native objects and in essence can do what they like. There's also no guarantee that a host method will be a Function object, and hence oldSetTimeout may not always have an apply() method. This is the case in IE, so the call to oldSetTimeout.apply(this, args); will not work.
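To make the missing-apply() problem concrete, here is a small sketch (hostLike is a hypothetical stand-in, since ordinary JavaScript functions always inherit apply): cutting a function's prototype chain leaves it callable but without Function.prototype methods, which is roughly the situation with IE's host methods.

```javascript
// Hypothetical stand-in for an IE host method: still callable, but with
// its prototype chain cut so it does not inherit Function.prototype.apply.
var hostLike = function (fn, delay) {
    fn();      // invoke immediately, just for the demo
    return 42; // pretend timer id
};
Object.setPrototypeOf(hostLike, null);

console.log(typeof hostLike.apply);   // "undefined" -- so hostLike.apply(...) would throw
var id = hostLike(function () {}, 0); // a plain call still works
console.log(id);                      // 42
```

This is why the suggested fix below calls the saved function directly instead of going through apply().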
I'd suggest the following instead:
window.oldSetTimeout = window.setTimeout;
window.setTimeout = function(func, delay) {
    return window.oldSetTimeout(function() {
        try {
            func();
        }
        catch (exception) {
            //Do Error Handling
        }
    }, delay);
};
A minor improvement to Tim Down's answer, to mimic the original setTimeout even more closely by forwarding any extra arguments on to the callback:
window.oldSetTimeout = window.setTimeout;
window.setTimeout = function(func, delay) {
    var args = Array.prototype.slice.call(arguments, 2);
    return window.oldSetTimeout(function() {
        try {
            func.apply(this, args);
        }
        catch (exception) {
            //Do Error Handling
        }
    }, delay);
};
There could be a simple answer for this, but I've only ever had to use extension methods (are they even called that in JS?) in C#.
I have a 3rd party library that uses events. I need a function to be called after the event is called. I could do this easily via a promise but the 3rd party library was written before ES6 and does not have promises built in. Here's some sample code -
wRcon.on('disconnect', function() {
    console.log('You have been Disconnected!')
})
Ideally I would like to be able to implement something like this -
wRcon.on('disconnect', function() {
    console.log('You have been Disconnected!')
}).then(wRcon.reconnect())
So to summarize my question, how do I extend wRcon.on to allow for a promise (or some other callback method)?
Promises and events, while they seem similar on the surface, actually solve different problems in JavaScript.
Promises provide a way to manage asynchronous actions where an outcome is expected, but you just don't know how long it's going to take.
Events provide a way to manage asynchronous actions where something might happen, but you don't know when (or even if) it will happen.
In JavaScript there is no easy way to "interface" between the two.
But for the example you've given, there is an easy solution:
wRcon.on('disconnect', function() {
    console.log('You have been Disconnected!')
    wRcon.reconnect()
})
This is perhaps not as elegant as your solution, and breaks down if your intention is to append a longer chain of .then() handlers on the end, but it will get the job done.
You could wrap the connection in a way that would create a promise:
const promise = new Promise((resolve) => {
    wRcon.on('disconnect', function() {
        resolve();
    });
}).then(() => wRcon.reconnect());
However, this does not seem like an ideal place to use Promises. Promises define a single flow of data, not a recurring event-driven way to deal with application state. This setup would only reconnect once after a disconnect.
Three/four options for you, with the third/fourth being my personal preference:
Replacing on
You could replace the on on the wRcon object:
const original_on = wRcon.on;
wRcon.on = function(...args) {
    original_on.apply(this, args);
    return Promise.resolve();
};
...which you could use almost as you showed (note the _ =>):
wRcon.on('disconnect', function() {
    console.log('You have been Disconnected!');
}).then(_ => wRcon.reconnect());
...but I'd be quite leery of doing that, not least because other parts of wRcon may well call on and expect it to have a different return value (such as this, as on is frequently a chainable op).
Adding a new method (onAndAfter)
Or you could add a method to it:
const original_on = wRcon.on;
wRcon.onAndAfter = function(...args) {
    original_on.apply(this, args);
    return Promise.resolve();
};
...then
wRcon.onAndAfter('disconnect', function() {
    console.log('You have been Disconnected!');
}).then(_ => wRcon.reconnect());
...but I don't like to modify other API's objects like that.
Utility method (standalone, or on Function.prototype)
Instead, I think I'd give myself a utility function (which is not "thenable"):
const after = (f, callback) => {
    return function(...args) {
        const result = f.apply(this, args);
        Promise.resolve().then(callback).catch(_ => undefined);
        return result;
    };
};
...then use it like this:
wRcon.on('disconnect', after(function() {
    console.log('You have been Disconnected!');
}, _ => wRcon.reconnect()));
That creates a new function to pass to on which, when called, (ab)uses a Promise to schedule the callback (as a microtask for the end of the current macrotask; use setTimeout if you just want it to be a normal [macro]task instead).
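If it helps, here is a minimal sketch of that scheduling difference, assuming a standard event loop: the promise callback runs as a microtask at the end of the current task, before any setTimeout (macrotask) callback.

```javascript
const order = [];

setTimeout(() => order.push('setTimeout (macrotask)'), 0);
Promise.resolve().then(() => order.push('promise (microtask)'));
order.push('synchronous');

setTimeout(() => console.log(order), 10);
// ['synchronous', 'promise (microtask)', 'setTimeout (macrotask)']
```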
You could make after something you add to Function.prototype, a bit like bind:
Object.defineProperty(Function.prototype, "after", {
    value: function(callback) {
        const f = this;
        return function(...args) {
            const result = f.apply(this, args);
            Promise.resolve().then(callback).catch(_ => undefined);
            return result;
        };
    }
});
...and then:
wRcon.on('disconnect', function() {
    console.log('You have been Disconnected!');
}.after(_ => wRcon.reconnect()));
And yes, I'm aware of the irony of saying (on the one hand) "I don't like to modify other API's objects like that" and then showing an option modifying the root API of Function. ;-)
Initially, in AngularJS, I had a function I wanted to call periodically, so I used $setInterval to do this. However, I noticed that even when leaving the page, the function continued to run, completely filling my console with errors until I navigated back to the page associated with the function. I replaced my $setInterval call with a solution I found here. This guy writes a whole new version of setInterval. It appears as:
function interval(func, wait, times) {
    var interv = function(w, t) {
        return function() {
            if (typeof t === "undefined" || t-- > 0) {
                setTimeout(interv, w);
                try {
                    func.call(null);
                } catch(e) {
                    t = 0;
                    throw e.toString();
                }
            }
        };
    }(wait, times);
    setTimeout(interv, wait);
}
And I basically call my function as:
interval($scope.setValue, 100);
This function works exactly as I wish, and our prior problem has been solved. But now readability has become an issue, and I am wondering if there is a way to rewrite this function so that it is easier to read (and possibly has less code) but functions in the same manner?
Here's a take on it, with comments for clarity which make it seem longer although it isn't:
function interval(func, wait, times) {
    // Set `t` to either Infinity or the number of times we're supposed to repeat
    var t = typeof times == "undefined" ? Infinity : times;

    // The tick function we call
    function tick() {
        // Set up the next iteration if appropriate
        // Note that --Infinity is Infinity, so it will never stop in that case
        if (--t > 0) {
            setTimeout(tick, wait);
        }

        // Call the function, canceling further repeats if it throws an exception
        try {
            func.call(null); // Or just func() seems like it should be good enough, really
        } catch(e) {
            t = 0;
            throw e;
        }
    }

    // Start the process
    setTimeout(tick, wait);
}
That preserves a couple of behaviors of the current function which are different from the standard setInterval (I can't speak for Angular's):
It stops trying to call the function if the function fails once
It has a feature letting you specify the max number of times to call the function (which seems handy)
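As a usage sketch (the interval function is repeated from above so the snippet is self-contained), the times argument caps the number of calls:

```javascript
// interval() as rewritten above, condensed
function interval(func, wait, times) {
    var t = typeof times == "undefined" ? Infinity : times;
    function tick() {
        if (--t > 0) {
            setTimeout(tick, wait);
        }
        try {
            func.call(null);
        } catch (e) {
            t = 0;
            throw e;
        }
    }
    setTimeout(tick, wait);
}

var calls = 0;
interval(function () { calls++; }, 5, 3); // run the callback at most 3 times

setTimeout(function () { console.log(calls); }, 100); // logs 3
```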
I have several async functions with varying numbers of parameters, in each the last param is a callback. I wish to call these in order. For instance.
function getData(url, callback){
}
function parseData(data, callback){
}
By using this:
Function.prototype.then = function(f){
    var ff = this;
    return function(){ ff.apply(null, [].slice.call(arguments).concat(f)) }
}
it is possible to call these functions like this, and have the output print to console.log.
getData.then(parseData.then(console.log.bind(console)))('/mydata.json');
I've been trying to use this syntax instead, and cannot get the Then function correct. Any ideas?
getData.then(parseData).then(console.log.bind(console))('/mydata.json');
Implementing a function or library that allows you to chain methods like above is a non-trivial task and requires substantial effort. The main problem with the example above is the constant context changing - it is very difficult to manage the state of the call chain without memory leaks (i.e. saving a reference to all chained functions into a module-level variable -> GC will never free the functions from memory).
If you are interested in this kind of programming strategy I highly encourage you to use an existing, established and well-tested library, like Promise or q. I personally recommend the former as it attempts to behave as close as possible to ECMAScript 6's Promise specification.
For educational purposes, I recommend you take a look at how the Promise library works internally - I am quite sure you will learn a lot by inspecting its source code and playing around with it.
Robert Rossmann is right. But I'm willing to answer purely for academic purposes.
Let's simplify your code to:
Function.prototype.then = function (callback){
    var inner = this;
    return function (arg) { return inner(arg, callback); }
}
and:
function getData(url, callback) {
...
}
Let's analyze the types of each function:
getData is (string, function(argument, ...)) → null.
function(argument, function).then is (function(argument, ...)) → function(argument).
That's the core of the problem. When you do:
getData.then(function (argument) {}), it actually returns a function with the type function(argument). That's why .then can't be called on it, because .then expects to be called on a function(argument, function) type.
What you want to do is wrap the callback function. (In the case of getData.then(parseData).then(f), you want to wrap parseData with f, not the result of getData.then(parseData).)
Here's my solution:
Function.prototype.setCallback = function (c) { this.callback = c; }
Function.prototype.getCallback = function () { return this.callback; }

Function.prototype.then = function (f) {
    var ff = this;
    var outer = function () {
        var callback = outer.getCallback();
        return ff.apply(null, [].slice.call(arguments).concat(callback));
    };
    if (this.getCallback() === undefined) {
        outer.setCallback(f);
    } else {
        outer.setCallback(ff.getCallback().then(f));
    }
    return outer;
}
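A usage sketch of this approach, with synchronous stand-ins for the async functions so the data flow is easy to follow (real getData/parseData would call back asynchronously):

```javascript
// The then/setCallback implementation from above, plus synchronous
// stand-ins for getData/parseData so the whole flow runs immediately.
Function.prototype.setCallback = function (c) { this.callback = c; };
Function.prototype.getCallback = function () { return this.callback; };
Function.prototype.then = function (f) {
    var ff = this;
    var outer = function () {
        var callback = outer.getCallback();
        return ff.apply(null, [].slice.call(arguments).concat(callback));
    };
    if (this.getCallback() === undefined) {
        outer.setCallback(f);
    } else {
        outer.setCallback(ff.getCallback().then(f));
    }
    return outer;
};

// Synchronous stand-ins (hypothetical; real versions would be async)
function getData(url, callback) { callback(url + ":data"); }
function parseData(data, callback) { callback(data + ":parsed"); }

var result;
getData.then(parseData).then(function (r) { result = r; })("/mydata.json");
console.log(result); // "/mydata.json:data:parsed"
```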
This looks like an excellent use for the Promise object. Promises improve reusability of callback functions by providing a common interface to asynchronous computation. Instead of having each function accept a callback parameter, Promises allow you to encapsulate the asynchronous part of your function in a Promise object. Then you can use the Promise methods (Promise.all, Promise.prototype.then) to chain your asynchronous operations together. Here's how your example translates:
// Instead of accepting both a url and a callback, you accept just a url. Rather than
// thinking about a Promise as a function that returns data, you can think of it as
// data that hasn't loaded or doesn't exist yet (i.e., promised data).
function getData(url) {
    return new Promise(function (resolve, reject) {
        // Use resolve as the callback parameter.
    });
}

function parseData(data) {
    // Does parseData really need to be asynchronous? If not leave out the
    // Promise and write this function synchronously.
    return new Promise(function (resolve, reject) {
    });
}

getData("someurl").then(parseData).then(function (data) {
    console.log(data);
});

// or with a synchronous parseData
getData("someurl").then(function (data) {
    console.log(parseData(data));
});
Also, I should note that Promises currently don't have excellent browser support. Luckily you're covered since there are plenty of polyfills such as this one that provide much of the same functionality as native Promises.
Edit:
Alternatively, instead of changing the Function.prototype, how about implementing a chain method that takes as input a list of asynchronous functions and a seed value and pipes that seed value through each async function:
function chainAsync(seed, functions, callback) {
    if (functions.length === 0) return callback(seed);
    functions[0](seed, function (value) {
        chainAsync(value, functions.slice(1), callback);
    });
}

chainAsync("someurl", [getData, parseData], function (data) {
    console.log(data);
});
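A runnable sketch with synchronous stand-ins (double and increment are illustrative): note the early return when the function list is exhausted; without it the recursion would try to call functions[0] on an empty array.

```javascript
function chainAsync(seed, functions, callback) {
    if (functions.length === 0) return callback(seed);
    functions[0](seed, function (value) {
        chainAsync(value, functions.slice(1), callback);
    });
}

// Synchronous stand-ins for async steps (hypothetical)
function double(x, cb) { cb(x * 2); }
function increment(x, cb) { cb(x + 1); }

chainAsync(5, [double, increment], function (result) {
    console.log(result); // 11
});
```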
Edit Again:
The solutions presented above are far from robust, if you want a more extensive solution check out something like https://github.com/caolan/async.
I had some thoughts about this problem and created the following code, which more or less meets your requirements. Still, I know this concept is far from perfect. The reasons are commented in the code and below.
Function.prototype._thenify = {
    queue: [],
    then: function(nextOne) {
        // Push the item to the queue
        this._thenify.queue.push(nextOne);
        return this;
    },
    handOver: function() {
        // Hand over the data to the next function, calling it in the same context (so we don't lose the queue)
        this._thenify.queue.shift().apply(this, arguments);
        return this;
    }
}

Function.prototype.then = function(){ return this._thenify.then.apply(this, arguments) };
Function.prototype.handOver = function(){ return this._thenify.handOver.apply(this, arguments) };

function getData(json) {
    // simulate an asynchronous call
    setTimeout(function(){ getData.handOver(json, 'params from getData'); }, 10);
    // We can't call this.handOver() because a new context is created for every function call.
    // That means you have to do it like this, or bind the context of getData to the function itself,
    // which means every time the function is called you have the same context
}

function parseData() {
    // simulate an asynchronous call
    setTimeout(function(){ parseData.handOver('params from parseData'); }, 10);
    // Here we could use this.handOver, because parseData is called in the context of getData;
    // for clarity reasons I left it like that
}

getData
    .then(function(){ console.log(arguments); this.handOver(); }) // see how we can use this here
    .then(parseData)
    .then(console.log)('/mydata.json'); // Here we actually start the chain by calling the function

// To call the chain in the getData context (so you can always do this.handOver()), do it like this:
// getData
//     .then(function(){ console.log(arguments); this.handOver(); })
//     .then(parseData)
//     .then(console.log).bind(getData)('/mydata.json');
Problems and Facts:
- the complete chain is executed in the context of the first function
- you have to use the function itself to call handOver, at least for the first element of the chain
- if you create a new chain using a function you already used, the two chains will conflict if they run at the same time
- it is possible to use a function twice in the chain (e.g. getData)
- because of the shared context you can set a property in one function and read it in one of the following functions
At least the first problem could be solved by not calling the next function in the chain in the same context, and instead passing the queue as a parameter to the next function. I will try this approach later. That may also solve the conflicts mentioned in the third point.
For the other problems you could use the sample code in the comments.
PS: When you run the snippet, make sure your console is open to see the output.
PPS: Every comment on this approach is welcome!
The problem is that then returns a wrapper for the current function and successive chained calls will wrap it again, instead of wrapping the previous callback. One way to achieve that is to use closures and overwrite then on each call:
Function.prototype.then = function(f){
    var ff = this;

    function wrapCallback(previousCallback, callback) {
        var wrapper = function(){
            previousCallback.apply(null, [].slice.call(arguments).concat(callback));
        };
        ff.then = wrapper.then = function(f) {
            callback = wrapCallback(callback, f); //a new chained call, so wrap the callback
            return ff;
        }
        return wrapper;
    }

    return ff = wrapCallback(this, f); //"replace" the original function with the wrapper and return that
}
/*
 * Example
 */
function getData(json, callback){
    setTimeout( function() { callback(json) }, 100);
}

function parseData(data, callback){
    callback(data, 'Hello');
}

function doSomething(data, text, callback) {
    callback(text);
}

function printData(data) {
    console.log(data); //should print 'Hello'
}

getData
    .then(parseData)
    .then(doSomething)
    .then(printData)('/mydata.json');
Whenever an error occurs inside an event handler, it stops code execution entirely so the second event callback isn't called.
For example:
$(function() {
    window.thisDoesntExist();
});

$(function() {
    //Do something unharmful and unrelated to the first event
});
You can easily solve the problem in this (simplified) example by adding try/catch in both anonymous functions, but in reality these functions often add several other event handlers which in turn would require try/catch. I end up with very repetitive code stuffed with try/catch blocks.
My project has a modular design where each feature is in a different JS file (and gets concatenated during a build process). I'm looking for a more generic way to handle errors inside each feature so that an error doesn't stop code execution of the other features.
I already tried following solutions:
- window.onerror (even if you return true in this function, code execution is stopped)
- $(window).error() => deprecated and code execution stops
You could create a helper function to prevent duplication of the same boilerplate code.
function tryFunction(f, onerror) {
    try {
        if (typeof f == 'function') {
            return f();
        }
    } catch (e) {
        return onerror(e);
    }
}
$(function() {
    var result = tryFunction(window.thisDoesNotExist, function (error) {
        alert('Whoops: ' + error);
    });
});
I created a little demonstration. It's slightly different but the same idea.
You can simply check if (typeof myFunction == 'function') before calling myFunction().
Optionally, wrap it in a generic function, as Bart suggested, so you can also log an error to the console if the function does not exist.
If your web app is huge, with many interactions and a lot of JS, too many try/catch blocks could hurt the overall performance of your application.
I would try something like the following: a wrapper which handles the try/catch for you (see below, or this jsfiddle: http://jsfiddle.net/TVfCj/2/).
From the way I'm (not really) handling this and arguments, I guess it's obvious I'm a JS beginner. But I hope you get the idea, and that it's correct/useful.
var wrapper = {
    wrap: function wrap(f) {
        return function () {
            try {
                f.apply(null, arguments);
            } catch (ex) {
                console.log(f.name + " crashed with args " + Array.prototype.slice.call(arguments).join(", "));
            }
        };
    }
};

var f1 = function f1Crashes(arg) {
    return window.thisDoesntExist();
};

var f2 = function f2Crashes(arg) {
    return window.thisDoesntExist();
};

var f3 = function f3MustNotCrash(arg) {
    wrapper.wrap(f1)(arg);
    wrapper.wrap(f2)(arg);
};

f3('myarg');
The try-catch pattern you mention attempting in your question is the correct way - you want try-catch blocks, not a way to silently truck through module errors (in general always be extremely careful handling exceptions globally and continuing, that way lies data corruption bugs you only find 6 months later).
Your real problem is this:
... in reality these functions often add several other event handlers which in turn would require try/catch. I end up with very repetitive code stuffed with try/catch blocks.
The fix for that is Promise. This is a new structure, native in most browsers but easily shimmed in the slow ones (ahem, IE), that gives you a standard way of managing both the event callback and the exception from the event.
With a Promise your code makes a promise to always do something: either resolve/succeed or reject/fail.
function moduleA() {
    return new Promise(function (resolve, reject) {
        try {
            var result = window.thisDoesntExist();
            resolve(result); // Success!
        }
        catch (err) {
            reject(err); // Fail!
        }
    });
}
This is better because rather than nest try-catch blocks in each callback you can instead chain promises:
moduleA().
    then(moduleB).
    then(moduleC).
    catch(errorHandler); // Catch any error from A, B, or C
You can also handle an error and continue:
moduleA().
    catch(continuableErrorHandler). // Catch any error from A
    then(moduleB).
    then(moduleC).
    catch(errorHandler); // Catch any error from B or C
You'll still need lots of try-catch blocks in callbacks, but anything that has been wrapped in a Promise can be treated in the same modular way.
Coming next in JS is async and await, but you can use them now with a transpiler. These use promises to make code that is much easier to read, and most importantly (for you) have a single try-catch at the top that gathers exceptions from the entire Promise chain.
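As a sketch of that shape (moduleA and moduleB here are hypothetical stand-ins that return promises; async functions are now native in modern engines), a single try/catch collects failures from the whole chain:

```javascript
// Hypothetical modules returning promises
async function moduleA() { return "A ok"; }
async function moduleB() { throw new Error("B failed"); }

async function main() {
    const log = [];
    try {
        log.push(await moduleA());
        log.push(await moduleB()); // throws; jumps to the catch below
    } catch (err) {
        log.push("caught: " + err.message);
    }
    return log;
}

main().then(log => console.log(log)); // ["A ok", "caught: B failed"]
```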
This answer is already too long, but I've blogged about that in more detail.
TL;DR: If your problem is "very repetitive [event callback] code stuffed with try/catch blocks" try using Promise instead.
I found a solution. When using setTimeout, the callback runs later as its own task in the event loop (JavaScript is single-threaded, so it's not actually a separate thread), so an exception thrown inside it won't break any other parts of the webpage.
$(function() {
    setTimeout(function() {
        window.thisDoesntExist();
    }, 0);
});

$(function() {
    setTimeout(function() {
        //Do something unharmful and unrelated to the first event
        alert("This passes")
    }, 0);
});
In this example, the second function is run, even when the first one throws an error.
Here's a working example: http://jsfiddle.net/mathieumaes/uaEsy/
In the following code, I intentionally throw an error, but in Chrome (used simply for testing purposes) it does not roll up to the catch. How can I roll up the error into the parent's scope?
try {
    setTimeout(function() {
        console.log("Throwing Error...");
        throw({message:"Ouch!"});
    }, 500);
} catch(e) {
    console.log(e.message);
}
Chrome replies with:
Uncaught #<Object>
(anonymous function)
Here is the full example I'm working with; when I require "bob" it (intentionally) times out. I want to catch the requirejs error so I could use my application's error system, which is more robust, to notify the learner.
(function() {
    try {
        var scriptVersion = "1.0.0.1";
        window.onload = function() {
            var script = document.createElement("script");
            script.type = "text/javascript";
            script.src = "//content.com/pkg/" + scriptVersion + "/require-jquery.js";
            script.async = false;
            script.done = false;
            // OnReadyStateChange for older IE browsers
            script.onload = script.onreadystatechange = function() {
                if (!(this.done) && (!this.readyState || this.readyState == "loaded" || this.readyState == "complete")) {
                    this.done = true;
                    require.config({
                        baseUrl: "//content.com/pkg/" + scriptVersion
                    });
                    require(["bob"]);
                }
            };
            document.getElementsByTagName("head")[0].appendChild(script);
        };
    } catch(e) {
        console.log(e);
    }
})();
See the edit below for how to solve the actual problem with requireJS.
The problem is that the setTimeout() function runs in the parent's scope and completes without error. It schedules (with the system) a future callback event, but when that callback occurs in the future, the parent's scope of execution has finished and the callback is initiated from the system at the top level much like a new system event (e.g. a click event handler).
While the parent closure still exists because the anonymous function inside the setTimeout() can still reference those variables, the actual execution of the parent scope is done, thus the scope of the try/catch is done.
The execution context of the setTimeout() anonymous function is top level (initiated by the system) so there is no parent context that you can put a try/catch in. You can put a try/catch within the anonymous function, but throwing from there will just go back to the system which is what called the setTimeout() callback.
To have your own code catch any exceptions that occur inside the setTimeout() callback, you will need to put a try/catch inside the callback.
setTimeout(function() {
    try {
        console.log("Throwing Error...");
        throw({message:"Ouch!"});
    } catch(e) {
        console.log(e.message);
    }
}, 500);
If you explained what the real problem is that you're trying to solve (rather than this manufactured test case), we may be able to offer some useful options.
Edit now that you've shown what problem you're really trying to solve. The require.js library initiates every error by calling the onError method. The default implementation of the onError method is what throws the exception. You can assign your own onError handler and handle the errors in a callback rather than with exceptions. This sounds like the right way to go.
From the requirejs source:
/**
 * Any errors that require explicitly generates will be passed to this
 * function. Intercept/override it if you want custom error handling.
 * @param {Error} err the error object.
 */
req.onError = function (err) {
    throw err;
};
Your throw happens some time after the catch block, when the browser calls the setTimeout callback.
(A try/catch only catches exceptions thrown while the try block is still executing; it does not lexically capture callbacks that are merely scheduled from inside it.)
The previous answerer explained it correctly.
Another way of thinking about it: it doesn't work because setTimeout completes fine and does not throw an exception when it is initially run. The callback then executes later, when you are no longer within the try-catch block.
It will work if you put the try catch inside the setTimeout function like this:
setTimeout(function() {
    try {
        console.log("Throwing Error...");
        throw({message:"Ouch!"});
    } catch(e) {
        console.log(e.message);
    }
}, 500);
Let me know if you still have questions.
Use a wrapper function like this:
// wrapper function
var tryable = function(closure, catchCallback) {
    closure(function(callback) {
        return function() {
            try {
                callback();
            } catch(e) {
                catchCallback(e);
            }
        };
    });
};

function throwException() {
    throw new Error("Hi");
}

tryable(function(catchable) {
    setTimeout(catchable(throwException), 1000);
}, function(e) {
    console.log("Error:)", e);
});