I know both IE and Firefox have limits on JavaScript execution (Source 1, Source 2). IE's limit is based on the number of statements executed (I heard somewhere it is 5 million), while Firefox's is based on elapsed time: 10 seconds by default for my version.
The thing I don't get is what cases will go over these limits:
I'm sure a giant loop will go over the limit for execution time
But will an event handler go over the limit if each individual execution stays under the limit but the handler fires multiple times?
Example:
Let's say I have a timer on my page that executes some JavaScript every 20 seconds, and the timer handler takes 1 second to run. Do Firefox and IE treat each call of the timer function separately, so it never goes over the limit? Or do they add up the time of each call, so that after 200 seconds on my site (with the timer having fired 10 times) an error occurs even though the handler itself only takes 1 second?
The following article by Nicholas C. Zakas discusses how and when different browsers interrupt long running JavaScript code:
What determines that a script is long-running?
Breaking long processing code into small chunks and launching them with timers is in fact one way to get around this problem. The following Stack Overflow post suggests a method to tackle this:
Show javascript execution progress
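As a rough sketch of that chunking idea (processItem, chunkSize and the sample call at the end are placeholders, not anything from the linked post), the work can be split into slices that each run in their own timer tick, so no single script run exceeds the browser's limit:
// Process a large array in small slices so the UI thread is never blocked for long
function processInChunks(items, processItem, chunkSize) {
    var index = 0;

    function doChunk() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            processItem(items[index]);
        }
        if (index < items.length) {
            // Yield back to the browser before continuing with the next slice
            setTimeout(doChunk, 0);
        }
    }

    doChunk();
}

processInChunks(hugeArray, handleItem, 500); // hypothetical data and handler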
On the other hand, web workers are better suited for long-running processing, since they run on a background thread and therefore do not block the UI thread:
Mozilla Dev Center: Using web workers
John Resig: Computing with JavaScript Web Workers
Nicholas C. Zakas: Experimenting with web workers
However, web workers are not yet supported in Internet Explorer, and they do not have access to the DOM.
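For illustration, a minimal worker setup might look like the following; the file name worker.js and the message format are just assumptions for the sketch:
// main.js: spawn the worker and hand it the data to crunch
var worker = new Worker('worker.js');
worker.onmessage = function (event) {
    console.log('Result from worker:', event.data);
};
worker.postMessage({ numbers: [1, 2, 3, 4, 5] });

// worker.js: runs off the UI thread and has no DOM access
onmessage = function (event) {
    var sum = event.data.numbers.reduce(function (a, b) { return a + b; }, 0);
    postMessage(sum);
};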
The event handler is considered a new execution context - the time limit is reset.
If you need to do even more computation, take a look at WebWorkers.
I too got stuck on this kind of JS error. I also found a solution, unfortunately only for IE, which is to edit the registry (via regedit) and raise the number of JS statements allowed. Read more here: http://support.microsoft.com/?kbid=175500.
But this doesn't seem like a good solution to me, because if you are developing a web application you cannot ask every user to use only IE and edit their system registry.
As a workaround: if the data to process on the client is really large, you can insert small delays between fixed-size batches of work. For example, if there are 1,000 items to process, you can break them into 10 sets of 100 and process one set every 100 ms or so. To see how to do that, follow the link below:
http://www.nczonline.net/blog/2009/01/13/speed-up-your-javascript-part-1/
I am creating a WebExtension for browsers, and that is how I found out about the browser.alarms API. It basically allows you to set a (recurring or one-time) alarm, and a callback will be fired.
Now, we've had such a feature in JavaScript for a long time in the form of setTimeout and setInterval. So what is the difference from those? Why, or in what cases, might I prefer one over the other?
I mean, the main difference is obvious: you can only schedule an alarm with minute granularity, not seconds. Then again, you can also unregister and re-register it with millisecond precision, but I think the API is intended for longer periods, i.e. minutes. (I am just guessing here.)
So why should I use it instead of a simple setInterval/setTimeout callback?
The setTimeout/setInterval delay is limited to 2^31-1 = 2147483647 ms, i.e. roughly 24 days. Values less than 0 or larger than that are cast into the int32 range, which can produce unexpected results.
setTimeout/setInterval is part of the standard DOM, not the isolated world, so when you use it inside a content script, the web page's script can accidentally clear it via clearTimeout/clearInterval.
Workaround: post a message to the background script so it sets the timer and sends a response upon finishing (see the sketch after this list).
Event pages (those that have "persistent": false in manifest.json) won't wait for setTimeout/setInterval before unloading due to inactivity, and they won't wake up for such a timer, so you can only use these functions for very short delays (currently event pages are guaranteed to live for 5 seconds).
Within these designated limits you can safely use setTimeout/setInterval.
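A rough sketch of that messaging workaround, using Firefox's promise-based browser.* API (the message type "start-timer" and the 5-second delay are made up for the example):
// content-script.js: ask the background script to own the timer
browser.runtime.sendMessage({ type: 'start-timer', delayMs: 5000 })
    .then(function (response) {
        console.log('Timer finished:', response);
    });

// background.js: the page cannot clear this timer
browser.runtime.onMessage.addListener(function (message) {
    if (message.type === 'start-timer') {
        // Returning a Promise keeps the message channel open until we respond
        return new Promise(function (resolve) {
            setTimeout(function () {
                resolve('done');
            }, message.delayMs);
        });
    }
});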
In addition to what has been posted, the alarm API seems to be more reliable:
if you set an alarm for a time in the past, it will fire right away
if you set an alarm for a future date and your PC wakes up from hibernation after that date, it will fire right after waking up
if you set an alarm for 8 hours from now, it will fire 8 hours from now (if your PC is on), no matter how long your PC was sleeping or hibernating in the meantime
See https://discourse.mozilla.org/t/how-reliable-are-alarms/40978/8?u=rugkx, thanks to Juraj Masiar from the Mozilla Community.
To quote from the documentation on MDN:
This is like setTimeout() and setInterval(), except that those functions don't work with background pages that are loaded on demand.
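For illustration, a minimal use of the alarms API might look like this (the alarm name "refresh-data" and the 5-minute period are arbitrary choices for the sketch, and the "alarms" permission must be declared in manifest.json):
// background.js: fires roughly every 5 minutes, even if the event page
// has been unloaded in the meantime
browser.alarms.create('refresh-data', { periodInMinutes: 5 });

browser.alarms.onAlarm.addListener(function (alarm) {
    if (alarm.name === 'refresh-data') {
        // do the periodic work here
    }
});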
Chrome keeps killing the page in the middle of my connect-four browser game, even though the game itself runs properly. The game is a player-vs-computer setup, and the game itself runs fine and never crashes. The page only crashes when I set the number of iterations too high for training the computer opponent. The program trains the AI using a Q-learning algorithm, where it plays itself and stores a value for each encountered state. If I set the number of iterations to about 125,000 or less, then everything works fine (except the opponent is not so great). I cannot tell whether it is the running time of the loop (it would take about 30 minutes to run) that kills the program, or something else such as memory constraints from recording states and their corresponding Q-values.
How can I get the program to run for more training iterations without chrome killing the page?
You've got a couple of options on how to handle your code.
Option 1: setInterval / setTimeout
As others have suggested, using setInterval or setTimeout can run your code in "chunks" and no one chunk will cause a timeout.
Option 2: setInterval + generators
With deeply nested code, it is very difficult to properly re-enter the code using setTimeout alone.
Read up on generators: they make running code in chunks much nicer, though it may take some redesign.
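A sketch of that approach (trainOneIteration is a stand-in for one Q-learning self-play game, and the iteration counts are arbitrary): the generator keeps the loop state between chunks, so the nested code never has to be re-entered manually.
// Generator: keeps the loop state between chunks
function* trainingLoop(iterations) {
    for (var i = 0; i < iterations; i++) {
        trainOneIteration(); // placeholder for one Q-learning self-play game
        yield i;
    }
}

// Drive the generator in chunks so the page stays responsive
function runInChunks(generator, chunkSize) {
    function step() {
        for (var i = 0; i < chunkSize; i++) {
            if (generator.next().done) {
                return; // training finished
            }
        }
        setTimeout(step, 0); // let the browser breathe, then continue
    }
    step();
}

runInChunks(trainingLoop(500000), 1000);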
Option 3: webworkers
Web workers provide another way, depending on what you are calculating. They run in the background and don't have access to the DOM or anything else, but they are great at calculation.
Option 4: nodejs
Your last option is to move away from the browser and run in another environment such as Node.js. If you are running under Windows, HTA files may be another option.
I need to perform many "setTimeouts" of 60 seconds each. Basically, I'm creating a database record, and 60 seconds from now I need to check whether the database record was changed.
I don't want to implement a "job queue" since it's such a simple thing, and I definitely need to check it around the 60 second mark.
Is it reliable, or will it cause issues?
When you use setTimeout or setInterval the only guarantee that you get is that the code will not be executed before the programmed time.
It can, however, start somewhat later, because other code may still be executing when the clock ticks (in other words, other code will not be interrupted in the middle of handling an event just to process a timeout or interval event).
If you don't have long blocking processing in your code, timed events will be reasonably accurate. If you are instead making long blocking calls, then Node is probably not the right tool (it's designed around the idea of avoiding blocking synchronous calls).
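So for a one-off check like this, a plain setTimeout is fine. A minimal sketch (checkRecord is a placeholder for whatever database lookup you do):
// Schedule a single check 60 seconds after the record is created
function scheduleCheck(recordId) {
    setTimeout(function () {
        checkRecord(recordId); // placeholder: query the DB and compare
    }, 60 * 1000);
}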
You should try WorkerTimer.js; it is better suited for handling background processes and more accurate than the traditional setInterval or setTimeout.
It is available as an npm package for Node.js:
https://www.npmjs.com/package/worker-timer
I am creating a web app that allows users to manage a calendar (CRUD events, tasks, reminders etc...)
And I am trying to implement a feature where they will receive a popup reminder x minutes before the event/task. From my understanding, there is really only one way to do this with JavaScript:
On login, check for any upcoming events in the database (say in the next 12 hours) and create a setTimeout for the next event, when that setTimeout executes, check again for next event and so on...
My question is, will having multiple setTimeouts (10+) running in the background during user interaction slow down the performance of my app?
Is there a better way to handle popup notifications on the client side? Push Notifications? Any suggestions would be greatly appreciated!
My question is, will having multiple setTimeouts (10+) running in the background during user interaction slow down the performance of my app?
In those numbers, no. (Depending on how big the "+" in "10+" is, of course; I expect a million probably would be an issue.)
The other approach would be to have a single timer that you use (say, per minute) to check for notifications that should occur as of that minute. E.g.:
function notifyForThisMinute() {
    // Notify user of things we should notify them of as of this minute
    // ...

    // Schedule next check for beginning of next minute; always wait
    // until we're a second into the minute to make the checks easier
    setTimeout(notifyForThisMinute, (61 - new Date().getSeconds()) * 1000);
}
notifyForThisMinute(); // First call starts the process
This depends on the browser (or, more specifically, its JavaScript engine) and apparently even on the OS.
Neil Thomas (while working on GMAIL mobile) and John Resig have analyzed timers.
One of the more important things to look out for is how often the timer fires in a given interval (say, every 200 ms versus once every 10 minutes).
Thomas:
With low-frequency timers - timers with a delay of one second or more - we could create many timers without significantly degrading performance on either [an Android G1 or iPhone 3G]. Even with 100 timers scheduled, our app was not noticeably less responsive. With high-frequency timers, however, the story was exactly the opposite. A few timers firing every 100-200 ms was sufficient to make our UI feel sluggish.
Thomas:
Keep in mind that this code is going to execute many times every second. Looping over an array of registered callbacks might be slightly "cleaner" code, but it's critical that this function execute as quickly as possible. Hardcoding the function calls also makes it really easy to keep track of all the work that is being done within the timer.
Resig:
Once you start moving into the range of 64-128 simultaneous timers, you’re pretty much out of luck in most browsers.
One might also have a look at Chronos
I wonder if this code is going to put load on the client since the timeout is so long?
// and update this again in a bit
setTimeout(function() {
    updateWeather(lat, lng);
}, 60000);
Not that code alone. The system idles for that one minute. As long as updateWeather doesn't have severe performance issues and the interval isn't too short, setTimeout won't be a problem (and I believe you mean setInterval, not setTimeout, for recurring checks).
The 60 second timer is implemented by magic in the OS: it basically adds no CPU load during the 60 seconds it is waiting.
I guess updateWeather() is polling an external resource, so the answer to your question is a simple "no, it is fine". (As the weather does not change that often, I'd make it 5 minutes instead, see comment below on battery life.) (Even better: see if the weather data provider gives you a field telling you when the next update will be, and use a setTimeout based on that.)
In other situations, for instance if you have been collecting some kind of data for those 60 seconds, and then go and process it in one go, this could cause a peak of heavy load. Which might be noticed by the user (e.g. all animations go jerky once every 60 seconds). In that case it is better to use a 5 second timer, and process 5 seconds worth of data at a time.
Conversely, if you are using the network on a mobile device, you have to consider battery life: wake-up-and-find-a-connection can dominate. So extending a polling time from 60 seconds to 120 seconds can literally double battery life. Ilya Grigorik has a very good chapter on this in his book, High Performance Browser Networking, http://shop.oreilly.com/product/0636920028048.do
...here it is: http://chimera.labs.oreilly.com/books/1230000000545/ch08.html#ELIMINATE_POLLING
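Picking up the earlier suggestion about the provider's own next-update field, a sketch might look like this (nextUpdate is a hypothetical field in the response, updateWeather is assumed to return a promise with the parsed data, and the 5-minute floor is an arbitrary battery-friendly choice):
function pollWeather(lat, lng) {
    updateWeather(lat, lng).then(function (data) {
        // nextUpdate: hypothetical timestamp (ms) of the provider's next refresh;
        // never poll more often than every 5 minutes to spare the battery
        var waitMs = Math.max(data.nextUpdate - Date.now(), 5 * 60 * 1000);
        setTimeout(function () {
            pollWeather(lat, lng);
        }, waitMs);
    });
}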
One thing to keep in mind, in addition to the other answers, is the effect that closing over in-scope variables might have on memory. If you've got a gajillion objects being referenced by variables in the scope(s) above your setTimeout callback, those objects may live as long as the callback is queued. That is not something we can tell from the code you posted, so no definitive answer can be given to your question. Also, if your setTimeout is being called multiple times, it will create a closure for each call, effectively duplicating the environment and taking up heap space.
The answer to your question depends on:
The device you execute this on
The environment - how many times do you execute this
A standalone setTimeout would not hurt the browser; however, if you are doing the same thing repeatedly, it would make a difference.
Have a look at the following website for more insights...
http://ejohn.org/blog/analyzing-timer-performance/