Simple question really: I'm running a bunch of timeouts, but I want to make sure they don't slow the page down and that they aren't somehow kept in memory after they've executed.
$projects.each(function(index) {
    var $this = $(this);
    window.setTimeout(function() {
        // animate
    }, 300 * index);
});
// Clear timeouts?
My guess is that they're destroyed once they've run but just want to follow best practice.
No, you don't need to clear them; a timeout fires once and is then discarded. Interval timers (via setInterval()), yes, if you want them to stop.
It's harmless to clear a timeout that doesn't need to be cleared. That is, if you do clear one after it has run, browsers won't complain.
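If you ever did want to cancel the pending animations (say, the element is removed mid-sequence), a minimal sketch, reusing $projects from your question, is to collect the IDs and clear them all:

var timeoutIds = [];
$projects.each(function(index) {
    var $this = $(this);
    timeoutIds.push(window.setTimeout(function() {
        // animate
    }, 300 * index));
});

// Cancel whatever hasn't fired yet; clearing an already-fired ID is a no-op.
$.each(timeoutIds, function(i, id) {
    window.clearTimeout(id);
});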
Related
I'm writing a "Game of Life" in javascript. I have all the logic done in a function called doGeneration(). I can repeatedly call this from the console and everything goes as planned, however, if I put it in a while loop the execution blocks the UI and I just see the end result (eventually).
while (existence) {
    doGeneration();
}
If I add a setTimeout(), even with a generation limit of say 15, the browser actually crashes (Canary, Chrome).
while (existence) {
    setTimeout(function() {
        doGeneration();
    }, 100);
}
How can I call doGeneration() once every second or so without blocking the DOM/UI?
You want setInterval
var intervalId = setInterval(function() {
    doGeneration();
}, 1000);

// call this to stop it
clearInterval(intervalId);
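If you want it to stop on its own once existence goes false (your 15-generation limit, for example), a small variation of the above, assuming the doGeneration and existence from your question:

var intervalId = setInterval(function() {
    if (!existence) {
        clearInterval(intervalId);
        return;
    }
    doGeneration();
}, 1000);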
I would use requestAnimationFrame(doGeneration). The idea of this function is to let the browser decide at what interval the game logic or animation is executed, which comes with potential benefits, such as callbacks pausing while the tab is in the background.
https://hacks.mozilla.org/2011/08/animating-with-javascript-from-setinterval-to-requestanimationframe/
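requestAnimationFrame fires roughly once per frame, so to run a generation only about once per second you would throttle it yourself. A minimal sketch, assuming the doGeneration and existence from the question:

var lastTick = 0;
function loop(timestamp) {
    // timestamp is supplied by the browser, in milliseconds
    if (timestamp - lastTick >= 1000) {
        lastTick = timestamp;
        doGeneration();
    }
    if (existence) {
        requestAnimationFrame(loop);
    }
}
requestAnimationFrame(loop);

A nice side effect is that browsers stop firing animation frames for background tabs, so the simulation pauses automatically when the tab is hidden.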
Rather than using setInterval or setTimeout and assuming some arbitrary interval will be enough for the UI to update, you could make doGeneration smart enough to call itself after the DOM has been updated, as long as the existence condition is satisfied.
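A rough sketch of that idea: each generation schedules the next itself, yielding to the browser in between (again assuming doGeneration and existence from the question):

function runGeneration() {
    doGeneration(); // compute and render one generation
    if (existence) {
        setTimeout(runGeneration, 1000); // yield to the UI, then continue
    }
}
runGeneration();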
I'm prerendering my HTML pages for the search engine bots via PhantomJS through Selenium, so that they can see the fully loaded content. Currently, once PhantomJS reaches the page, I wait 5 seconds to be sure everything has loaded.
Instead of waiting those 5 seconds every time, one solution I'm contemplating is to wait until an html-ready attribute on the <body /> tag is set to true:
<html ng-app>
    <head>...</head>
    <body html-ready="{{htmlReady}}">
        ...
    </body>
</html>
.controller("AnyController", function($scope, $rootScope, AnyService) {
$rootScope.htmlReady = false;
AnyService.anyLongAction(function(anyData) {
$scope.anyData = anyData;
$rootScope.htmlReady = true;
});
})
The question is: will the html-ready attribute always be set to true after every view update has been applied (e.g. displaying the anyData)? In other words, is it possible that, for a brief lapse, the html-ready attribute is true while the page is not fully rendered yet? If so, how can that be handled?
Setting the flag should be done after the digest cycle; that way it has a better chance of working as expected.
AnyService.anyLongAction(function(anyData) {
    $scope.anyData = anyData;
    $timeout(function () {
        $rootScope.htmlReady = true;
    }, 0, false);
});
But on its own it is of little use for the app as a whole: you would have to watch for changes in every single place, and Angular doesn't offer anything to make that task easier.
Fortunately, you are free to abstract from Angular and keep it simple.
var ignoredElements = [];
ignoredElements = ignoredElements.concat($('.continuously-updating-widget').toArray());
var delay = 200; // add to taste
var timeout;

var ready = function () {
    $('body').off('DOMSubtreeModified');
    clearTimeout(timeoutLimit);
    alert('ready');
};

$('body').on('DOMSubtreeModified', function (e) {
    if (ignoredElements.indexOf(e.target) < 0) {
        clearTimeout(timeout);
        timeout = setTimeout(ready, delay);
    }
});

var timeoutLimit = setTimeout(ready, 5000);
Feel free to angularify it if needed, though it isn't production code anyway.
It is a good idea to put the handler into a throttle wrapper function (the event fires constantly). If the page makes remote requests that could potentially exceed the timeout delay, it may be better to combine this approach with promises from the async services and resolve them together with $q.all. Still, it's much better than looking after every single directive and service.
DOMSubtreeModified is considered obsolete (it was never really standardized; MutationObserver is recommended instead), but current versions of FF and Chrome support it, and it should be OK for Selenium.
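For reference, the same debounce logic with MutationObserver would look roughly like this (a sketch reusing ignoredElements, timeout, delay and ready from the snippet above; ready would call observer.disconnect() instead of unbinding the event):

var observer = new MutationObserver(function (mutations) {
    // Ignore mutations that only touch the ignored elements.
    var relevant = mutations.some(function (mutation) {
        return ignoredElements.indexOf(mutation.target) < 0;
    });
    if (relevant) {
        clearTimeout(timeout);
        timeout = setTimeout(ready, delay);
    }
});
observer.observe(document.body, {
    childList: true,
    subtree: true,
    attributes: true,
    characterData: true
});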
Short answer
No. It isn't guaranteed that your markup will be completely rendered when html-ready is set.
Long answer
To the best of my knowledge it's not possible to accurately determine when Angular has finished updating the DOM after the model changed. In general it happens very fast and it doesn't take more than a few cycles to finish, but that's not always the case.
Correctly detecting when a page has finished loading/rendering is actually quite a challenge, and if you take a look at the source code of specialized tools, like prerender, you'll see that they use several different checks in order to try to decide whether a page is ready or not. And even so it doesn't work 100% of the time (Phantom may crash, a request may take longer than usual to complete, and so on).
If you really want to come up with your own solution for this problem, I suggest that you take a look at prerender's source code (or another similar project) to get some inspiration.
When looking to improve a page's performance, one technique I haven't heard mentioned before is using setTimeout to prevent javascript from holding up the rendering of a page.
For example, imagine we have a particularly time-consuming piece of jQuery inline with the html:
$('input').click(function () {
    // Do stuff
});
If this code is inline, we are holding up the perceived completion of the page while jQuery is busy attaching a click handler to every input on the page.
Would it be wise to spawn a new thread instead:
setTimeout(function() {
    $('input').click(function () {
        // Do stuff
    });
}, 100);
The only downside I can see is that there is now a greater chance the user clicks on an element before the click handler is attached. However, this risk may be acceptable and we have a degree of this risk anyway, even without setTimeout.
Am I right, or am I wrong?
The actual technique is to use setTimeout with a time of 0.
This works because JavaScript is single-threaded. A timeout doesn't cause the browser to spawn another thread, nor does it guarantee that the code will execute in the specified time. However, the code will be executed when both:
The specified time has elapsed.
Execution control is handed back to the browser.
Therefore calling setTimeout with a time of 0 can be considered as temporarily yielding to the browser.
This means if you have long running code, you can simulate multi-threading by regularly yielding with a setTimeout. Your code may look something like this:
var batches = [...]; // Some array
var currentBatch = 0;

// Start long-running code, whenever browser is ready
setTimeout(doBatch, 0);

function doBatch() {
    if (currentBatch < batches.length) {
        // Do stuff with batches[currentBatch]
        currentBatch++;
        setTimeout(doBatch, 0);
    }
}
Note: While it's useful to know this technique in some scenarios, I highly doubt you will need it in the situation you describe (assigning event handlers on DOM ready). If performance is indeed an issue, I would suggest looking into ways of improving the real performance by tweaking the selector.
For example if you only have one form on the page which contains <input>s, then give the <form> an ID, and use $('#someId input').
setTimeout() can be used to improve the "perceived" load time -- but not the way you've shown it. Using setTimeout() does not cause your code to run in a separate thread. Instead setTimeout() simply yields the thread back to the browser for (approximately) the specified amount of time. When it's time for your function to run, the browser will yield the thread back to the javascript engine. In javascript there is never more than one thread (unless you're using something like "Web Workers").
So, if you want to use setTimeout() to improve performance during a computation-intensive task, you must break that task into smaller chunks, and execute them in-order, chaining them together using setTimeout(). Something like this works well:
function runTasks(tasks, idx) {
    idx = idx || 0;
    tasks[idx++]();
    if (idx < tasks.length) {
        setTimeout(function() { runTasks(tasks, idx); }, 1);
    }
}
runTasks([
    function() {
        /* do first part */
    },
    function() {
        /* do next part */
    },
    function() {
        /* do final part */
    }
]);
Note:
The functions are executed in order. There can be as many as you need.
When the first function returns, the next one is called via setTimeout().
The timeout value I've used is 1. This is sufficient to cause a yield, and the browser will take the thread if it needs it, or allow the next task to proceed if there's time. You can experiment with other values if you feel the need, but usually 1 is what you want for these purposes.
You are correct, there is a greater chance of a "missed" click, but with a low timeout value, its pretty unlikely.
I need some help here...
Is it possible to cancel the chained .delay()?
Mn.Base.TopBox.show = function(timedur) {
    $('#element').fadeIn().delay(timedur).fadeOut();
};

Mn.Base.TopBox.cancelFadeout = function() {
};
I read about queuing and tried some different approaches, but I had no success...
$('#element').stop();
$('#element').queue('fx', []);
Thanks in advance,
Pedro
It isn't. .delay() doesn't play well with anything else, since the timer keeps ticking and a .dequeue() is executed when it's up... regardless of whether you cleared the queue and added a whole new one.
It's better to use setTimeout() directly if you intend to cancel, for example:
Mn.Base.TopBox.show = function(timedur) {
    $('#element').fadeIn(function() {
        var elem = $(this);
        $.data(this, 'timer', setTimeout(function() { elem.fadeOut(); }, timedur));
    });
};

Mn.Base.TopBox.cancelFadeout = function() {
    clearTimeout($('#element').stop().data('timer'));
};
What this does is set the timer and store it using $.data(). When cancelling, we both call .stop() to halt any animation in progress and clear that timer.
There's still the potential here for issues if you're firing this very rapidly, in which case you'd want to switch to storing an array of delays, and clear them all.
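For completeness, a sketch of that variant: keep every pending timer ID in an array on the element and clear them all when cancelling:

Mn.Base.TopBox.show = function(timedur) {
    $('#element').fadeIn(function() {
        var elem = $(this);
        var timers = elem.data('timers') || [];
        timers.push(setTimeout(function() { elem.fadeOut(); }, timedur));
        elem.data('timers', timers);
    });
};

Mn.Base.TopBox.cancelFadeout = function() {
    var elem = $('#element').stop();
    $.each(elem.data('timers') || [], function(i, id) {
        clearTimeout(id);
    });
    elem.data('timers', []);
};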
I have a function called save(), this function gathers up all the inputs on the page, and performs an AJAX call to the server to save the state of the user's work.
save() is currently called when a user clicks the save button, or performs some other action which requires us to have the most current state on the server (generate a document from the page for example).
I am adding in the ability to auto save the user's work every so often. First I would like to prevent an AutoSave and a User generated save from running at the same time. So we have the following code (I am cutting most of the code and this is not a 1:1 but should be enough to get the idea across):
var isSaving = false;
var timeoutId;
var timeoutInterval = 300000;

function save(showMsg)
{
    // Don't save if we are already saving.
    if (isSaving)
    {
        return;
    }
    isSaving = true;
    // Disables the autoSave timer so if we are saving via some other method
    // we won't kick off the timer.
    disableAutoSave();
    if (showMsg) { /* show a saving popup */ }
    params = CollectParams();
    PerformCallBack(params, endSave, endSaveError);
}
function endSave()
{
    isSaving = false;
    // Hides popup if it's visible.
    // Turns auto saving back on so we save x milliseconds after the last save.
    enableAutoSave();
}

function endSaveError()
{
    alert("Ooops");
    endSave();
}

function enableAutoSave()
{
    timeoutId = setTimeout(function() { save(false); }, timeoutInterval);
}

function disableAutoSave()
{
    clearTimeout(timeoutId);
}
My question is: is this code safe? Do the major browsers allow only a single thread to execute at a time?
One thought I had is that it would be worse for the user to click save and get no response because we are autosaving (and I know how to modify the code to handle this). Does anyone see any other issues here?
JavaScript in browsers is single threaded. You will only ever be in one function at any point in time. Functions will complete before the next one is entered. You can count on this behavior, so if you are in your save() function, you will never enter it again until the current one has finished.
Where this sometimes gets confusing (and yet remains true) is when you have asynchronous server requests (or setTimeouts or setIntervals), because then it feels like your functions are being interleaved. They're not.
In your case, while two save() calls will not overlap each other, your auto-save and user save could occur back-to-back.
If you just want a save to happen at least every x seconds, you can do a setInterval on your save function and forget about it. I don't see a need for the isSaving flag.
I think your code could be simplified a lot:
var intervalTime = 300000;
var intervalId = setInterval(function() { save('my message'); }, intervalTime);

function save(showMsg)
{
    if (showMsg) { /* show a saving popup */ }
    params = CollectParams();
    PerformCallBack(params, endSave, endSaveError);
    // You could even reset your interval now that you know we just saved.
    // Of course, you'll need to know it was a successful save.
    // Doing this will prevent the user clicking save only to have another
    // save bump them in the face right away because an interval comes up.
    clearInterval(intervalId);
    intervalId = setInterval(function() { save('my message'); }, intervalTime);
}
function endSave()
{
    // no need for this method
    alert("I'm done saving!");
}

function endSaveError()
{
    alert("Ooops");
    endSave();
}
All major browsers only support one javascript thread (unless you use web workers) on a page.
XHR requests can be asynchronous, though. But as long as you disable the ability to save until the current request to save returns, everything should work out just fine.
My only suggestion is to make sure you indicate to the user somehow when an autosave occurs (disable the save button, etc.).
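For example (a sketch assuming a save button with id="saveButton", which isn't in your code): disable it when a save starts and re-enable it in the callbacks:

function setSaveEnabled(enabled) {
    document.getElementById('saveButton').disabled = !enabled;
}

// In save():                      setSaveEnabled(false);
// In endSave() / endSaveError():  setSaveEnabled(true);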
All the major browsers currently single-thread javascript execution (as long as you don't use web workers, which a few browsers support), so this approach is safe.
For a bunch of references, see Is JavaScript Multithreaded?
Looks safe to me. Javascript is single threaded (unless you are using web workers).
It's not quite on topic, but this post by John Resig covers javascript threading and timers:
http://ejohn.org/blog/how-javascript-timers-work/
I think the way you're handling it is best for your situation. By using the flag you're guaranteeing that the asynchronous calls aren't overlapping. I've had to deal with asynchronous calls to the server as well and also used some sort of flag to prevent overlap.
As others have already pointed out, JavaScript is single threaded, but asynchronous calls can be tricky if you're expecting things to stay the same or not happen during the round trip to the server.
One thing, though: I don't think you actually need to disable the auto-save. If the auto-save fires while a user save is in progress, the save method will simply return and nothing will happen. On the other hand, you're needlessly disabling and re-enabling the autosave on every save. I'd recommend changing to setInterval and then forgetting about it.
Also, I'm a stickler for minimizing global variables. I'd probably refactor your code like this:
var saveWork = (function() {
    var isSaving = false;
    var timeoutId;
    var timeoutInterval = 300000;

    function endSave() {
        isSaving = false;
        // hides popup if it's visible
    }

    function endSaveError() {
        alert("Ooops");
        endSave();
    }

    function _save(showMsg) {
        // Don't save if we are already saving.
        if (isSaving) {
            return;
        }
        isSaving = true;
        if (showMsg) { /* show a saving popup */ }
        params = CollectParams();
        PerformCallBack(params, endSave, endSaveError);
    }

    return {
        save: function(showMsg) { _save(showMsg); },
        enableAutoSave: function() {
            timeoutId = setInterval(function() { _save(false); }, timeoutInterval);
        },
        disableAutoSave: function() {
            clearInterval(timeoutId);
        }
    };
})();
You don't have to refactor it like that, of course, but like I said, I like to minimize globals. The important thing is that the whole thing should work without disabling and reenabling autosave every time you save.
Edit: I forgot I had to create a private _save function to be able to reference it from enableAutoSave.