I've got multiple elements on my page that fade in and out on a timer, using JavaScript's setInterval to set them in motion. I've delayed them so they're offset slightly, which creates a nice cascading effect, but if you leave the page open long enough they all catch up to one another and the timing gets completely out of sync (you have to leave it for a few minutes).
I have an ugly example of the issue at CodePen here: http://www.cdpn.io/wgqJj
Again, you've got to leave the page open and untouched for a few minutes to see the problem. If you have more items on the page (5 or 10), the problem becomes even more apparent. I've also used this type of effect with several jQuery photo rotator plugins, and over time the issue always crops up.
Is there any explanation for this?
Here is the code I'm using (I know the JavaScript could be cleaner):
HTML:
<p id="one">First</p>
<p id="two">Second</p>
<p id="three">Third</p>
JavaScript:
$(document).ready(function() {
    var timer1 = setTimeout(startOne, 1000);
    var timer2 = setTimeout(startTwo, 2000);
    var timer3 = setTimeout(startThree, 3000);
});

function startOne() {
    setInterval(flashOne, 3000);
}

function startTwo() {
    setInterval(flashTwo, 3000);
}

function startThree() {
    setInterval(flashThree, 3000);
}

function flashOne() {
    $("#one").fadeTo("slow", 0.4).fadeTo("slow", 1.0);
}

function flashTwo() {
    $("#two").fadeTo("slow", 0.4).fadeTo("slow", 1.0);
}

function flashThree() {
    $("#three").fadeTo("slow", 0.4).fadeTo("slow", 1.0);
}
This question has already been answered here. Quoting from the top-rated answer in that topic:
it will wait AT LEAST 1000 ms before it executes; it will NOT wait exactly 1000 ms.
Giving an actual answer, I'd solve it like this:
$(function(){
    setTimeout(flashOne, 1000);
});

function flashOne() {
    $("#one").fadeTo("slow", 0.4).fadeTo("slow", 1.0);
    setTimeout(flashTwo, 1000);
}

function flashTwo() {
    $("#two").fadeTo("slow", 0.4).fadeTo("slow", 1.0);
    setTimeout(flashThree, 1000);
}

function flashThree() {
    $("#three").fadeTo("slow", 0.4).fadeTo("slow", 1.0);
    setTimeout(flashOne, 1000);
}
This way the timers can't drift apart, because each item is always scheduled one second after the previous one has flashed.
Consider using a chained setTimeout instead, as this gives the browser a guaranteed slot. Reference this SO post.
Currently you only use setInterval to start the animation; from there, jQuery handles the "oscillations".
Theoretically, a chained setTimeout should guarantee the browser a slot. More importantly, you can hard-code the offset into each step of the chain, instead of only once at the beginning.
The setTimeout() and setInterval() functions do not guarantee that your events run exactly on schedule. CPU load, other browser tasks, and the like can and will affect your timers, so they are not reliable enough for your use case.
A solution would be asynchronous events (promises or similar) or the event queue that jQuery supplies. That way you could either nest callbacks, or queue the animations up and fire the queue again once it reaches the last item. The .queue() API documentation page has an example of this.
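For illustration, here is a minimal sketch of the promise-based option (this is not the poster's CodePen code, and it assumes jQuery 1.8+ so that .promise().then() chains properly). Each fade waits for the previous one to finish, so the offsets can never drift:
function flashInOrder() {
    $("#one").fadeTo("slow", 0.4).fadeTo("slow", 1.0).promise()
        .then(function () {
            return $("#two").fadeTo("slow", 0.4).fadeTo("slow", 1.0).promise();
        })
        .then(function () {
            return $("#three").fadeTo("slow", 0.4).fadeTo("slow", 1.0).promise();
        })
        .then(function () {
            // Restart the whole cycle; drift cannot accumulate because each step
            // starts only after the previous animation has actually finished.
            setTimeout(flashInOrder, 1000);
        });
}

$(document).ready(flashInOrder);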
Related
Suppose I have a callback firing perpetually as the result of some event, e.g. someone's moving the mouse.
I'd like to run a cleanup action if the callback hasn't fired in x seconds, e.g. if they haven't moved the mouse in 2 seconds, fire.
I think I could probably fix something up with setTimeout, but I'm wondering if any standard libraries have a function for this? Sort of a 'dead-man's switch'; it seems like it would be common enough to have a standard method. If not, I'm making one. Anyone?
De-bouncing may be a technique that will help.
It is essentially a method of wrapping a function so that you have control over when the wrapped function will execute, regardless of how often the debounced version is called.
This is most commonly used for events like window resize. That way you execute your handler only once the user has finished resizing the window, rather than while they are resizing it.
There is also throttling, which is similar but has important differences.
A throttled function will execute at most once every n milliseconds, whereas a debounced function executes only after it hasn't been called for n milliseconds.
Underscore and Lodash both have implementations of debouncing and throttling.
However, both are quite easy to implement yourself, and you don't really need a large library if one isn't already being used.
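For reference, here is a rough sketch of both helpers (simplified versions for illustration, not the Underscore/Lodash source):
function debounce(fn, wait) {
    var timer;
    return function () {
        var ctx = this, args = arguments;
        clearTimeout(timer);
        // Runs only once the wrapped function has not been called for `wait` ms.
        timer = setTimeout(function () { fn.apply(ctx, args); }, wait);
    };
}

function throttle(fn, wait) {
    var last = 0;
    return function () {
        var now = Date.now();
        // Runs at most once every `wait` ms, no matter how often it is called.
        if (now - last >= wait) {
            last = now;
            fn.apply(this, arguments);
        }
    };
}

// Usage for the mouse example: fires once the mouse has been still for 2 seconds.
$(document).on('mousemove', debounce(function () {
    console.log('No mouse movement for 2 seconds');
}, 2000));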
I think you're on the right track with setTimeout. As far as I'm aware, there isn't a module that does exactly this, and given the intrusive nature of the process, that makes sense.
You could do something like this, though:
var yourmodule; // assuming you're using a module to store your app code; the object should obviously exist before continuing
yourmodule.cleanupSequenceId = -1;

function yourEventCallback() {
    if (yourmodule.cleanupSequenceId !== -1) clearTimeout(yourmodule.cleanupSequenceId);

    // function logic

    // cleanup:
    yourmodule.cleanupSequenceId = setTimeout(cleanupMethod, 2000);
}
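A possible wiring for the mouse-movement example (hypothetical names; it assumes yourmodule has been initialized to an object and that cleanupMethod is your cleanup action):
function cleanupMethod() {
    // Runs only when yourEventCallback has not fired for 2 seconds.
    console.log('No mouse movement for 2 seconds');
}

document.addEventListener('mousemove', yourEventCallback);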
After stumbling upon this (very old) question, and reading many others like it, I found a solution that works for me so I wanted to share it.
You define a "Debounce" function like this:
var debounce_timeout // Global debouncer timer, so all calls target this specific timeout.

function debounce(func, delay = 2000) {
    clearTimeout(debounce_timeout)
    debounce_timeout = setTimeout(() => {
        func()
    }, delay)
}
Now if you wish to debounce some function, you do:
debounce(myFunction)
Debouncing essentially means that when your function is called, we wait for the 'delay' duration to see whether any other calls to the function are made. If another call is made, we reset the waiting period.
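For example, wired up to an event (saveDraft is a hypothetical function; note that because debounce_timeout is global, this version can only debounce one action at a time):
$('#editor').on('keyup', function () {
    // Each keystroke pushes the 2-second timeout further out,
    // so saveDraft() runs only after the user stops typing.
    debounce(saveDraft);
});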
I have a simple JavaScript function that manipulates the DOM (the heights of elements, for layout reasons).
The function gets called on the window's resize event (throttled to 1s) and on a button click.
In my function, everything is wrapped inside a _.delay() call so that the script waits 1s for a triggered animation to finish.
The problem is that sometimes the function gets called in quick succession, and the second call starts before the first one has ended. The calls then do their thing simultaneously and everything goes bad.
My question:
How can I make the function run only one at a time? Some kind of lock that stops the second call from executing would be good. It would be great if that second call still executes, but only after the first call removes the lock.
Is something like this possible in JavaScript, and how?
EDIT
Here is a code example of how the script looks like:
function doStuff() {
    var stuff = $('[data-do-stuff]');
    var height = stuff.height();

    // Add CSS class that changes height of stuff
    // Class starts an animation of duration of 1s
    stuff.addClass('active-stuff');

    // Wait 1s for the animation started by added class
    _.delay(function() {
        stuff.height(height * 42);
    }, 1000);
}

$(window).on('resize', _.throttle(function() {
    doStuff();
}, 1000));

$('.tasty-button').on('click', function() {
    doStuff();
});
This is not a working example, just a gist of what the general structure of my script is.
If I, for example, click the tasty button multiple times (about 3x in 1s), it messes everything up. (In my real script I have more triggers, so just disabling the button for one second doesn't do the trick -.-)
I would like it to behave like this: if doStuff is executing, lock every further call until doStuff finishes executing, and then execute the locked calls afterwards.
Promises in JavaScript are what you are looking for.
Without code examples, it's hard to suggest solutions specific to your question. However, here's some thoughts on your overall problem:
What you're experiencing is called a "race condition", where a part of your application depends on multiple functions finishing at undetermined times.
Generally, there are two ways to handle situations like this:
1) Use callbacks. About Callbacks
2) As another user suggested, use JS promises (a sketch follows below). About JS Promises
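As a rough illustration of the promise approach (assuming jQuery 1.8+ for its Deferred/then chaining; queueDoStuff is a hypothetical wrapper around the poster's doStuff):
var doStuffQueue = $.Deferred().resolve().promise(); // starts out "unlocked"

function queueDoStuff() {
    // Chain each new call onto the previous one, so it only starts
    // once the earlier run (including its 1s animation) has finished.
    doStuffQueue = doStuffQueue.then(function () {
        var done = $.Deferred();
        doStuff();
        setTimeout(function () { done.resolve(); }, 1000); // unlock after the animation
        return done.promise();
    });
}

$(window).on('resize', _.throttle(queueDoStuff, 1000));
$('.tasty-button').on('click', queueDoStuff);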
I need to animate the z-index property of a specific HTML element. I've been able to achieve this animation in two ways that both have their difficulties/drawbacks. Successfully answering this question for me will fix one of the following two issues:
The first is adapting the jQuery animate command with the step functionality outlined in the accepted answer here:
jQuery's $('#divOne').animate({zIndex: -1000}, 2000) does not work?
The problem with this method for me is that the $('#obj').stop(); command cannot prematurely end the animation when done in this way. It always finishes unless I destroy the object I'm working with and create a new one (which causes blinking obviously). If anyone knows of a way to properly stop a step animation like this, or a work-around for the issue, I'd love to see it.
var div = $('#obj');
$({z: ~~div.css('zIndex')}).animate({z: [-2000, 'linear']}, {
    step: function() {
        div.css('zIndex', ~~this.z);
    },
    duration: 10000
});
The second is using a setInterval loop at 20 ms (a speed that is sufficient for my needs) to simply adjust the z-index to what it should be at that point of the "animation". This works great for a few moments, then something causes it to stop working suddenly. The code still runs through the $('#obj').css('z-index', val); line, and val is changing, but it no longer updates the object in the DOM. I've tried slower timer settings as well, with identical results. Does anyone know why jQuery might suddenly no longer be able to set the z-index?
function () move {
if (!(MoveX == 0 && MoveY == 0))
{
$('#obj').css('z-index', val);
}
}
$('#obj').stop() doesn't work for you because the animation isn't being performed on $('#obj').
It is being performed on the object $({z: ...}) (with a custom step function). This means you should do something like
var anim = $({z: ~~div.css('zIndex')}).animate({z: [-2000, 'linear']}, {
    step: function() {
        div.css('zIndex', ~~this.z);
    },
    duration: 10000
});
// sometime later
anim.stop();
See this demo.
Edit: For what it's worth, here is the same demo using an animation interval. I see a syntax error in your second snippet: the function declaration should be
function move() { ...
I assume that's a typo since your code wouldn't even parse. Other than that, I'm not sure why that solution didn't work for you.
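For reference, a rough sketch of what the interval-based approach might look like (an illustration only, not the linked demo; the step size and end value are arbitrary):
var div = $('#obj');
var zTimer = setInterval(function () {
    var z = ~~div.css('zIndex') - 4; // step the z-index down a little on each tick
    div.css('zIndex', z);
    if (z <= -2000) clearInterval(zTimer); // done
}, 20);

// To cancel the animation early:
// clearInterval(zTimer);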
When looking to improve a page's performance, one technique I haven't heard mentioned before is using setTimeout to prevent JavaScript from holding up the rendering of a page.
For example, imagine we have a particularly time-consuming piece of jQuery inline with the html:
$('input').click(function () {
    // Do stuff
});
If this code is inline, we are holding up the perceived completion of the page while jQuery is busy attaching a click handler to every input on the page.
Would it be wise to spawn a new thread instead:
setTimeout(function() {
    $('input').click(function () {
        // Do stuff
    })
}, 100);
The only downside I can see is that there is now a greater chance the user clicks on an element before the click handler is attached. However, this risk may be acceptable and we have a degree of this risk anyway, even without setTimeout.
Am I right, or am I wrong?
The actual technique is to use setTimeout with a time of 0.
This works because JavaScript is single-threaded. A timeout doesn't cause the browser to spawn another thread, nor does it guarantee that the code will execute in the specified time. However, the code will be executed when both:
The specified time has elapsed.
Execution control is handed back to the browser.
Therefore calling setTimeout with a time of 0 can be considered as temporarily yielding to the browser.
This means if you have long running code, you can simulate multi-threading by regularly yielding with a setTimeout. Your code may look something like this:
var batches = [...]; // Some array
var currentBatch = 0;

// Start long-running code, whenever browser is ready
setTimeout(doBatch, 0);

function doBatch() {
    if (currentBatch < batches.length) {
        // Do stuff with batches[currentBatch]
        currentBatch++;
        setTimeout(doBatch, 0);
    }
}
Note: While it's useful to know this technique in some scenarios, I highly doubt you will need it in the situation you describe (assigning event handlers on DOM ready). If performance is indeed an issue, I would suggest looking into ways of improving the real performance by tweaking the selector.
For example if you only have one form on the page which contains <input>s, then give the <form> an ID, and use $('#someId input').
setTimeout() can be used to improve the "perceived" load time -- but not the way you've shown it. Using setTimeout() does not cause your code to run in a separate thread. Instead setTimeout() simply yields the thread back to the browser for (approximately) the specified amount of time. When it's time for your function to run, the browser will yield the thread back to the JavaScript engine. In JavaScript there is never more than one thread (unless you're using something like "Web Workers").
So, if you want to use setTimeout() to improve performance during a computation-intensive task, you must break that task into smaller chunks, and execute them in-order, chaining them together using setTimeout(). Something like this works well:
function runTasks(tasks, idx) {
    idx = idx || 0;
    tasks[idx++]();
    if (idx < tasks.length) {
        setTimeout(function() { runTasks(tasks, idx); }, 1);
    }
}

runTasks([
    function() {
        /* do first part */
    },
    function() {
        /* do next part */
    },
    function() {
        /* do final part */
    }
]);
Note:
The functions are executed in order. There can be as many as you need.
When the first function returns, the next one is called via setTimeout().
The timeout value I've used is 1. This is sufficient to cause a yield, and the browser will take the thread if it needs it, or allow the next task to proceed if there's time. You can experiment with other values if you feel the need, but usually 1 is what you want for these purposes.
You are correct, there is a greater chance of a "missed" click, but with a low timeout value it's pretty unlikely.
I'm creating an error-message box that slides out, waits for 3 seconds and then slides back in, using MooTools. This is what I'm currently doing; how can I correct it to get it working for me?
var slide = new Fx.Slide($("error"));

slide.slideOut('horizontal').chain(function(){
    $("error").set("text", message);
}).chain(function(){
    this.slideIn('horizontal').delay(3000);
}).chain(function(){
    this.slideOut('horizontal');
});
You basically have your MooTools correct, but you are missing a few key items that would make your script function properly. I have pasted a working version below, followed by some comments:
var slide = new Fx.Slide($("error"));
slide.slideOut('horizontal').chain(function () {
    $('error').set('text', message); this.callChain(); //NOTE
}).chain(function () {
    this.slideIn('horizontal');
}).chain(function () {
    this.slideOut.delay(3000, this, 'horizontal'); //NOTE
});
Notice the this.callChain() on the 3rd line. Not having this was what was stopping you seeing anything. The Fx class uses the callChain() method internally to start the next step in the sequence, but if your argument to chain() doesn't contain one of Fx's methods, callChain() is not called, so you have to do it manually.
Your call to delay was in the wrong place. delay() delays the execution of the function it is applied to; it does not insert a pause into a chain. Therefore, to display the error message for 3 seconds, you need to add delay to the last function call, because that is the one you want to slow down.
Your call to delay was also incorrect. Delay applies to the function, not the return value of the function, hence Dimitar's suggestion above. Have a look at Function in the MooTools Core documentation for more info.
By the sounds of it, you do not have Firebug installed. It would have let you explore the DOM and find that your code changes the margins and then the text, but nothing happens after that. Firebug is super useful, so install it ASAP.
My solution (MooTools 1.3) is below, and basically reflects what Dimitar was suggesting:
$('error').set('slide', {
    mode: 'horizontal'
}).get('slide').slideOut().chain(function () {
    $('error').set('text', message); this.slideIn();
}, function () {
    this.slideOut.delay(3000, this);
});
Hope it helps