I have an issue with the JavaScript setTimeout() method not executing correctly in Mobile Safari.
My code is as follows:
function addBlock() {
    if (i < full) {
        $('#box-' + i).removeClass('empty');
        $('#box-' + i).addClass('full');
        i++;
        setTimeout(addBlock, 20);
    } else {
        if (fullcheck != Math.round(fullcheck)) {
            i = i++;
            $('#box-' + i).removeClass('empty');
            // $('#box-' + i).addClass('halfbox');
            $('#total-count').animate({ height: barheight }, 5000);
        }
        if (usergiven) {
            $('#box-' + randbox).css('border', '1px SOLID #FF0000');
            $('#box-' + randbox).css('background-color', '#FF0000');
        }
    }
}
No matter what timeout value I provide to setTimeout, it always seems to run at the same speed.
The idea is that it populates a set of blocks faster than one every 2 seconds (the actual rate should be about 50 per second, IIRC).
Can anyone tell me why Mobile Safari is not executing this function properly, or what I am doing wrong?
Thanks!
There are quite a few things that make this question hard to reason about. One is the number of undeclared variables and uncommented expressions (what is fullcheck != Math.round(fullcheck) supposed to check?); the other is that you don't describe how this code currently behaves versus how you would like it to behave.
Since the code is incomplete and not runnable, one just has to guess at what you mean. I am guessing that the code completes in about two seconds as it is now, and that you would like it to complete in fifty seconds.
Without knowing the values of i and full it is impossible to do any exact calculations on the right timeout, so I also have to guess at what you are doing wrong. With the current timeout of 20 ms, you will process about 50 blocks per second. If the loop currently runs for two seconds, that means you have about 100 blocks. If they should be filled in fifty seconds, you will have to raise your timeout to 500 ms.
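The arithmetic can be sketched as a small helper (the function name is mine, not something from the question's code): given how many blocks to fill and how long the whole animation should take, derive the delay to pass to setTimeout.

```javascript
// Hypothetical helper: derive the per-step setTimeout delay needed to
// fill `totalBlocks` blocks in `durationMs` milliseconds.
function stepDelay(totalBlocks, durationMs) {
    return durationMs / totalBlocks;
}

// 100 blocks spread over 50 seconds -> a 500 ms timeout per block
var delay = stepDelay(100, 50000);
```

Keep in mind the browser may clamp very small delays, so this calculation only holds for delays comfortably above the browser's minimum.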
As I was looking for a simple stopwatch implementation in JS, I found this code: http://codepen.io/_Billy_Brown/pen/dbJeh/
The problem is that it's not working correctly: the clock runs way too fast. I got 30 seconds on the screen when only 23 seconds had passed on my watch.
I don't understand why. The timer function is called every millisecond and should be updating the time correctly:
setInterval(this.timer, 1);
Is the problem coming from the browser or from the JS code?
Thanks in advance.
Timers in JavaScript don't have millisecond precision.
There is a minimum time for the interval, which differs depending on the browser and browser version. Typical minimums are 4 ms for recent browsers and 10 ms for somewhat older ones.
Also, you can't rely on the callback being called at exactly the time you specify. JavaScript is single-threaded, which means that if some other code is running when the timer triggers a tick, the callback has to wait until that other code finishes.
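You can observe this yourself by asking for a delay and then measuring what actually elapsed; a minimal sketch (the helper name is mine):

```javascript
// Ask for a delay, then measure what actually elapsed. The difference
// (the drift) is typically >= 0, and grows when other code is keeping
// the single JavaScript thread busy.
function measureDrift(requestedMs, callback) {
    var scheduled = Date.now();
    setTimeout(function () {
        var actual = Date.now() - scheduled;
        callback(actual - requestedMs); // drift in milliseconds
    }, requestedMs);
}

// measureDrift(1, function (drift) { console.log("late by " + drift + " ms"); });
```

With a requested delay of 1 ms you will usually see a drift of several milliseconds, which is exactly the clamping described above.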
In fact, the code you linked imitates the flow of time but is not synchronized with the system time. Every millisecond it simply invokes this.timer, which recounts the milliseconds, seconds and so on without ever reading the native system time; it just adds 1 to a variable representing "imaginary milliseconds".
So the resulting pseudo-time you see depends on your CPU, your browser, and who knows what else.
On our modern, fantastically fast computers the body of the timer function executes in well under a millisecond (one wonders what would happen on a Pentium 2 running IE5). Either way, there is no chance that this.timer will be executed at exactly the specified period on all computers and browsers.
The simplest correct way to show the time passed since a start point, according to the system time, is:
<body>
<script>
    var startTime = new Date() // assume this initialization to be the start point

    function getTimeSinceStart() {
        var millisSinceStart = new Date() - startTime
          , millis = millisSinceStart % 1000
          , seconds = Math.floor(millisSinceStart / 1000)
        return [seconds, millis].join(':')
    }

    (function loop() {
        document.title = getTimeSinceStart() // look at the top of the page
        setTimeout(loop, 10)
    }())
</script>
</body>
P.S. What @Guffa says in his answer is correct (generally, for JS in browsers), but in this case it is not what matters and does not affect the problem.
I'm currently creating a countdown using setInterval, though at the moment it runs slower than it should. According to MDN, the delay parameter is in milliseconds, but it isn't accurate.
I compared my countdown to the one on my phone, and the phone's runs nearly five times faster.
var count = setInterval(function() {
    if (iMil == 0) {
        if (iS == 0) {
            if (iMin == 0) {
                if (iH == 0) {
                    // DONE
                } else {
                    iH--;
                    iMin = 59;
                    iS = 59;
                    iMil = 999;
                }
            } else {
                iMin--;
                iS = 59;
                iMil = 999;
            }
        } else {
            iS--;
            iMil = 999;
        }
    } else {
        iMil--;
    }
    hours.text(iH);
    minutes.text(iMin);
    seconds.text(iS);
    milliseconds.text(iMil);
}, 1);
This is the main part of my script. The variables hours, minutes, seconds and milliseconds are jQuery objects.
What I'm getting at is: is there a reason that it runs slower than it is supposed to?
setInterval() is not guaranteed to run perfectly on time in JavaScript. That's partly because JS is single-threaded and partly for other reasons. If you want to display a time with setInterval(), then get the current time on each timer tick and display that. The setInterval() won't be your timer, but just a recurring screen-update mechanism. Your time display will always be accurate if you do it that way.
In addition, no browser will guarantee a call to your interval callback at 1 ms intervals. In fact, many browsers will never call setInterval more often than every 5 ms, and some even less often than that. Plus, if there are any other events happening in the browser, with other code responding to those events, the setInterval() call might be delayed even longer. The HTML5 spec proposes 4 ms as the shortest interval for setTimeout() and 10 ms as the shortest interval for setInterval(), but allows the implementor to use longer minimum times if desired.
In fact, if you look at this draft spec for timers, step 5 of the algorithm says:
If timeout is less than 10, then increase timeout to 10.
And, step 8 says this:
Optionally, wait a further user-agent defined length of time.
And, it includes this note:
This is intended to allow user agents to pad timeouts as needed to
optimise the power usage of the device. For example, some processors
have a low-power mode where the granularity of timers is reduced; on
such platforms, user agents can slow timers down to fit this schedule
instead of requiring the processor to use the more accurate mode with
its associated higher power usage.
All timeout/interval/scheduling functions are expected to run slower than requested.
That is the nature of computers: in any OS there are many things the CPU needs to handle, and guaranteeing real-time behavior would be too costly (and is not possible in a browser).
If you read the API docs (https://developer.mozilla.org/en/docs/Web/API/window.setTimeout and https://developer.mozilla.org/en/docs/Web/API/window.setInterval), they say "after a specified delay" and "fixed time delay between each call". They do not say "at a specified time" or "called at a fixed period".
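If the accumulated lag matters (as it does for a clock or countdown), one common workaround is to schedule each tick against an absolute target time instead of trusting the delay; late ticks still happen, but the error no longer accumulates. A minimal sketch (the names are mine):

```javascript
// A self-correcting ticker: each tick is scheduled relative to an
// absolute target time, so individual delays can be late, but the
// error does not accumulate from tick to tick.
function startAccurateTicker(intervalMs, onTick) {
    var expected = Date.now() + intervalMs;
    var id = setTimeout(function step() {
        var drift = Date.now() - expected; // how late this tick fired
        onTick(drift);
        expected += intervalMs;
        id = setTimeout(step, Math.max(0, intervalMs - drift));
    }, intervalMs);
    return function stop() { clearTimeout(id); };
}
```

The returned function cancels the ticker; without it, the chain of timeouts would run forever.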
Can the setInterval function in JavaScript slow down a browser, or even cause a browser crash?
Let's say that I have a page with thousands (about 10,000) of <div>s, and I loop through them and append some HTML to each, like this:
var counter = 0;
setInterval(function() {
    $('div').each(function(i, e) {
        counter++;
        $(this).html('Added contents for DIV at index: ' + counter);
    });
}, 1);
I have intentionally incremented counter inside the each function so we might slow down the script's execution; I know that I could have just used the i parameter from jQuery's $.each() function.
The interval's repetition time is 1 millisecond, and I would like to know:
Is it safe to have intervals run at this low a rate?
Can the setInterval function in JavaScript slow down a browser, or even cause a browser crash?
It depends on what you are doing inside the setInterval callback. Updating 10,000 divs on every tick is a huge amount of work, and the browser may well lock up or crash because of the constant reflow. However, if you build the new content in a document fragment first, you can avoid most of that cost.
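A sketch of the fragment approach (assumptions: plain DOM rather than jQuery, and the document is passed in as a parameter purely so the helper can be exercised outside a browser; the names are mine):

```javascript
// Build all the nodes off-DOM inside a DocumentFragment and append them
// in one operation, so the browser reflows once instead of once per div.
function fillDivs(doc, container, count) {
    var fragment = doc.createDocumentFragment();
    for (var i = 0; i < count; i++) {
        var div = doc.createElement("div");
        div.textContent = "Added contents for DIV at index: " + i;
        fragment.appendChild(div);
    }
    container.appendChild(fragment); // single append, single reflow
}

// In a browser: fillDivs(document, document.body, 10000);
```

Even with a fragment, doing this every millisecond is still wasteful; batching the DOM write is what makes the difference.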
Is it safe to have intervals run at this low a rate?
You have to answer that question yourself: if the job you are going to do inside setInterval really finishes in under 1 millisecond, then it's not an issue; if it takes longer, the ticks will pile up behind it.
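A quick way to check that assumption is to time one run of the work you plan to do on every tick (a sketch; the helper name is mine):

```javascript
// Time a single run of the per-tick work. If the result approaches or
// exceeds the interval you plan to use, the interval is too short.
function timeCallback(fn) {
    var start = Date.now();
    fn();
    return Date.now() - start; // duration in milliseconds
}
```

If timeCallback reports, say, 30 ms for your DOM update, a 1 ms interval cannot possibly keep up.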
Preface: I have a demo of the problem on my personal site (I hope this is OK. If not, I can try to set it up on jsfiddle). I intend this question to be a little fun, while also trying to understand the time functions take in JavaScript.
I'm incrementing the value of progress bars on a timeout. Ideally (if functions ran instantaneously) they would fill at the same speed, but in the real world they do not. The code is this:
function setProgress(bar, myPer) {
    bar.progressbar({ value: myPer })
       .children('.ui-progressbar-value')
       .html(myPer.toPrecision(3) + '%')
       .attr('align', 'center');
    myPer++;
    if (myPer == 100) { myPer = 0; }
}

function moveProgress(bar, myPer, inc, delay) {
    setProgress(bar, myPer);
    if (myPer >= 100) { myPer = 0; }
    setTimeout(function() { moveProgress(bar, myPer + inc, inc, delay); }, delay);
}

$(function() {
    moveProgress($(".progressBar#bar1"), 0, 1, 500);
    moveProgress($(".progressBar#bar2"), 0, 1, 500);
    moveProgress($(".progressBar#bar3"), 0, .1, 50);
    moveProgress($(".progressBar#bar4"), 0, .01, 5);
});
Naively, one would think they should all run (fill the progress bar) at the same speed.
However, if we call "setting the progress bar" a single operation: in the first two bars I'm performing one operation every 500 ms, for a total of 100 operations to fill the bar; in the third, one operation every 50 ms, for a total of 1,000 operations; in the fourth, one operation every 5 ms, for a total of 10,000 operations.
What part of my code takes the longest and causes these speed differences, and what could be altered so that the bars keep working the way they do (the fourth bar gets smaller increments) but all appear to run at the same speed?
The biggest problem with using setTimeout for things like this is that your code execution happens between timeouts and is not accounted for in the value sent to setTimeout. If your delay is 5 ms and your code takes 5 ms to execute, you're essentially doubling your time.
Another factor is that once your timeout fires, if another piece of code is already executing, it will have to wait for that to finish, delaying execution.
This is very similar to problems people have when trying to use setTimeout for a clock or stopwatch. The solution is to compare the current time with the time that the program started and calculate the time based on that. You could do something similar. Check how long it has been since you started and set the % based on that.
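That fix can be sketched as follows: derive the percentage from elapsed wall-clock time, so timer jitter cannot skew the bar. The function name and the totalMs parameter (how long a full sweep of the bar should take) are mine, not from the original code:

```javascript
// Compute the bar's percentage from real elapsed time instead of from
// a tick counter, so late timeouts cannot slow the bar down.
function progressAt(startTime, now, totalMs) {
    var elapsed = now - startTime;
    return Math.min(100, (elapsed / totalMs) * 100);
}

// Inside each timer callback, something like:
// setProgress(bar, progressAt(start, Date.now(), 50000));
```

With this approach the timeout delay only controls how often the bar is redrawn, not how fast it fills, so all four bars finish together.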
What causes the speed difference? Two things: first, the fact that you are executing more code to fill the bottom bar (as you allude to in the second-to-last paragraph). Second, every time you set a timeout, your browser queues it up; the actual delay may be longer than what you specify, depending on how much is in the queue (see MDN on window.setTimeout).
Love the question. I don't have a very precise answer, but here are my 2 cents:
JavaScript is a very fast language that deals very well with its event loop, and therefore eats setTimeouts and setIntervals for breakfast.
There are limits, though, and they depend on a large number of factors, such as browser and computer speed, the number of callbacks on the event loop, the complexity of the code to execute, and the timeout values...
In this case, a function scheduled every 500 ms is going to behave a lot closer to its nominal rate than one scheduled every 50 ms, and far better than one scheduled every 5 ms. If you take into account that you are running them all on top of each other, you can predict that the performance will not be optimal.
You can try this exercise:
Take the 500 ms one and run it alone; note the total time it took to fill the bar (you will already see that it takes a little longer than predicted).
Then execute two 500 ms timeouts at the same time, and see that the total time gets a bit longer.
If you then add the 50 ms one, and after that the 5 ms one, you will see that you lose performance every time...
How can I tell whether the canvas's slow performance is caused by the drawing itself, or by the underlying logic that calculates what should be drawn and where?
The second part of my question is: how do I calculate the canvas's FPS? Here's how I did it; it seems logical to me, but I could be absolutely wrong. Is this the right way to do it?
var fps = 0;
setInterval(draw, 1000 / 30);
setInterval(checkFps, 1000);

function draw() {
    // ...
    fps++;
}

function checkFps() {
    $("#fps").html(fps);
    fps = 0;
}
Edit:
I replaced the above with the following, according to Nathan's comments:
var lastTimeStamp = new Date().getTime();

function draw() {
    // ...
    var now = new Date().getTime();
    $("#fps").html(Math.floor(1000 / (now - lastTimeStamp)));
    lastTimeStamp = now;
}
So how's this one? You could also display only the difference in milliseconds since the last update; performance differences can be seen that way too. By the way, I also did a side-by-side comparison of the two, and they usually moved pretty much together (a difference of 2 at most); however, the latter had bigger spikes when performance was extraordinarily low.
Your FPS code is definitely wrong.
setInterval(checkFps, 1000);
Nothing assures that this function will be called exactly every second (it could be more than 1000 ms, or less, but probably more), so
function checkFps() {
    $("#fps").html(fps);
    fps = 0;
}
is wrong: if fps is 32 at that moment, it is possible that those 32 frames actually took 1.5 s (an extreme case).
It is better to check how much real time has passed since the last update and calculate realTimePassed / frames. JavaScript's Date gives you the current time in milliseconds, which is accurate enough for this.
By the way, fps is not a good name for the variable: it contains the number of frames since the last update, not the number of frames per second, so frames would be a better name.
In the same way,
setInterval(draw, 1000 / 30);
is wrong: you want to achieve an FPS of 30, but since setInterval is not very accurate (it will probably wait longer than you ask), you will end up with a lower FPS even if the CPU can handle the load.
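The correction described above might look like this sketch (the name makeFpsMeter is mine, and timestamps are passed in explicitly so the logic is easy to test):

```javascript
// Count frames, and once roughly a second of wall-clock time has passed,
// divide by the time that actually elapsed instead of assuming it was
// exactly 1000 ms.
function makeFpsMeter(startTime) {
    var frames = 0;
    var last = startTime;
    return function frameDone(now) {
        frames++;
        var elapsed = now - last;
        if (elapsed >= 1000) {
            var fps = (frames * 1000) / elapsed;
            frames = 0;
            last = now;
            return fps;
        }
        return null; // not enough time has passed yet
    };
}

// In draw(): var fps = frameDone(new Date().getTime());
//            if (fps !== null) { $("#fps").html(Math.round(fps)); }
```

This way a late checkFps-style update no longer inflates or deflates the reported rate, because the divisor is the real elapsed time.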
Webkit and Firebug both provide profiling tools to see where CPU cycles are being spent in your javascript code. I'd recommend starting there.
For the FPS calculation, I don't think your code is going to work, but I don't have any good recommendation :(
Reason being: most (all?) browsers use a dedicated thread for running JavaScript and a different thread for UI updates. If the JavaScript thread is busy, the UI thread won't be triggered.
So, you can run some javascript looping code that'll "update" the UI 1000 times in succession (for instance, setting the color of some text) - but unless you add a setTimeout to allow the UI thread to paint the change, you won't see any changes until the 1000 iterations are finished.
That said, I don't know if you can assertively increment your fps counter at the end of the draw() routine. Sure, your javascript function has finished, but did the browser actually draw?
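One way to answer "did the browser actually draw?" is requestAnimationFrame: the browser only invokes the callback when it is about to paint, and passes it a timestamp. A sketch (raf is injected as a parameter so the counting logic can also run outside a browser; the names are mine):

```javascript
// Count requestAnimationFrame callbacks: each one corresponds to a frame
// the browser actually intends to paint, and the timestamp argument lets
// us divide by real elapsed time.
function trackFrames(raf, onFps) {
    var frames = 0;
    var last = null;
    function frame(timestamp) {
        if (last === null) { last = timestamp; }
        frames++;
        var elapsed = timestamp - last;
        if (elapsed >= 1000) {
            onFps((frames * 1000) / elapsed);
            frames = 0;
            last = timestamp;
        }
        raf(frame);
    }
    raf(frame);
}

// In a browser:
// trackFrames(window.requestAnimationFrame.bind(window),
//             function (fps) { $("#fps").html(Math.round(fps)); });
```

Because the callback only fires for real paints, this sidesteps the question of whether incrementing a counter at the end of draw() corresponds to an actual screen update.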
Check that you are not using some innerHTML method to debug your project. That can slow your project down more than you can imagine, especially if you do concatenation like innerHTML += newDebugValues;
Or, as desau said, profile your CPU usage with Firebug or the WebKit debugger.