I've asked about this before, and have found a few articles online regarding this subject, but for the life of me I cannot figure this out. I have a set of Javascript functions that calculate a model, but there's a ton of looping going on that makes the script take a while (~4 seconds). I don't mind the processing time, but IE prompts with a warning since there are so many executions.
I've tried optimizing my code, but I simply can't cut down the number of executions enough to bypass the IE warning. Therefore, I figure that I must use the setTimeout function to reset the counter in IE.
The code is quite long, and I just can't figure out how to properly implement setTimeout in IE. I've tried mimicking the code found here, but for some reason it ends up crashing after the 100th iteration.
I don't know if this is common at all, but would anyone mind taking a look at the code if I sent it to them? I just wouldn't want to post it on here because it's quite lengthy.
Thanks!
EDIT: I've placed my code on JSfiddle as some people have suggested. You can find it here. Thanks for the suggestion and let me know if there are any questions!
The basic approach I use is to segment the work into batches. Each batch is done in one "epoch" and at the completion of the batch, it calls setTimeout() to kick off the next epoch.
Suppose you have 1000 iterations to run; you can segment it into batches of 100.
function doTheWork(operation, cycles, callback) {
    var self = this, // in case you need it
        cyclesComplete = 0,
        batchSize = 100;
    var doOneBatch = function() {
        var c = 0;
        while (cyclesComplete < cycles) {
            operation();
            c++;
            cyclesComplete++;
            if (c >= batchSize) {
                // may need to store interim results here
                break;
            }
        }
        if (cyclesComplete < cycles) {
            setTimeout(doOneBatch, 1);
        }
        else {
            callback(); // maybe pass results here
        }
    };
    // kickoff
    doOneBatch();
    return null;
}
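For illustration, here is a self-contained version of the same batching pattern with a toy workload (the `total` counter and the callback are made up stand-ins for the real model calculation):

```javascript
// Self-contained sketch of the batching pattern above. Runs a toy
// "model step" 1000 times in batches of 100, yielding to the browser
// between batches via setTimeout.
function doTheWork(operation, cycles, callback) {
  var cyclesComplete = 0,
      batchSize = 100;
  function doOneBatch() {
    var c = 0;
    while (cyclesComplete < cycles && c < batchSize) {
      operation();
      c++;
      cyclesComplete++;
    }
    if (cyclesComplete < cycles) {
      setTimeout(doOneBatch, 1); // let the browser breathe between batches
    } else {
      callback();
    }
  }
  doOneBatch();
}

var total = 0;
doTheWork(function () { total += 1; }, 1000, function () {
  console.log("done, total = " + total); // total is 1000 when this fires
});
```

Because each batch ends with a `setTimeout`, IE's "too many statements" counter is reset between batches, which is exactly what the question is after.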
Related
Sorry if this is a duplicate, but I couldn't find my exact case. I've been playing around with Web Workers, and it's pretty interesting. I was testing different cases and hit this.
Main :
var myWorker = new Worker("WK.js");
for (var i = 0; i <= 1000000000; i++) {
    myWorker.postMessage(i);
}
myWorker.onmessage = function (e) {
    alert(e.data);
};
Worker :
var sum = 0;
self.onmessage = function (e) {
    if (e.data == 1000000000) { postMessage("done" + sum); }
    sum += e.data;
};
In the worker script, I'm just summing up the passed values and posting back the sum once done. The problem I face is that the above code crashes my browser (all of them) at this number (~1000000000); however, if I move the loop into the worker script, it works fine. So is there a limit on the number of postMessage calls per some duration? Please note I do know this is bad code; it's just for testing.
Your browser may not have actually crashed. The problem is that your for loop executes on the browser's main UI thread. So the browser is busy running your code and can't respond to any user input, i.e. it is busy, and generally this results in a "Not responding" message in Windows. The reason you don't get this when the loop runs in a worker is simply that the code executes in a completely separate (non-UI) thread.
It could be a memory / garbage-collection issue. You're posting a billion messages, each carrying at least one number, and JavaScript stores numbers like all others as 64-bit doubles, so 8 bytes each. Ignoring any per-message overhead, that means at least 8 GB of memory needs to be allocated.
I have to admit a level of ignorance about the garbage collector, but it might not be able to keep up with a billion objects and 8 GB of memory allocated in a short amount of time.
So is there a limit for the number of postMessage calls per duration
I suspect yes, although perhaps it's not clear what you mean by "duration" here.
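A minimal sketch of the workaround the asker already hit on (moving the loop into the worker): post a single message with the upper bound and let the worker do the looping. The `sumUpTo` helper is an invented name, and the file name follows the question's `WK.js`.

```javascript
// The summing loop, shown standalone so the logic is clear:
function sumUpTo(n) {
  var sum = 0;
  for (var i = 0; i <= n; i++) {
    sum += i;
  }
  return sum;
}

// Main thread: one message in, one message out (browser-only).
if (typeof Worker !== "undefined") {
  var myWorker = new Worker("WK.js");
  myWorker.onmessage = function (e) {
    console.log("sum = " + e.data);
  };
  myWorker.postMessage(1000000000); // a single message instead of a billion
}

// WK.js would then reduce to:
// self.onmessage = function (e) { postMessage(sumUpTo(e.data)); };
```

One message each way means the per-message allocation and queuing cost disappears entirely, regardless of whatever rate limit the browser may or may not impose.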
I have written JavaScript that takes 20-30 seconds to process, and I want to show progress by updating a progress bar on my webpage.
I have used setTimeout in an attempt to allow the webpage to be re-drawn.
This is what my code looks like:
function lengthyFun(...) {
    for (...) {
        var progress = ...
        document.getElementById('progress-bar').setAttribute('style', "width:{0}%".format(Math.ceil(progress)));
        var x = ...
        // Processing
        setTimeout(function(x) { return function() { ... }; }(x), 0);
    }
}
It does not work, I know why it does not work, but I don't know how to refactor my code to make it work.
As you probably know, the problem here is that your main process (the one that takes a lot of time) is blocking any rendering. That's because JavaScript is (mostly) single-threaded.
From my point of view, you have two solutions to do this.
The first one is to cut your main process into different parts and do the rendering between each of them, i.e. you could have something like this (using Promises):
var processParts = [/* array of funcs returning promises */];
function start() {
    // call the first process part
    var firstPartPromise = (processParts.shift())();
    // chain it with all the other process parts, interspersed with updateDisplay
    return processParts.reduce(function (prev, current) {
        return prev.then(current).then(updateDisplay);
    }, firstPartPromise);
}
You will probably need a polyfill for the promises (one here). If you use jQuery, they have a (bad, non-standard) implementation.
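To make the shape of that chain concrete, here is a runnable toy version with three fake "process parts" (all names invented); `updateDisplay` is the hook where a real page would update its progress display:

```javascript
// Toy version of the chained-parts pattern: each part returns a
// promise, and updateDisplay runs between consecutive parts.
var progressLog = [];

function makePart(n) {
  return function () {
    return Promise.resolve("part " + n + " done");
  };
}

function updateDisplay(result) {
  progressLog.push(result); // a real version would touch the DOM here
  return result;            // pass the value along the chain
}

var processParts = [makePart(1), makePart(2), makePart(3)];

function start() {
  var firstPartPromise = (processParts.shift())();
  return processParts.reduce(function (prev, current) {
    return prev.then(current).then(updateDisplay);
  }, firstPartPromise);
}

var allDone = start();
```

Note that with this exact reduce shape, `updateDisplay` runs after each part except the first, matching the snippet above.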
The second solution is to use web workers, which allow you to create threads in JavaScript. They work in all modern browsers.
It is probably the best solution in your case.
I have never used them, but you are supposed to be able to do stuff like:
var process = new Worker("process.js");
process.onmessage = function (event) {
    updateProgress(event.data.progress);
};
And then in process.js:
postMessage({progress: 0.1});
// stuff
postMessage({progress: 0.4});
// stuff
postMessage({progress: 0.7});
//etc
Try setting the progress element's min attribute to 0, max to 20000, and value to 0. Create a function which, if value is less than max, increments value by 1000; use setTimeout with a duration of 1000 to call the function recursively until value reaches max:
var p = document.querySelector("progress");
function redraw() {
    if (p.value < p.max) {
        p.value += 1000;
        setTimeout(redraw, 1000);
    }
}
redraw();
<progress max="20000" min="0" value="0"></progress>
There are a couple of ways that I know of to trigger sequential HTML redraws through Javascript:
Incremental Timeout Period
Recursive Method Calls
The first and easiest way of doing this is to use a multiplier (such as the loop iterator) on the timeout interval. This method should be sufficient if the operation is independent of external variables and only needs to be run a finite and relatively small number of times. The more operations required, the greater the strain on resources - just for calculating the intervals. Another drawback takes effect when the processing time exceeds the timeout interval, with a knock-on effect on the interval of the observed redraws. The result can be that the web page freezes up entirely until all operations are done.
Example
for (var i = 0, limit = n; i < limit; i++) {
    setTimeout((function(params) {
        return function() {
            some_func(params);
        };
    })(param_values), i * 1000);
}
The second method is a little more convoluted, but guarantees redraws between each operation, regardless of the timeout interval. Here, the timeout only affects the time between redraws and resists the effects of consecutive operation variables. However, the processing time of the current operation is still a factor in the observed interval, and will still freeze up a web page between redraws if the operation is computationally intensive.
Example
var limit = n;
var i = 0;
recursive_timeout();

function recursive_timeout() {
    setTimeout((function(params) {
        return function() {
            some_func(params);
            i++;
            if (i < limit) {
                recursive_timeout();
            }
        };
    })(param_values), 1000);
}
Refined Example (based off guest271314's answer)
var still_true = true;
recursive_timeout();

function recursive_timeout() {
    some_func(params);
    if (still_true) {
        setTimeout(recursive_timeout, 1000);
    }
}
While the incremental method is fine for simple tasks, recursion will reliably perform redraws. If long processing times per operation are an issue, then it might be worth delving into asynchronous tasks in addition to using recursion, to avoid rendering a web page temporarily unusable.
Anyway, hope this helps!
Ha! Just realised guest271314 put up a much more elegant example of the recursive method... Oh well, more info can't hurt.
I apologise in advance if I'm too bad at using the search engine and this has already been answered. Please point me in the right direction in that case.
I've recently begun to use the arguments variable in functions, and now I need to slice it. Everywhere I look people are doing things like:
function getArguments(args, start) {
    return Array.prototype.slice.call(args, start);
}
And according to MDN this is bad for performance:
You should not slice on arguments because it prevents optimizations in JavaScript engines (V8 for example).
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/arguments
Is there a reason why I don't see anyone doing things like this:
function getArguments(args, start) {
    var i, p = 0;
    var len = args.length;
    var params = [];
    for (i = start; i < len; ++i) {
        params[p] = args[i];
        p += 1;
    }
    return params;
}
You get the arguments you want, and no slicing is done. So from my point of view, you don't lose anything this way; well, maybe it uses a little extra memory and is slightly slower, but not to the point where it really makes a difference, right?
Just wanted to know if my logic here is flawed.
Here is a discussion,
and here is an introduction.
E.g. this example uses an inline slice.
It appears from the discussion that @Eason posted (here) that the debate falls in the "micro-optimization" category, i.e. most of us will never hit those performance bumps because our code isn't being run through the kind of iterations needed for it to even appear on the radar.
Here's a good quote that sums it up:
Micro-optimizations like this are always going to be a trade-off
between the code's complexity/readability and its performance.
In many cases, the complexity/readability is more important. In this case, the
very slowest method that was tested netted a runtime of 4.3
microseconds. If you're writing a webservice and you're slicing args
two times per request and then doing 100 ms worth of other work, an
extra 0.0086 ms will not be noticeable and it's not worth the time or
the code pollution to optimize.
These optimizations are most helpful in really hot loops that you're hitting a gajillionty times. Use a
profiler to find your hot code, and optimize your hottest code first,
until the performance you've achieved is satisfactory.
I'm satisfied, and will use Array.prototype.slice.call() unless I detect a performance blip that points to that particular piece of code not hitting the V8 optimizer.
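As a side note (assuming an environment with ES2015 support, which is not part of the original discussion): rest parameters avoid touching `arguments` entirely, which sidesteps the deoptimization concern without a manual copy loop. The `getRest` name is illustrative.

```javascript
// Instead of Array.prototype.slice.call(arguments, 1), declare the
// trailing arguments as a rest parameter; `rest` is a real Array,
// and `arguments` is never referenced.
function getRest(first, ...rest) {
  return rest;
}

console.log(getRest("a", "b", "c")); // ["b", "c"]
```

This is also clearer to read than either the slice call or the hand-rolled copy loop, so the complexity/readability trade-off from the quote above disappears.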
I'd like to continuously execute a piece of JavaScript code on a page, spending all available CPU time I can for it, but allowing browser to be functional and responsive at the same time.
If I just run my code continuously, it freezes the browser's UI and browser starts to complain. Right now I pass a zero timeout to setTimeout, which then does a small chunk of work and loops back to setTimeout. This works, but does not seem to utilize all available CPU. Any better ways of doing this you might think of?
Update: To be more specific, the code in question is rendering frames on canvas continuously. The unit of work here is one frame. We aim for the maximum possible frame rate.
Probably what you want is to centralize everything that happens on the page and use requestAnimationFrame to do all your drawing. So basically you would have a function/class that looks something like this (you'll have to forgive some style/syntax errors; I'm used to Mootools classes, so just take this as an outline):
var Main = function() {
    this.queue = [];
    this.actions = {};
    requestAnimationFrame(this.loop.bind(this));
};

Main.prototype.loop = function() {
    while (this.queue.length) {
        var action = this.queue.pop();
        this.executeAction(action);
    }
    // do your rendering here
    requestAnimationFrame(this.loop.bind(this));
};

Main.prototype.addToQueue = function(e) {
    this.queue.push(e);
};

Main.prototype.addAction = function(target, event, callback) {
    if (this.actions[target] === void 0) this.actions[target] = {};
    if (this.actions[target][event] === void 0) this.actions[target][event] = [];
    this.actions[target][event].push(callback);
};

Main.prototype.executeAction = function(e) {
    if (this.actions[e.target] !== void 0 && this.actions[e.target][e.type] !== void 0) {
        for (var i = 0; i < this.actions[e.target][e.type].length; i++) {
            this.actions[e.target][e.type][i](e);
        }
    }
};
So basically you'd use this class to handle everything that happens on the page. Every event handler would be onclick='main.addToQueue(event)' (where main is an instance of Main) or however you want to add your events to your page; you just point them at adding the event to the queue, and use main.addAction to direct those events to whatever you want them to do. This way every user action gets executed as soon as your canvas has finished redrawing and before it gets redrawn again. As long as your canvas renders at a decent framerate, your app should remain responsive.
EDIT: forgot the "this" in requestAnimationFrame(this.loop)
Web workers are something to try:
https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers
You can tune your performance by changing the amount of work you do per invocation. In your question you say you do a "small chunk of work". Establish a parameter which controls the amount of work being done and try various values.
You might also try setting the timeout before you do the processing. That way the time spent processing should count towards any minimum delay the browser enforces.
One technique I use is to keep a counter of iterations in my processing loop. Then I set up an interval of, say, one second, whose handler displays the counter and resets it to zero. This provides a rough performance value with which to measure the effects of changes you make.
In general this is likely to be very dependent on specific browsers, even versions of browsers. With tunable parameters and performance measurements you could implement a feedback loop to optimize in real-time.
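A rough sketch of that measurement idea (all names are invented; the chunk size of 1000 is an arbitrary tuning parameter, and the run is bounded here only so the snippet terminates):

```javascript
// Count iterations per chunk and report the rate roughly once per
// second; tuning the chunk size trades throughput for responsiveness.
var iterations = 0;
var chunksLeft = 50; // bounded here; a real render loop would run indefinitely

function processChunk() {
  for (var i = 0; i < 1000; i++) {
    // stand-in for one unit of real work (e.g. part of a frame)
  }
  iterations += 1000;
  if (--chunksLeft > 0) {
    setTimeout(processChunk, 0); // yield between chunks
  }
}

// report (and reset) the counter about once per second
var reporter = setInterval(function () {
  console.log(iterations + " iterations this second");
  iterations = 0;
}, 1000);

processChunk();
setTimeout(function () { clearInterval(reporter); }, 1100);
```

With the counter in place, you can vary the chunk size and watch how the reported rate and the page's responsiveness change, which is the feedback loop described above.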
One can use window.postMessage() to overcome the limitation on the minimum amount of time setTimeout enforces. See this article for details. A demo is available here.
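For reference, the core of that technique looks roughly like the following (`setZeroTimeout` follows the article's naming convention; the non-browser fallback is only there to keep the sketch runnable outside a page):

```javascript
// Queue callbacks and wake ourselves up with window.postMessage,
// which fires without the ~4ms minimum delay that clamped
// setTimeout(fn, 0) imposes in browsers.
var timeouts = [];
var messageName = "zero-timeout-message";

function runNextTimeout() {
  if (timeouts.length > 0) {
    (timeouts.shift())();
  }
}

function setZeroTimeout(fn) {
  timeouts.push(fn);
  if (typeof window !== "undefined") {
    window.postMessage(messageName, "*"); // message to ourselves
  } else {
    // fallback so this sketch also runs outside a browser (e.g. Node)
    setImmediate(runNextTimeout);
  }
}

if (typeof window !== "undefined") {
  window.addEventListener("message", function (event) {
    if (event.source === window && event.data === messageName) {
      event.stopPropagation();
      runNextTimeout();
    }
  }, true);
}
```

Each `setZeroTimeout` call queues one callback and posts one message, so callbacks run in order, one per message event, with essentially no enforced delay between them.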
Nowadays I am optimizing some JS code.
There is a function named appendXYZ, and it is invoked in a loop along with other functions.
It looks like the following:
function OuterFunc() {
    for (...) { // about 150 times
        ...
        appendXYZ();
        // other dependent functions
        ...
    }
}
I am now pretty sure that appendXYZ causes the high CPU usage - it can reach 50%,
but if I remove this function, CPU usage is only 1%.
When the CPU usage is at 50%, the browser is nearly frozen and the page lacks responsiveness.
What's more, OuterFunc executes every 20 seconds, and appendXYZ comes from third-party script code that I can't modify.
So how do I optimize this code?
Right now I am trying to use setTimeout, but I don't know whether it will work.
I don't know what that function does, but you could try making its invocation asynchronous.
It may or may not work, and it will still require the same amount of CPU, but it should at least free up the browser a bit.
function OuterFunc() {
    for (var i = 0; i < 150; i++) {
        // ...
        setTimeout(appendXYZ, 0);
        // other dependent functions
        // ...
    }
}
Again this may break the function. Can't tell without seeing more code.
If you're passing arguments, then you'd need something like:
function invoker(j) {
    return function() {
        appendXYZ(j);
    };
}

function OuterFunc() {
    for (var i = 0; i < 150; i++) {
        // ...
        setTimeout(invoker(i), 0);
        // other dependent functions
        // ...
    }
}
If there's nothing you can do to optimize the actual code, you can spread around the execution of the loop iterations to keep the browser responsive. According to Robert Miller's paper, the maximum amount of time you can hold up a UI and still have it feel responsive to the user is 100 milliseconds. For a technique of how to do this using setTimeout see UI responsiveness and javascript.
A possibility is that OuterFunc's execution time is longer than its repetition interval.
In other words, if OuterFunc takes longer than 20 seconds to execute while being scheduled every 20 seconds, the next call fires before the previous run has finished, so the work piles up and the browser never catches its breath.
If you are using setInterval to execute OuterFunc every 20 seconds, this can be fixed by using setTimeout calls to simulate setInterval:
(function helper() {
    OuterFunc();
    // after OuterFunc is done executing, schedule it again in 20 seconds
    setTimeout(helper, 20000);
})();
This might help only if setInterval is the cause of the browser freeze.
If it doesn't help and you don't care that much about old browsers, you could implement a sort of "threading" using web workers. That way your code gets executed in a separate thread, which keeps the UI free (a.k.a. bye bye browser freeze).
Hope this helps!