Now, I understand that it's bad practice to delay a page close, and that there are better ways to handle that kind of stuff, but just for future reference, is there a way to delay the page closing? Something like
window.onunload = unload;

function unload() {
    setTimeout(function () {
        self.close();
    }, 1000);
}
Thanks!
If you really need (ie. ready to resort to semi-hacks) to delay the page closing without showing a confirmation dialog, etc, you can do something like the following:
function delay(ms) {
    var start = +new Date;
    while ((+new Date - start) < ms);
}

// start image loading (I assume you need this for tracking?)
delay(150);
The caveats are obvious: it will not always work and you cannot delay for too long. That being said, if you are really interested in this, you can probably get results of over 95% (really depends on the server response time).
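For completeness, here is a minimal sketch of how the busy-wait might be wired up, assuming the goal is to let a tracking request leave before the page dies (the endpoint URL and the 150 ms figure are illustrative assumptions):
window.onbeforeunload = function () {
    var img = new Image();
    img.src = '/track?event=unload'; // hypothetical tracking endpoint; starts the request
    delay(150); // busy-wait synchronously so the request has a chance to go out
};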
onbeforeunload doesn't work with timeouts; browsers ignore them to protect the user from being trapped on the page.
The only way to prevent the page from exiting immediately after the user attempts to leave is by putting synchronous code in the onbeforeunload/onunload handler.
But there is something you can do!!!
for (var i = 0; i < 2000; i++) {
    console.log(i);
}
This for loop, which prints to the console, delays the unload of the page. The higher the loop count, the longer the delay.
Say I have 20 lines of JS code, and I want the interpreter to execute only the first half (lines 1-10) and then stop, without wrapping code in functions and returns, and without commenting out the rest of the code (I already tried a return; see below).
A location.reload(true); on line 10 comes close, but I'm looking for a client-side stop.
My question
Is there like a stop command (or functionality) in JavaScript, that asks the interpreter to stop and behave as if no code ran so far?
Why I ask
The background for this question is a problem I have calling a function from more than one keydown event.
Given that the keydown event is triggered only once, I am considering sending the interpreter back to the start after the keydown event has fired, without refreshing the page (sorry if this seems absurd; I'm new to JS and failed to find the source of the bug).
Of course, the above question is different from the question "why is the keydown event triggered only once", which I already asked; here's a link for context.
Preventing an XY problem
On the one hand, I want to make sure there is no XY problem. On the other hand, I am not allowed to copy the previous question into this one, hence the link above.
Either way, I would be glad to know if what I just described (a client-side stop of the JS interpreter) is even possible in the current release of the language.
Note: I decided to carefully rewrite the question after some comments earlier today (there were no answers), and did my best to ensure the question is informative and useful to the community.
There is no stop command, but I have run into the need for one before, with a long-running client-side operation.
The solution:
1) Divide the problem into small packets
2) Make sure you are able to make your function work only for activeMilliseconds milliseconds:
function doStuff(packets, currentIndex, activeMilliseconds) {
    var start = new Date(); // Start of this chunk
    while ((currentIndex < packets.length) && (new Date() - start < activeMilliseconds)) {
        // Do something with packets[currentIndex]
        currentIndex++;
    }
    return currentIndex;
}
3) Now that we are able to work for activeMilliseconds milliseconds, we need to use this asynchronously:
// Define packets
var currentIndex = 0;
var intervalID = setInterval(function() {
    if ((currentIndex = doStuff(packets, currentIndex, activeMilliseconds)) >= packets.length) clearInterval(intervalID);
}, totalMilliseconds);
Note: totalMilliseconds > activeMilliseconds should be true. For example, if totalMilliseconds is 250 and activeMilliseconds is 200, then every 250 milliseconds a chunk will run for 200 milliseconds, leaving the browser free to do its own work for 50 milliseconds out of every 250, even if there is a lot of work to do.
4) Make sure a job stops a previous similar job:
function doJob(packets, previousIntervalID, activeMilliseconds, totalMilliseconds) {
    // Stop the previous similar job
    clearInterval(previousIntervalID);
    // Define packets
    var currentIndex = 0;
    var intervalID = setInterval(function() {
        if ((currentIndex = doStuff(packets, currentIndex, activeMilliseconds)) >= packets.length) clearInterval(intervalID);
    }, totalMilliseconds);
    return intervalID;
}
If you use this idea for your key event, each new job will stop the previous one; the maximum wait before it does so is activeMilliseconds, which is an acceptable compromise in my opinion.
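As a rough sketch, wiring this to a keydown event might look like the following (the handler and the timing values of 200/250 ms are illustrative assumptions, not part of the original code):
var keyJobIntervalID = null;
document.addEventListener('keydown', function () {
    // Each keydown cancels the previous job and starts a fresh one
    keyJobIntervalID = doJob(packets, keyJobIntervalID, 200, 250);
});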
That said, this methodology should only be used when you have no other option. Keep in mind that JavaScript is single-threaded, so even if you trigger an event while a previous handler is still running, the new handler will be executed sequentially after the current one finishes.
I'm writing a "Game of Life" in JavaScript. I have all the logic done in a function called doGeneration(). I can repeatedly call this from the console and everything goes as planned; however, if I put it in a while loop, the execution blocks the UI and I just see the end result (eventually).
while (existence) {
    doGeneration();
}
If I add a setTimeout(), even with a generation limit of say 15, the browser actually crashes (Canary, Chrome).
while (existence) {
    setTimeout(function() {
        doGeneration();
    }, 100);
}
How can I call doGeneration() once every second or so without blocking the DOM/UI?
You want setInterval
var intervalId = setInterval(function() {
    doGeneration();
}, 1000);
// call this to stop it
clearInterval(intervalId);
I would use requestAnimationFrame(doGeneration). The idea of this function is to let the browser decide at what interval the game logic or animation is executed. This comes with potential benefits, such as the loop pausing automatically while the tab is not visible.
https://hacks.mozilla.org/2011/08/animating-with-javascript-from-setinterval-to-requestanimationframe/
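As a rough sketch, a requestAnimationFrame loop throttled to roughly one generation per second might look like this (the 1000 ms threshold is an assumption based on the question; existence is the flag from the question):
var lastTick = 0;
function loop(timestamp) {
    // Only run a generation if at least a second has passed
    if (timestamp - lastTick >= 1000) {
        lastTick = timestamp;
        doGeneration();
    }
    if (existence) {
        requestAnimationFrame(loop);
    }
}
requestAnimationFrame(loop);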
Rather than using setInterval or setTimeout and assuming some arbitrary time interval will be enough for the UI to update, you should/could make doGeneration smart enough to call itself after the DOM has been updated, and only if the condition of existence is satisfied.
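One possible shape of that, assuming existence is the flag from the question and 1000 ms is the desired pace:
function doGeneration() {
    // ... compute and render the next generation ...
    // Schedule the next run only after this one has finished, so the
    // browser gets a chance to repaint between generations.
    if (existence) {
        setTimeout(doGeneration, 1000);
    }
}
doGeneration();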
This is a very simple use case. Show an element (a loader), run some heavy calculations that eat up the thread, and hide the loader when done. I am unable to get the loader to actually show up prior to starting the long-running process. It ends up showing and hiding after the long-running process. Is adding CSS classes an async process?
See my jsbin here:
http://jsbin.com/voreximapewo/12/edit?html,css,js,output
To explain what a few others have pointed out: This is due to how the browser queues the things that it needs to do (i.e. run JS, respond to UI events, update/repaint how the page looks etc.). When a JS function runs, it prevents all those other things from happening until the function returns.
Take for example:
function work() {
    var arr = [];
    for (var i = 0; i < 10000; i++) {
        arr.push(i);
        arr.join(',');
    }
    document.getElementsByTagName('div')[0].innerHTML = "done";
}
document.getElementsByTagName('button')[0].onclick = function() {
    document.getElementsByTagName('div')[0].innerHTML = "thinking...";
    work();
};
(http://jsfiddle.net/7bpzuLmp/)
Clicking the button here will change the innerHTML of the div, and then call work, which should take a second or two. And although the div's innerHTML has changed, the browser doesn't get a chance to update how the actual page looks until the event handler has returned, which means waiting for work to finish. But by that time, the div's innerHTML has changed again, so that when the browser does get a chance to repaint the page, it simply displays 'done' without ever displaying 'thinking...'.
We can, however, do this:
document.getElementsByTagName('button')[0].onclick = function() {
    document.getElementsByTagName('div')[0].innerHTML = "thinking...";
    setTimeout(work, 1);
};
(http://jsfiddle.net/7bpzuLmp/1/)
setTimeout works by putting a call to a given function at the back of the browser's queue after the given time has elapsed. The fact that it's placed at the back of the queue means it'll be called after the browser has repainted the page (since the previous HTML-changing statement will have queued up a repaint before setTimeout added work to the queue), so the browser has had a chance to display 'thinking...' before starting the time-consuming work.
So, basically, use setTimeout.
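Applied to the loader case from the question, that might look like this (the selector, class name, and function name are assumptions):
document.querySelector('.loader').classList.add('visible'); // show the loader
setTimeout(function () {
    runHeavyCalculations(); // hypothetical long-running work
    document.querySelector('.loader').classList.remove('visible'); // hide it when done
}, 1); // yield first so the browser paints the loader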
Let the current frame render, then start the process with setTimeout(work, 1).
Alternatively, you could query a layout property to force a synchronous reflow, like this: element.clientWidth.
More of a "what is possible" answer: you can run your calculations on a separate thread using HTML5 Web Workers.
This will not only let your loading icon appear, but also keep it animating while the work runs.
More info about web workers : http://www.html5rocks.com/en/tutorials/workers/basics/
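A minimal sketch of that setup, assuming a separate worker script and hypothetical helper names (showLoader, hideLoader, heavyCalculation, inputData):
// main.js
var worker = new Worker('worker.js'); // 'worker.js' is an assumed filename
showLoader(); // hypothetical: reveal the loading icon
worker.onmessage = function (e) {
    hideLoader(); // hypothetical: hide the loading icon
    console.log('result:', e.data);
};
worker.postMessage(inputData); // kick off the heavy calculation off the main thread

// worker.js
self.onmessage = function (e) {
    var result = heavyCalculation(e.data); // hypothetical long-running function
    self.postMessage(result);
};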
I have a simple html page containing a large table with more than 2000 rows. I have jQuery code written for searching and sorting in that table. It takes quite some time for searching and sorting (which is understandable).
What I want is to put a screen blocker in place while the script is searching or sorting the table. This behavior is observable during AJAX calls on many websites, where it can be achieved by implementing jQuery's ajaxStart and ajaxComplete handlers.
Is there any such method that can be used to put up a screen blocker for a long-running script? If not, what is the alternative?
I would recommend breaking it up and iterate with setTimeout.
For example, instead of:
function example1() {
    for (var i = 0; i < 1000; i++) {
        // SOME CODE
    }
}
You could write:
function example2() {
    var i = 0;
    helper();

    function helper() {
        // SOME CODE
        if (++i < 1000) {
            setTimeout(helper, 0);
        }
    }
}
You don't have to have every iteration in a different callback. You could convert 1000 iterations in one function call into 100 function calls of 10 iterations each, or whatever is most suitable in your case; the idea is not to block the user interface long enough for the user to notice.
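For example, a chunked variant might look like this (the chunk size of 10 is just the figure mentioned above):
function example3() {
    var i = 0;
    helper();

    function helper() {
        // Run up to 10 iterations per timeout instead of one
        var end = Math.min(i + 10, 1000);
        for (; i < end; i++) {
            // SOME CODE
        }
        if (i < 1000) {
            setTimeout(helper, 0); // yield to the browser between chunks
        }
    }
}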
Another idea would be to use Web Workers if you can but this will not work on older browsers (which may or may not be a problem for you, if you're writing a browser extension or you know what your users will use, etc.).
If you do it the way you explained in your question then you will make the browser completely unresponsive during your calculations and you will most likely trigger a "slow script - do you want to kill it?" kind of warning.
jQuery blockUI will block elements or the page and is very customizable.
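For instance, a minimal sketch with blockUI might look like this (it assumes the plugin is loaded; sortTable is a hypothetical stand-in for your sort/search routine, and the 50 ms delay is an arbitrary choice):
$.blockUI({ message: '<h3>Processing...</h3>' });
setTimeout(function () {
    sortTable(); // hypothetical long-running work
    $.unblockUI();
}, 50); // yield briefly so the blocker actually paints first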
When looking to improve a page's performance, one technique I haven't heard mentioned before is using setTimeout to prevent javascript from holding up the rendering of a page.
For example, imagine we have a particularly time-consuming piece of jQuery inline with the html:
$('input').click(function () {
    // Do stuff
});
If this code is inline, we are holding up the perceived completion of the page while jQuery is busy attaching a click handler to every input on the page.
Would it be wise to spawn a new thread instead:
setTimeout(function() {
    $('input').click(function () {
        // Do stuff
    });
}, 100);
The only downside I can see is that there is now a greater chance the user clicks on an element before the click handler is attached. However, this risk may be acceptable and we have a degree of this risk anyway, even without setTimeout.
Am I right, or am I wrong?
The actual technique is to use setTimeout with a time of 0.
This works because JavaScript is single-threaded. A timeout doesn't cause the browser to spawn another thread, nor does it guarantee that the code will execute in the specified time. However, the code will be executed when both:
The specified time has elapsed.
Execution control is handed back to the browser.
Therefore calling setTimeout with a time of 0 can be considered as temporarily yielding to the browser.
This means if you have long running code, you can simulate multi-threading by regularly yielding with a setTimeout. Your code may look something like this:
var batches = [...]; // Some array
var currentBatch = 0;
// Start long-running code, whenever browser is ready
setTimeout(doBatch, 0);
function doBatch() {
    if (currentBatch < batches.length) {
        // Do stuff with batches[currentBatch]
        currentBatch++;
        setTimeout(doBatch, 0);
    }
}
Note: While it's useful to know this technique in some scenarios, I highly doubt you will need it in the situation you describe (assigning event handlers on DOM ready). If performance is indeed an issue, I would suggest looking into ways of improving the real performance by tweaking the selector.
For example if you only have one form on the page which contains <input>s, then give the <form> an ID, and use $('#someId input').
setTimeout() can be used to improve the "perceived" load time -- but not the way you've shown it. Using setTimeout() does not cause your code to run in a separate thread. Instead setTimeout() simply yields the thread back to the browser for (approximately) the specified amount of time. When it's time for your function to run, the browser will yield the thread back to the javascript engine. In javascript there is never more than one thread (unless you're using something like "Web Workers").
So, if you want to use setTimeout() to improve performance during a computation-intensive task, you must break that task into smaller chunks, and execute them in-order, chaining them together using setTimeout(). Something like this works well:
function runTasks(tasks, idx) {
    idx = idx || 0;
    tasks[idx++]();
    if (idx < tasks.length) {
        setTimeout(function() { runTasks(tasks, idx); }, 1);
    }
}
runTasks([
    function() {
        /* do first part */
    },
    function() {
        /* do next part */
    },
    function() {
        /* do final part */
    }
]);
Note:
The functions are executed in order. There can be as many as you need.
When the first function returns, the next one is called via setTimeout().
The timeout value I've used is 1. This is sufficient to cause a yield, and the browser will take the thread if it needs it, or allow the next task to proceed if there's time. You can experiment with other values if you feel the need, but usually 1 is what you want for these purposes.
You are correct that there is a greater chance of a "missed" click, but with a low timeout value, it's pretty unlikely.