javascript setTimeout, page looks like it is endlessly loading (Firefox)

var checkTextValue = setTimeout(function () {
    var textVal = $('p').text();
    if (textVal == 'expectedValue') {
        callback();
    } else {
        setTimeout(arguments.callee, 10);
    }
}, 10);
I have this code and it works just fine, but the problem is that in Firefox the page looks like it is endlessly loading.

Looks kind of useless... I mean setTimeout(checkTextValue, 10) - what are you setting there? checkTextValue is just a timeout ID, nothing else... No idea why FF would load endlessly simply because the code is faulty...

That is because it is endlessly loading. Basically you are doing recursion and starting another instance every ten milliseconds. Given enough time, I think it is also possible to kill your browser with this code.
Try using an onchange event handler on your input field instead.
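For example, a rough sketch of that event-driven approach (it assumes the text actually comes from an <input>, which the question's code doesn't show):
$('input').on('change', function () {
    if ($(this).val() === 'expectedValue') {
        callback();
    }
});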

I think it's a case of recursion. Google 'recursion' for more clues. Just kidding. checkTextValue will keep running indefinitely unless the value is 'expectedValue'.
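If you do want to keep polling, a capped version is a safer sketch (the 500-try limit and the named poll function are illustrative, not from the question):
var tries = 0;
(function poll() {
    if ($('p').text() == 'expectedValue') {
        callback();
    } else if (++tries < 500) {
        setTimeout(poll, 10); // named function instead of the deprecated arguments.callee
    }
})();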

Related

How to keep a command running in a loop in the browser console

I have a web element that only appears while the page (or a part of the page) is still loading and disappears when the page has been completely loaded. I would like to see precisely when this element disappears, and I can do that by repeatedly running something like this in the browser console:
$("div.v-app-loading")
or alternatively:
document.getElementsByClassName('v-app-loading')
But in most cases everything happens too fast and I am unable to catch the exact moment. There must be a way to create a loop that will just run in the console and execute one of the commands I mentioned, say, every 0.5 s or even more frequently.
Could anyone point me in the right direction?
You can use JavaScript's setInterval() as follows:
function yourFunction() {
    // do something here...
}
setInterval(yourFunction, 500); // runs the function every half second (500 ms = 0.5 s)
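For instance, applied to the selector from the question (a sketch; the watcher variable name is just illustrative):
var watcher = setInterval(function () {
    console.log(Date.now(), document.getElementsByClassName('v-app-loading').length);
}, 500);
// later, once you have seen the element disappear:
// clearInterval(watcher);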
Maybe it's easier to use jQuery to detect when the page is loaded:
HTML
<body class="loading">
JS
// do something initially here
$(window).load(function () {
    // do something when finished loading
    $('body').removeClass('loading');
});
Edit: If you'd rather check for the existence of an element, do it in a recursive function call and throttle it with setTimeout so the browser gets a chance to update the DOM between checks:
function checkElement() {
    if ($('.v-app-loading').length) {
        // still there, check again shortly
        setTimeout(checkElement, 100);
    } else {
        // Element disappeared
    }
}
checkElement();

innerHTML can't be trusted: Does not always execute synchronously

To see the problem in action, see this jsbin. Clicking on the button triggers the buttonHandler(), which looks like this:
function buttonHandler() {
var elm = document.getElementById("progress");
elm.innerHTML = "thinking";
longPrimeCalc();
}
You would expect that this code changes the text of the div to "thinking", and then runs longPrimeCalc(), an arithmetic function that takes a few seconds to complete. However, this is not what happens. Instead, "longPrimeCalc" completes first, and then the text is updated to "thinking" after it's done running, as if the order of the two lines of code were reversed.
It appears that the browser does not run "innerHTML" code synchronously, but instead creates a new thread for it that executes at its own leisure.
My questions:
What is happening under the hood that is leading to this behavior?
How can I get the browser to behave the way I would expect, that is, force it to update the "innerHTML" before it executes "longPrimeCalc()"?
I tested this in the latest version of chrome.
Your surmise is incorrect. The .innerHTML update does complete synchronously (and the browser most definitely does not create a new thread). The browser simply does not bother to update the window until your code is finished. If you were to interrogate the DOM in some way that required the view to be updated, then the browser would have no choice.
For example, right after you set the innerHTML, add this line:
var sz = elm.clientHeight; // whoops that's not it; hold on ...
edit — I might figure out a way to trick the browser, or it might be impossible; it's certainly true that launching your long computation in a separate event loop will make it work:
setTimeout(longPrimeCalc, 10); // not 0, at least not with Firefox!
A good lesson here is that browsers try hard not to do pointless re-flows of the page layout. If your code had gone off on a prime number vacation and then come back and updated the innerHTML again, the browser would have saved some pointless work. Even if it's not painting an updated layout, browsers still have to figure out what's happened to the DOM in order to provide consistent answers when things like element sizes and positions are interrogated.
I think the way it works is that the currently running code completes first, then all the page updates are done. In this case, calling longPrimeCalc causes more code to be executed, and only when it is done does the page update change.
To fix this you have to have the currently running code terminate, then start the calculation in another context. You can do that with setTimeout. I'm not sure if there's any other way besides that.
Here is a jsfiddle showing the behavior. You don't have to pass a callback to longPrimeCalc, you just have to create another function which does what you want with the return value. Essentially you want to defer the calculation to another "thread" of execution. Writing the code this way makes it obvious what you're doing (Updated again to make it potentially nicer):
function defer(f, callback) {
    var proc = function () {
        var result = f();
        if (callback) {
            callback(result);
        }
    };
    setTimeout(proc, 50);
}

function buttonHandler() {
    var elm = document.getElementById("progress");
    elm.innerHTML = "thinking...";
    defer(longPrimeCalc, function (isPrime) {
        if (isPrime) {
            elm.innerHTML = "It was a prime!";
        } else {
            elm.innerHTML = "It was not a prime =(";
        }
    });
}
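To try the snippet on its own, here is a hypothetical stand-in for longPrimeCalc (the real one lives in the question's jsbin); it just burns a few seconds on a deliberately naive primality test and returns the result:
function longPrimeCalc() {
    var n = 1000000007, prime = true; // a known prime, chosen only to make the loop slow
    for (var d = 2; d < n; d++) {
        if (n % d === 0) {
            prime = false;
            break;
        }
    }
    return prime;
}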

clearInterval doesn't clearInterval

var timer;
function check_element_load() {
    timer = window.setInterval(function () {
        console.log("still working"); // keeps running forever
        if (document.getElementById("comments")) {
            console.log("FOUND"); // this actually runs
            document.getElementsByTagName("fb:comments")[0].setAttribute('order_by', 'social');
            window.clearInterval(timer); // < not effective
        }
    }, 50);
}
check_element_load();
I'm trying to put a script at the top of the page to keep checking whether a specific element has loaded in the browser. It works (the console logged "FOUND"), but when I added another console log to see whether the interval was still running, it turned out it never stops; the clearInterval is completely ignored.
Is there anything I missed? I also tried other solutions, including setTimeout, and the code above is the closest I've got. I just want the clearInterval to take effect after the condition returns true.
Is there anything similar to clearInterval that is more effective, kills the whole function or something?
As it stands, your code is logically correct. However, if
document.getElementsByTagName("fb:comments")[0].setAttribute('order_by', 'social');
throws an error, the interval will (of course) never be cleared. You could use setTimeout instead.
Most likely getElementsByTagName("fb:comments") returns an empty collection, so [0] is undefined and the setAttribute call throws a TypeError before clearInterval is ever reached.
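A minimal sketch of that fix, using the selectors from the question (clearing the interval before touching the fb:comments element, so a thrown error cannot skip it):
var timer = window.setInterval(function () {
    if (document.getElementById("comments")) {
        window.clearInterval(timer); // stop polling first, nothing below can prevent it now
        var fbComments = document.getElementsByTagName("fb:comments")[0];
        if (fbComments) {
            fbComments.setAttribute('order_by', 'social');
        }
    }
}, 50);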

Delay page close with Javascript?

Now, I understand that it's bad practice to delay a page close, and that there are better ways to handle that kind of stuff, but just for future reference, is there a way to delay the page closing? Something like
window.onunload = unload();
function unload()
{
    setTimeout("self.close()", 1000)
}
Thanks!
If you really need (ie. ready to resort to semi-hacks) to delay the page closing without showing a confirmation dialog, etc, you can do something like the following:
function delay(ms) {
    var start = +new Date;
    while ((+new Date - start) < ms);
}

// start image loading (I assume you need this for tracking?)
delay(150);
The caveats are obvious: it will not always work and you cannot delay for too long. That being said, if you are really interested in this, you can probably get results of over 95% (really depends on the server response time).
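For instance, a hedged sketch of how that might be combined with an unload handler (the /track URL is purely illustrative, and the 150 ms figure is the one from this answer):
window.onunload = function () {
    var img = new Image();
    img.src = '/track?event=unload'; // hypothetical tracking pixel
    delay(150); // busy-wait so the request gets a head start before the page is torn down
};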
Timeouts don't run in onbeforeunload; this is to protect the user from being trapped on a page they are trying to leave.
The only way to prevent the page from exiting after the user attempts to leave is by putting synchronous code in the onbeforeunload/onunload handler.
But there is something you can do!!!
for (var i = 0; i < 2000; i++) {
    console.log(i);
}
A loop like this, printing to the console, will delay the unload of the page. The higher the loop count, the longer the delay.

DOM style change waiting for pause

When this function is called, the style change on the "gif" element does not show up until lotsOfProcessing() finishes. However, when I uncomment the alert("test"), the style change is shown before the alert pops up.
What I am trying to do is have an animated GIF displayed while lotsOfProcessing is running. This seemed like a pretty straightforward solution, but it is clearly not working. Any suggestions / solutions?
function nameOfFunction()
{
    document.getElementById("gif").style.display = "inline";
    //alert("test");
    lotsOfProcessing();
}
JavaScript code executes on the same thread as the browser's rendering. Everything that needs to be drawn waits for JavaScript execution to complete - including the next frame of any GIF animation.
The only solution is to break your long processing code down into smaller parts and delay each part using timers.
For example:
function nameOfFunction() {
    document.getElementById("gif").style.display = "inline";
    //alert("test");
    lotsOfProcessing();
}
function lotsOfProcessing() {
    var i = 0;
    window.setTimeout(function () {
        partOfIntenseProcessing();
        if (i < 1000000) {
            i++;
            window.setTimeout(arguments.callee, 10);
        }
    }, 10);
}
This will delay how long it will take for your processing to complete, but between timer execution the GIF can continue to animate.
You can also take a look at Web Workers, which allow you to run JavaScript operations in a background thread. However, they are not widely implemented yet (read: not available in Internet Explorer).
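As a rough sketch of that approach (it assumes the heavy function can be moved into its own script file, named processing-worker.js here purely for illustration):
// processing-worker.js (hypothetical file name)
//   self.onmessage = function () {
//       self.postMessage(lotsOfProcessing()); // heavy work runs off the main thread
//   };

// main page
document.getElementById("gif").style.display = "inline";
var worker = new Worker('processing-worker.js');
worker.onmessage = function (e) {
    document.getElementById("gif").style.display = "none"; // animation ran freely meanwhile
    console.log('done:', e.data);
};
worker.postMessage(null); // start the heavy work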
Perform your heavy processing in a delayed function with window.setTimeout():
function nameOfFunction()
{
document.getElementById("gif").style.display = "inline";
window.setTimeout(lotsOfProcessing, 10);
}
That's strange indeed. It seems like lotsOfProcessing gets JavaScript's single thread before the DOM has time to refresh, but it's the first time I hear of something like that.
You might try this (note that it is not an ideal solution):
function nameOfFunction()
{
    document.getElementById("gif").style.display = "inline";
    setTimeout(lotsOfProcessing, 100);
}
This is a vaguely educated guess, but it may be worth trying to put document.getElementById("gif").style.display = "inline"; into a function of its own, e.g.:
function nameOfFunction()
{
    showGif();
    //alert("test");
    lotsOfProcessing();
}
function showGif() {
    document.getElementById("gif").style.display = "inline";
}
My thinking is that perhaps the lotsOfProcessing() is getting hoisted to the top of nameOfFunction() because it's a function and therefore getting processed first.
