JQuery domready and window ready randomly working? - javascript

I've noticed lately that sometimes the domready and window.load handlers do not run. They seem to work only intermittently when I enter the page or refresh it.
Say I have:
$(function(){
    $('.hide').hide();
    // disable html5 native validation to let jquery handle
    $('form').attr('novalidate','novalidate');
});
$(window).load(function(){
    $('.input').click(function(){
        $(this).animate({opacity:0.8});
    }).blur(function(){
        $(this).animate({opacity:1});
    });
});
Sometimes when I load the page the element is not hidden; sometimes the input fields don't animate; and the two don't necessarily fail together. If I refresh the page a few times, it works.
I always thought that domready executes as soon as the DOM is ready, and that window.load waits until everything on the page has finished loading. Or is this more bugs from HTML5?
Question is: am I missing something or just misunderstanding something?
Edit: Notably Chromium. I am on Ubuntu, so I would not be surprised if it was a chromium bug.

Be aware that a very complex HTML structure can delay the point at which the DOM becomes ready. The browser tries to render the page as quickly as it possibly can, so on a really complex page it's possible that rendering begins, and content becomes visible, before the domready event fires and the code you set up actually runs.
A jQuery domready block runs as fast as it can, but only from the point where it is registered. If you put, say:
setTimeout(function () {
    $(document).ready(function () { alert('finally'); });
}, 9000);
that "as fast as it can" is still limited by where the code occurs, in this case after a 9-second timeout.
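It is worth noting that jQuery still runs a ready handler even when it is registered after the DOM is already ready, which is why the timeout example above still fires. The sketch below illustrates that mechanism with a hypothetical ready-gate; the names here are illustrative, not jQuery's internals.

```javascript
// Minimal ready-gate: callbacks registered before the event fires are
// queued; callbacks registered after it has fired run immediately.
function createReadyGate() {
  var fired = false;
  var queue = [];
  return {
    ready: function (cb) {
      if (fired) {
        cb();           // DOM already ready: run straight away
      } else {
        queue.push(cb); // not ready yet: defer until fire()
      }
    },
    fire: function () {
      fired = true;
      var pending = queue.splice(0, queue.length);
      pending.forEach(function (cb) { cb(); });
    }
  };
}
```

Because of this behavior, a ready handler never "misses" the event by being registered late; intermittent failures like the ones described above usually point to a script error or caching problem instead.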

Related

How to detect DOM update completion after AJAX page load in a Chrome Extension?

I'm trying to identify roughly when the DOM is finished updating after a page is loaded via AJAX on any arbitrary website.
My current method first listens for the chrome.webNavigation.onHistoryStateUpdated event in a background script, then executes a content script in which a MutationObserver detects changes to the website's body. From there, unfortunately, it seems like it's a bit more finicky. If I just wait for the first mutation where nodes are added to the DOM, I wind up in many cases (YouTube, to give one example) where the page is still blank. Other more hacky approaches I've considered include things like just using setTimeout or waiting for the page to reach a certain length, but those seem clearly wide open to exception cases.
Is there a more fool-proof way to detect that the DOM has roughly finished updating? It doesn't necessarily have to be perfectly precise, and erring on the side of triggering late in my use case is better than triggering early. Also it isn't important at all that resources like video and images be fully loaded, just that the text contents of the page are basically in place.
Thanks for your help!
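One common way to approximate "the DOM has roughly finished updating", along the lines described above, is to debounce the MutationObserver callback: treat the page as settled once no mutations have arrived for some quiet period. This is a sketch under that assumption (the 500 ms quiet period is arbitrary); the MutationObserver wiring is shown in comments since it is a browser API.

```javascript
// Debounce: collapses a burst of calls into one call, fired after
// `wait` ms of silence following the last call.
function debounce(fn, wait) {
  var timer = null;
  return function () {
    var args = arguments, self = this;
    if (timer) clearTimeout(timer);
    timer = setTimeout(function () {
      timer = null;
      fn.apply(self, args);
    }, wait);
  };
}

// In the content script (browser only):
// var settled = debounce(function () {
//   observer.disconnect();
//   console.log('DOM looks settled');
// }, 500);
// var observer = new MutationObserver(settled);
// observer.observe(document.body, { childList: true, subtree: true });
```

This errs on the side of triggering late, which matches the stated use case: a long burst of mutations keeps pushing the deadline back, and the callback only fires once the page goes quiet.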

Avoiding a freeze when executing all javascript functions onDomReady / onDomLoaded

So I want to migrate all my javascript functions to requireJS, and will try to use the ondomready event as much as possible. BUT:
this freezes the browser, since all the javascript runs synchronously. This is bad. Sure, the user sees the page content a bit sooner, but then tries to click somewhere, realizes the browser is frozen, and has to click again. This is very bad. Is there a way around this?
Patient: It hurts when I do this.
Doctor: Then don't do that.
If you see freezing on the dom ready event then perhaps you are trying to do too much. Executing javascript is quick. Doing a page redraw is slow.
Rather than lots of small events that each make changes to the DOM and each cause a page redraw, you should have one function that processes a list of changes that need to be made. This is what the domReady plugin does before the ready event; after the ready event it just runs callbacks as it receives them, which can cause multiple redraws.
I learnt this while writing my own animation library. I was using an individual setInterval() to change each property, and when running more than four at once the animation was no longer smooth. A better way is a single interval that processes a list of changes that need to be made.
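The batching idea described above can be sketched as a small queue: changes accumulate, and a single scheduled flush applies them all in one pass. This is an illustrative sketch, not the domReady plugin's actual code; setTimeout(0) stands in for requestAnimationFrame so it also runs outside a browser.

```javascript
// Batch many small changes into a single flush so the browser does
// one redraw instead of many.
function createBatcher(applyAll) {
  var queue = [];
  var scheduled = false;
  function flush() {
    scheduled = false;
    // one pass over everything that accumulated since the last flush
    applyAll(queue.splice(0, queue.length));
  }
  return function enqueue(change) {
    queue.push(change);
    if (!scheduled) {
      scheduled = true;
      setTimeout(flush, 0); // in a browser: requestAnimationFrame(flush)
    }
  };
}
```

Callers enqueue changes freely; applyAll receives them all together, so DOM writes can be grouped and the layout is only invalidated once per flush.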
Edit:
Instead of using domReady as a plugin (require(["domReady!"], ...)), use it as a module, so that you can run initialisation code straight away and make changes to the DOM later:
require(["domReady"], function(domReady) {
    var element = document.createElement('table');
    // more setup code
    domReady(function() {
        document.body.appendChild(element);
    });
});

Ninja Kill a Rogue JavaScript Event

Here's the plot, which is a True Story (a problem that exists for a real person, that is - me):
You are working on a large enterprise site, which includes a lot of JavaScript and specifically jQuery code you don't have any control of, and can't possibly change (good luck even finding out who wrote it, or why). Layers of authentication and authority are involved, so just pretend it's written in stone and you can't touch it.
Somewhere in this code, there is an event that scrolls the document to the top of the page after it has loaded. "OK, that sounds harmless" one might think - but it is now your task to scroll the page to a specific item based on a query string or anchor.
Everything works fine generally, but when you click a link that goes to example.com/list#item11, the browser works as expected and you go directly down to the item you want to link to...and then, whammo, the page instantly jumps back to the top of the page.
Now, you might say "well, that's what document.ready() is for!" ...to your horror, you find that the rogue event comes along anyway.
After Stack Overflow searching for an even later event to tie into, you find this gem:
$(document).ready(function(e) {
    $(window).load(function(e) {
        // ...
    });
});
And surely, this will definitely work! Only, it does not. You try return false and e.preventDefault(), but that does nothing for you here.
All you can be sure of is that this rogue scrolling event occurs after your code runs, after the DOM is ready, and definitely after the window.load() event. You are sure of nothing else.
Can you assassinate this rogue event? Is there some mechanism to intercept scroll events and prevent them from occurring? Can you hook into some even later event, like "the DOM is ready, the window is loaded, the page is settled, the children are in bed, and all other events are done being handled... event"?
The only solutions I can imagine now are "give up - scrolling behavior on page load isn't going to work in your scenario", "use a timer and wait! then commit seppuku for being such a dirty hack!", and "ninja-assassination mission!" (since I don't know who wrote the offending code, I'd have to settle for killing their code instead of them - and I'm sure they had their reasons, or have already been assassinated... or at least waiting for the code to pass and do my thing).
Is there some Better Way, some hard to find function, some last resort that invokes the arcane Dark Lords of Reflection, or is it time to give up and solve the problem another way?
TLDR;
How do you stop a disruptive scripted event - like scrolling - from occurring when you can't change the code that is causing it? Acceptable answers include how to make certain your code runs after - without using a timer hack! - and/or if your code always runs first how do you prevent the later code from messing up yours?
It might be helpful to find out how the event is defined, and what events are firing, but I feel that this is a separate question and may not necessarily be required to fix the situation. For illustration purposes, assume there are thousands of active events and listeners spread out across dozens of minified script files. It may just be so hard to narrow down what exactly is happening as to be too much trouble to even try.
The best solution would be to edit the source code where the ready handler is declared.
If you can't, copy that code somewhere else and edit the copy.
If that is totally impossible, then:
You cannot unbind the ready event, because that can cause problems.
You can wrap the window.scrollTo() function instead (note that window has no prototype property to override, so you replace the function directly):
var originalScrollTo = window.scrollTo;
window.scrollTo = function (x, y) {
    /* Look in the url for a hash */
    if (window.location.hash) {
        // if yes, swallow the rogue scroll
        return;
    }
    // if not, fall through to the original
    originalScrollTo.apply(window, arguments);
};
Smack them with a timer if everything else fails:
$(document).ready(function() {
    window.setTimeout(function() {
        document.location = "#bottom";
    }, 200);
});
Live test case.
Ugly, but working.
I would wrap the window.scrollTo function to try and catch the burglar in the act. If you know how it's done, it's easier to get rid of it.
If this rogue call is not embedded in too huge a pile of JQuery goo, it could even allow to trace the call to the original culprit, who would soon be smitten with great vengeance and furious anger.
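A sketch of that "catch the burglar" wrapper, written as a generic spy so the wrapping logic is clear: in a real page you would call spyOn(window, 'scrollTo', function () { console.trace('scrollTo'); }) and read the stack trace in the console to find the rogue caller. The spyOn helper and its names are illustrative, not a library API.

```javascript
// Wrap a method on an object with a spy that records each call
// before delegating to the original implementation.
function spyOn(obj, name, onCall) {
  var original = obj[name];
  var calls = [];
  obj[name] = function () {
    var args = Array.prototype.slice.call(arguments);
    calls.push(args);         // remember the arguments of every call
    if (onCall) onCall(args); // e.g. console.trace to expose the caller
    return original.apply(obj, args);
  };
  return calls;
}
```

Install the spy as early as possible (a script in the document head), so it is already in place when the rogue handler eventually fires.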

Firefox onLocationChange not always called

I am building a firefox extension that creates several hidden browser elements.
I would like to addProgressListener() to handle onLocationChange for the page that I load. However, my handler does not always get called.
More specifically, here's what I'm doing:
1. Create a browser element, without setting its src property
2. Attach it to another element
3. Add a progress listener listening for onLocationChange to the browser element
4. Call loadURIWithFlags() with the desired url and post data
I expect the handler to be called every time after step 4, but sometimes it is not (it seems to get stuck on the same pages, though).
Interestingly, if I wrap steps 3 and 4 inside a setTimeout(..., 5000); it works every time.
I've also tried shuffling some of the steps around, but it did not have any effect.
The bigger picture: I would like to be reliably notified when browser's contentDocument is that of the newly loaded page (after redirects). Is there a better way to do this?
Update: I've since opened a bug on mozilla's bug tracker with a minimal xulrunner app displaying this behavior, in case anybody wants to take a closer look: https://bugzilla.mozilla.org/show_bug.cgi?id=941414
In my experience developing with Firefox, I've found that in some cases the initialization code for various elements acts as if it were asynchronous. In other words, when you've finished executing
var newBrowser = window.document.createElement('browser');
newBrowser.setAttribute('flex', '1');
newBrowser.setAttribute('type', 'content');
cacheFrame.insertBefore(newBrowser, null);
your browser may not actually be ready yet. When you add the delay, things have time to initialize, so they work fine. Additionally, when you do things like dynamically creating browser elements, you're likely doing something that very few people have tried before. In other words, this sounds like a bug in Firefox, and probably one that will not get much attention.
You say you're using onLocationChange so that you can know when to add a load listener. I'm going to guess that you're adding the load listener to the contentDocument since you mentioned it. What you can do instead is add the load listener to the browser itself, much like you would with an iframe. If I replace
newBrowser.addProgressListener(listener);
with
newBrowser.addEventListener("load", function(e) {
    console.log('got here! ' + e.target.contentDocument.location.href);
}, false);
then I receive notifications for each browser.

How to disable AJAX-y links before page Javascript ready to handle them?

I am implementing a shopping cart for my website, using a pseudo-AJAX Lightbox-esque effect. (It doesn't actually call the server between requests -- everything is just Prototype magic to update the displayed values.)
There is also semi-graceful fallback behavior for users without Javascript: if they click add to cart they get taken to an (offsite, less-desirable-interaction) cart.
However, a user with Javascript enabled who loads the page and then immediately hits add to cart gets whisked away from the page, too. I'd like to have the Javascript just delay them for a while, then execute the show cart behavior once it is ready. In the alternative, just totally ignoring clicks before the Javascript is ready is probably viable too.
Any suggestions?
I now do this with jQuery, because I vaguely recall browser differences that jQuery takes care of:
Try
$(document).ready(function() {
    // put all your jQuery goodness in here.
});
Is your code really that slow that this is an issue? I'd be willing to bet that no one is going to be buying your product that soon after loading the page. In any reasonable case, the user will wait for the page to load before interacting with it, especially for something like a purchase.
But to answer your original question, you can disable the links in normal code, then reenable them using a document.observe("dom:loaded", function() { ... }) call.
You can try hiding the links in CSS, then showing them when the page loads.
<script type="text/javascript">
document.write('<link rel="stylesheet" type="text/css" href="someCssThatHidesLinks.css" />');
window.onload = function() {
    // show links
};
</script>
This way, if your users don't have javascript, their links are still active; if they do, the links are hidden until you load and activate them. I've never done this before, but I think it should work if you want to retain the features of your failsafe page.
However, a user with Javascript enabled who loads the page and then immediately hits add to cart gets whisked away from the page, too.
When are you starting up your scripts? If you're using document onload, it will be waiting for all the images to download before initialising, which would indeed give the user a chance to click-to-buy.
If you trigger the JS enhancement when the DOM is ready, either by just putting your <script> at the bottom of the page, or by using a DOMContentLoaded-style event supplied by your framework, the links should adapt fast enough that the user is very unlikely to be clicking the button first.
If you really, really want to delay clicks that are placed between the element first being parsed, and the document being loaded, you could write something like:
<script>
function methodcall(obj, name) {
    return function() {
        obj[name].call(obj);
    };
}
</script>
...
Buy
...
So that it'd just spin there polling (and, on IE, leaking memory) until the link's real click handler was in place. It's pretty ugly though, and fragile in that if your script breaks for some reason (and it's JavaScript, so it's pretty likely that at some point, on some browser, it will), your button will break. This problem applies to all potential solutions that rely on a later-running JavaScript enabling previously-disabled actions.
And there's nothing worse for business than a broken buy button.
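An alternative to polling that keeps the failure mode smaller is to attach a deferred handler inline, so it exists as soon as the element is parsed: early clicks are swallowed and remembered, and once your DOM-ready code calls init() the real handler takes over and replays the click that arrived too early. This is a sketch of that pattern, not code from any of the frameworks mentioned above; the names are illustrative.

```javascript
// Deferred click handling: before init(), clicks are swallowed and the
// last one is remembered; after init(), the real handler runs directly.
function createDeferredHandler(realHandler) {
  var ready = false;
  var pendingEvent = null;
  function handler(ev) {
    if (!ready) {
      if (ev && ev.preventDefault) ev.preventDefault(); // stop the non-JS fallback
      pendingEvent = ev; // remember the click that came too early
      return;
    }
    realHandler(ev);
  }
  handler.init = function () {
    ready = true;
    if (pendingEvent !== null) {
      var ev = pendingEvent;
      pendingEvent = null;
      realHandler(ev); // replay the early click
    }
  };
  return handler;
}
```

The same caveat from the text applies: if the script that calls init() ever fails to run, the button stays dead, so the plain non-JS fallback link remains the safest baseline.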
