Firefox onLocationChange not always called - javascript

I am building a firefox extension that creates several hidden browser elements.
I would like to addProgressListener() to handle onLocationChange for the page that I load. However, my handler does not always get called.
More specifically, here's what I'm doing:
1. Create a browser element, without setting its src property
2. Attach it to another element
3. Add a progress listener listening for onLocationChange to the browser element
4. Call loadURIWithFlags() with the desired URL and POST data
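In code, the sequence looks roughly like this (a minimal sketch of the four steps; cacheFrame, url, and postData, an nsIInputStream, are placeholders, and the listener is abbreviated):
Components.utils.import("resource://gre/modules/XPCOMUtils.jsm");
var newBrowser = document.createElement('browser');  // 1. create, no src set
newBrowser.setAttribute('type', 'content');
cacheFrame.appendChild(newBrowser);                  // 2. attach to another element
newBrowser.addProgressListener({                     // 3. listen for location changes
  QueryInterface: XPCOMUtils.generateQI([Components.interfaces.nsIWebProgressListener,
                                         Components.interfaces.nsISupportsWeakReference]),
  onLocationChange: function(webProgress, request, location) {
    dump('location changed: ' + location.spec + '\n');
  }
});
newBrowser.loadURIWithFlags(url,                     // 4. load the URL with POST data
  Components.interfaces.nsIWebNavigation.LOAD_FLAGS_NONE, null, null, postData);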
I expect the handler to be called every time after step 4, but sometimes it is not (it seems to get stuck on the same pages, though).
Interestingly, if I wrap steps 3 and 4 inside a setTimeout(..., 5000); it works every time.
I've also tried shuffling some of the steps around, but it did not have any effect.
The bigger picture: I would like to be reliably notified when browser's contentDocument is that of the newly loaded page (after redirects). Is there a better way to do this?
Update: I've since opened a bug on mozilla's bug tracker with a minimal xulrunner app displaying this behavior, in case anybody wants to take a closer look: https://bugzilla.mozilla.org/show_bug.cgi?id=941414

In my experience developing with Firefox, I've found in some cases the initialization code for various elements acts as if it were asynchronous. In other words, when you're done executing
var newBrowser = window.document.createElement('browser');
newBrowser.setAttribute('flex', '1');
newBrowser.setAttribute('type', 'content');
cacheFrame.insertBefore(newBrowser, null);
your browser may not actually be ready yet. When you add the delay, things have time to initialize, so they work fine. Additionally, when you do things like dynamically creating browser elements, you're likely doing something that very few have tried before. In other words, this sounds like a bug in Firefox, and probably one that will not get much attention.
You say you're using onLocationChange so that you can know when to add a load listener. I'm going to guess that you're adding the load listener to the contentDocument since you mentioned it. What you can do instead is add the load listener to the browser itself, much like you would with an iframe. If I replace
newBrowser.addProgressListener(listener);
with
newBrowser.addEventListener("load", function(e) {
  console.log('got here! ' + e.target.contentDocument.location.href);
}, false);
then I receive notifications for each browser.

Related

How to detect DOM update completion after AJAX page load in a Chrome Extension?

I'm trying to identify roughly when the DOM is finished updating after a page is loaded via AJAX on any arbitrary website.
My current method first listens for the chrome.webNavigation.onHistoryStateUpdated event in a background script, then executes a content script in which a MutationObserver detects changes to the website's body. From there, unfortunately, it seems like it's a bit more finicky. If I just wait for the first mutation where nodes are added to the DOM, I wind up in many cases (YouTube, to give one example) where the page is still blank. Other more hacky approaches I've considered include things like just using setTimeout or waiting for the page to reach a certain length, but those seem clearly wide open to exception cases.
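In code, my current method looks roughly like this (a sketch; the content.js file name, the "webNavigation" permission, and the 500 ms quiet period stand in for the real values):
// background.js (requires the "webNavigation" permission)
chrome.webNavigation.onHistoryStateUpdated.addListener(function(details) {
  chrome.tabs.executeScript(details.tabId, { file: 'content.js' });
});
// content.js: treat the DOM as settled once no mutations arrive for 500 ms
var quietTimer;
var observer = new MutationObserver(function() {
  clearTimeout(quietTimer);
  quietTimer = setTimeout(function() {
    observer.disconnect();
    // the text content of the page should (roughly) be in place here
  }, 500);
});
observer.observe(document.body, { childList: true, subtree: true });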
Is there a more fool-proof way to detect that the DOM has roughly finished updating? It doesn't necessarily have to be perfectly precise, and erring on the side of triggering late in my use case is better than triggering early. Also it isn't important at all that resources like video and images be fully loaded, just that the text contents of the page are basically in place.
Thanks for your help!

Will dataLayer.push() definitely send data to Google when triggered on an anchor?

This may seem like a simple question, but it doesn't seem to be answered anywhere that I can find.
I am writing an onClick event handler that simply calls dataLayer.push() when an anchor is clicked.
Is dataLayer.push() a synchronous operation?
Will the GET request to google definitely be sent, even though the browser has unloaded the page it was requested from due to the link being followed?
Some browsers show the connection getting cancelled, some show it succeeding.
My question is if the computer is slow, is it possible for the page to get unloaded before the request is sent?
This is why I assume that Google started using the eventCallback property to redirect the user after the link has been followed.
e.g.
https://developers.google.com/tag-manager/enhanced-ecommerce#product-clicks
This source code does not include the click handler, but implies that the onClick event should stop propagation and let the eventCallback function set document.location.
However, as soon as you cancel the event, all its information has gone.
This (in my opinion) is just the wrong way to do it.
e.g.
(CTRL or COMMAND) + Click opens a new tab in browsers. This will not work unless the onClick event handler allows the propagation to continue.
Relying on eventCallback also means that if the Google scripts didn't load, for one of the many reasons they might not (however unlikely), your links don't work. And your site is broken.
So this leaves the correct way to do it: the onClick event handler should allow the event to propagate and return true.
Which also means that dataLayer.push() would need to return only after the GET request was sent for any of this to work properly.
Code example:
NOTE: You will get mixed results in mixed environments.
<a href="...">Link</a>
$(document).on('click', 'a', function(event) {
  // Is dataLayer.push() guaranteed to fire a GET?
  // data set externally
  dataLayer.push(data);
  return true;
});
Is there anyone out there who can guarantee that the GET request will get fired to the Google server?
Have the google developers forgotten something here?
EDIT: Updated title to be more relevant to the question.
dataLayer.push() does not send anything to Google. It pushes objects with key/value pairs onto the dataLayer array. This might contain an event which in turn fires a tag. Whether the tag is sent depends on the setup of the tag, not on the dataLayer.push().
As a consequence, when you write your own click handlers you are yourself responsible for making sure your tags are actually fired.
If you use the built-in click handler you can configure a delay to make sure your tag has time to fire before the link redirects:
Since link clicks usually cause the browser to load a new page and
interrupt any pending HTTP request, you have the option to add a small
delay to allow tags fired by Tag Manager to execute properly before
redirecting to the next page. Checking the “Wait For Tags” option will
delay opening of links until all tags have fired or the specified
timeout has elapsed, whichever comes first.
You should be able to mix both methods (push data on the click, but still use the "native" link click handler for the event).
You can also try to specify "beacon" as the transport method in your Google Analytics tags, on browsers that support this (which I think is only Chrome at the moment) GA will then use the navigator.sendBeacon interface, which sends the data even in case the page unloads.
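For example, with analytics.js directly that might look like this (a sketch; the property ID and event fields are placeholders):
ga('create', 'UA-XXXXX-Y', 'auto');
// prefer navigator.sendBeacon where the browser supports it
ga('set', 'transport', 'beacon');
ga('send', 'event', 'outbound', 'click', url);
In GTM the equivalent would be setting a transport field to "beacon" under "Fields to Set" in the tag configuration.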
You might think that Google's solution is not very elegant (but the simple delay has the advantage that it works for all tags, not just for GA), but they have not "forgotten" the problem.
Also, solutions that combine GA hit callbacks with timeouts that redirect if the callback fails, as proposed e.g. by Simo Ahava, should be doable with GTM, even if they are probably more cumbersome to implement than in plain GA.
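A sketch of that callback-with-timeout pattern (the event name, the selector, and the 1000 ms fallback are illustrative, not Simo Ahava's exact code):
$(document).on('click', 'a.tracked', function(event) {
  event.preventDefault();
  var href = this.href;
  var done = false;
  function navigate() {
    if (done) return; // only redirect once
    done = true;
    document.location = href;
  }
  dataLayer.push({
    event: 'linkClick',
    eventCallback: navigate // GTM calls this once the tags have fired
  });
  setTimeout(navigate, 1000); // fall back if the callback never fires
});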

How can a userscript get notified about AJAX-driven changes on the page?

Given items on a webpage that get changed by AJAX calls over time, how can a userscript get notified about those changes, whenever they occur?
Imagine the Facebook newsfeed. It has 12 items in it when you load the page. Those items are wrapped in <li> tags contained within a <ul>. As you scroll the page down, new <li> chunks of data load into that <ul>.
I'm wondering how a userscript could be notified of such a change.
One idea is to constantly query that <ul>, counting its items, and watching to see if that number gets bigger. Possible, but to catch the change right when it happens it might have to run so often that it's too expensive.
Another idea would be to figure out what scroll position triggers the loading, and to watch for such a change. Less expensive, but very specific.
I'm wondering if there's a third option. Something that would notify me of the change, whenever it happens. I'm not just interested in the feed, but in this concept more generally. Given items on a page that get changed by AJAX calls, how can a userscript get notified about those changes?
Hijack the send method
var oldSend = XMLHttpRequest.prototype.send;
XMLHttpRequest.prototype.send = function() {
  // do what you need; then send the request
  oldSend.apply(this, arguments);
};
I think what you are looking for is the DOMSubtreeModified event.
This works in Firefox, Chrome, and IE >= 9. If you're scripting on Facebook, I'm guessing it's for a Greasemonkey/Chrome extension? If that is the case, this should be okay.
This event is fired on a node whenever a child node is added, removed, or changed.
You can use it with
element.addEventListener("DOMSubtreeModified", handler, useCapture);
but I don't think it works with attachEvent.
Here's some more info on it.
http://help.dottoro.com/ljrmcldi.php
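Applied to the feed example, a minimal sketch might look like this (the #feed selector is a placeholder for however you locate the <ul>):
var feed = document.querySelector('#feed');
feed.addEventListener('DOMSubtreeModified', function() {
  // fires whenever the subtree changes, e.g. new <li> items get inserted
  console.log('feed now has ' + feed.getElementsByTagName('li').length + ' items');
}, false);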
Do you have access to the Ajax calls that are updating the page's contents? Generally the better approach is to attach a callback to the actual Ajax call.
If the requests are being made with jQuery, use $(document).ajaxComplete() or $(document).ajaxSuccess() to trigger your code. These will fire any time a request completes, so when this happens you can check whether the content has changed without it being too expensive.
$(document).ajaxSuccess(function() {
  // check for update and do something
});

Back/next button in dynamically constructed webpage (AJAX)

My question is about using Back and Next buttons (of the browser) on an AJAX (dynamical) webpage.
The solution I came up with myself:
setInterval(function() {
  if (location.hash != hash) {
    hash = location.hash;
    app.url = window.location.href.toString().replace('http://xxxxx.nl/xxxx/#!/', '');
    app.handleURL();
  }
}, 500);
This function reads the URL hash and compares it with the last stored hash every 0.5 seconds. If the hash has changed (back/next was pushed), it runs handleURL(), which runs more functions to dynamically build my page.
The problem is, this sort of works, BUT when I click an HTML <a> element or change the URL in another way (JavaScript), that content will be loaded TWICE because of the setInterval() polling.
How can I build my HTML/JavaScript in such a way that my content will always be loaded exactly once:
- once when I push back/next
- once when I click on an HTML element/use JavaScript functions at runtime
I searched the sh*t out of google for a solution, plz help!
You don't need a timer to check it. Just use the onhashchange event, and fire your AJAX calls when the event is called. This event isn't supported in IE versions below 8, though, so your method seems fine if you need IE support.
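For example, a minimal replacement for the poller (reusing the app object and URL prefix from the question):
window.addEventListener('hashchange', function() {
  app.url = window.location.href.toString().replace('http://xxxxx.nl/xxxx/#!/', '');
  app.handleURL();
}, false);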
Also, it doesn't make sense that the content is being loaded twice for <a> elements, since there's no reason for the interval to call your AJAX loader twice just because the hash was changed via an <a> element. You probably have an event listener attached to the <a> element which causes it to load the AJAX content; that wouldn't be needed, since you're already detecting any change in the hash, no matter how it was changed.
I suggest using a library for that. It will be tricky to make your own solution. Take a look at these:
http://www.asual.com/jquery/address/docs/#sample-usage
http://benalman.com/projects/jquery-bbq-plugin/

jQuery domready and window ready randomly working?

I've noticed lately that sometimes the domready and window.load handlers do not run. They seem to work randomly when entering the page and/or refreshing.
Say I have:
$(function(){
  $('.hide').hide();
  // disable html5 native validation to let jquery handle
  $('form').attr('novalidate','novalidate');
});
$(window).load(function(){
  $('.input').click(function(){
    $(this).animate({opacity:0.8});
  }).blur(function(){
    $(this).animate({opacity:1});
  });
});
Sometimes when I load the page, the element is not getting hidden; sometimes it is. The input fields will animate, sometimes not, and the two don't necessarily fail together. If I refresh the page a few times, it will work.
I always thought that domready executes as soon as the DOM is ready, and window.load waits until everything on the page is loaded? Or is this more bugs from HTML5?
Question is: am I missing something or just misunderstanding something?
Edit: Notably Chromium. I am on Ubuntu, so I would not be surprised if it was a chromium bug.
Be aware that if you have a very complex HTML structure, it may delay the time it takes for the DOM to become ready. The browser probably tries to render the page as quickly as it possibly can, and with a really complex page it's possible that rendering will begin and the domready event will fire, but the browser renders visible content before the specific code that you set up gets a chance to run.
A block in jQuery domready happens as fast as it can, but if you put, say:
setTimeout(function(){
  $(document).ready(function(){ alert('finally'); });
}, 9000);
that "as fast as it can" is still going to be limited by where the code occurs, in this case after a 9-second timeout.
