So I want to migrate all my JavaScript functions to RequireJS and rely as much as possible on the domReady event. BUT:
this freezes the browser, since all JavaScript runs synchronously. This is bad. Sure, the user sees the page content a bit sooner, but then tries to click somewhere only to find the browser frozen, and has to click again. This is very bad. Is there a way around this?
Patient: It hurts when I do this.
Doctor: Then don't do that.
If you see freezing on the dom ready event then perhaps you are trying to do too much. Executing javascript is quick. Doing a page redraw is slow.
Rather than lots of small events that each change the DOM and each cause a page redraw, you should have one function that processes a list of changes that need to be made. This is what the domReady plugin does before the ready event; after the ready event it just runs each callback as it receives it, which can cause multiple redraws.
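The batching idea can be sketched like this (a minimal illustration; `queueChange` and `flushChanges` are hypothetical names, not part of the domReady plugin):

```javascript
// Hypothetical batching helper: collect DOM changes, then apply them all
// in one pass so the browser can settle with a single reflow instead of
// one reflow per change.
var pendingChanges = [];

function queueChange(fn) {
  pendingChanges.push(fn);
}

function flushChanges() {
  var changes = pendingChanges;
  pendingChanges = [];
  changes.forEach(function (change) {
    change();
  });
}
```

Queue changes as they come in, then call flushChanges once (for example, from a single ready handler).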
I learnt this while writing my own animation library. I was using individual setInterval()s to change a single property. When running more than four at once, the animation was no longer smooth. A better way is a single interval that processes a list of changes that need to be made.
Edit:
Instead of using domReady as a plugin (require(["domReady!"], ...)), use it as a module so that you can run initialisation code straight away and then make changes to the DOM later:
require(["domReady"], function(domReady) {
    var element = document.createElement('table');
    // more setup code
    domReady(function() {
        document.body.appendChild(element);
    });
});
Related
I'm trying to identify roughly when the DOM is finished updating after a page is loaded via AJAX on any arbitrary website.
My current method first listens for the chrome.webNavigation.onHistoryStateUpdated event in a background script, then executes a content script in which a MutationObserver detects changes to the website's body. From there, unfortunately, it seems like it's a bit more finicky. If I just wait for the first mutation where nodes are added to the DOM, I wind up in many cases (YouTube, to give one example) where the page is still blank. Other more hacky approaches I've considered include things like just using setTimeout or waiting for the page to reach a certain length, but those seem clearly wide open to exception cases.
Is there a more fool-proof way to detect that the DOM has roughly finished updating? It doesn't necessarily have to be perfectly precise, and erring on the side of triggering late in my use case is better than triggering early. Also it isn't important at all that resources like video and images be fully loaded, just that the text contents of the page are basically in place.
Thanks for your help!
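One common pattern for "the DOM has stopped changing" is to debounce the MutationObserver callback: every mutation resets a timer, and once no mutations arrive for a quiet period, the page is treated as settled. This is a hedged sketch, not code from the question; `watchForSettle` and the 500 ms quiet window are arbitrary choices.

```javascript
// Debounce helper: returns a function that runs `fn` only after `wait` ms
// have passed with no further calls.
function debounce(fn, wait) {
  var timer = null;
  return function () {
    clearTimeout(timer);
    timer = setTimeout(fn, wait);
  };
}

// Browser-only sketch: treat the DOM as "settled" once no mutations have
// occurred for 500 ms (tune the window for the sites you care about).
function watchForSettle(onSettled) {
  var settled = debounce(function () {
    observer.disconnect();
    onSettled();
  }, 500);
  var observer = new MutationObserver(settled);
  observer.observe(document.body, {
    childList: true,
    subtree: true,
    characterData: true
  });
  settled(); // start the timer in case the page is already quiet
}
```

Erring late, as the question prefers, just means picking a longer quiet window.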
Is JavaScript intended to run as little as possible on a website/webapp? By that I mean: is the usual intention to run through all your JS files as soon as the page loads, set the functions aside, and then when a function is needed, execute it right away and be done with it?
I'm working on a project using google maps and I have a custom marker object scripted out, and a debugger has told me that the browser runs through all my js files before anything even appears on the page.
My problem comes in here: I wanted to animate certain markers to bounce up and down continuously with jQuery (similar to OS X icons in the dock) and my several attempts at infinite loop functions all just crash the browser. So I understand that the browser doesn't like that, but is there a way to have a simple script be repeating itself in the background while the user navigates the page? Or is JavaScript just not supposed to be used that way?
(I worked with Flash for a long time so my mindset is still there.)
Yes, Javascript functions should just do their bit and exit as soon as possible. The GUI and the scripts run on the same single thread, so as long as you are inside a Javascript function, nothing shows up in the browser. If you try to use an infinite loop, the browser will appear to freeze.
You use the window.setInterval and window.setTimeout methods to trigger code that runs at a specific time. By running an interval that updates something several times a second, you can create an animation.
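A minimal sketch of that pattern (the names `step` and `startBounce` are illustrative): the per-frame logic is kept in a small pure function, and the interval just calls it and paints the result.

```javascript
// Pure step function: advance a position and flip direction at the bounds.
// Keeping it free of DOM access makes it easy to reason about and test.
function step(state) {
  var next = state.pos + state.dir * state.speed;
  if (next >= state.max || next <= 0) {
    state.dir = -state.dir; // bounce off the edges
    next = Math.max(0, Math.min(state.max, next));
  }
  state.pos = next;
  return state;
}

// Browser sketch: run the step ~30 times a second and paint the result.
// Each tick does a tiny bit of work and returns, so the GUI stays responsive.
function startBounce(element) {
  var state = { pos: 0, dir: 1, speed: 2, max: 20 };
  return setInterval(function () {
    step(state);
    element.style.top = -state.pos + "px";
  }, 33);
}
```

Stop the animation later with clearInterval on the returned id.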
You have to set a timer to execute a script after a defined time.
var timer = setTimeout(myFunction, milliseconds);
will call myFunction after roughly that many milliseconds (pass a function reference rather than a string of code). Each execution of the script can set a new timer to execute the script again.
You can cancel a timed event using clearTimeout(timer).
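Putting those two pieces together, the self-rescheduling pattern can be sketched as (`repeat` is an illustrative helper, not a built-in):

```javascript
// Self-rescheduling timer: each run schedules the next, so only one timeout
// is ever pending and the whole thing can be stopped cleanly.
function repeat(fn, delay) {
  var timer = null;
  function tick() {
    fn();
    timer = setTimeout(tick, delay); // schedule the next run after this one finishes
  }
  timer = setTimeout(tick, delay);
  return {
    stop: function () { clearTimeout(timer); }
  };
}
```

Usage: var handle = repeat(update, 100); ... handle.stop();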
Use setTimeout() or setInterval(). The MDC articles on it are pretty good.
You'll need to update inside of functions that run quickly, but get called many times, instead of updating inside of a loop.
Since you said that you are using jQuery, consider using its effects API (e.g., jQuery.animate()), it will make your life much easier!
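For the bouncing-marker case specifically, jQuery's animate() can re-queue itself from its own completion callback instead of looping. This is a sketch under assumptions: `bounce` is an illustrative name, and the element is assumed to be absolutely positioned so animating `top` moves it.

```javascript
// Bounce an element indefinitely by scheduling the next bounce from the
// completion callback of the previous one -- no loop, so the browser
// never freezes.
function bounce($el) {
  $el.animate({ top: "-=10px" }, 300)
     .animate({ top: "+=10px" }, 300, function () {
       bounce($el); // queue the next bounce once this one finishes
     });
}
```

Call it once, e.g. bounce($('#marker')), and stop it later with $el.stop(true).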
Personally, I save as much code as possible for execution after the page has loaded, partly by putting all my <script>s at the bottom of <body>. This means a (perceived) reduction in page load time, whilst having all my JS ready to run when need be.
I wouldn't recommend going through everything you need to do at the beginning of the document. Instead, bind things to events such as clicks of buttons, etc.
I've noticed lately that sometimes the domready and window.load handlers do not run. They seem to work randomly when entering the page and/or refreshing.
Say I have:
$(function(){
    $('.hide').hide();
    // disable html5 native validation to let jquery handle it
    $('form').attr('novalidate','novalidate');
});

$(window).load(function(){
    $('.input').click(function(){
        $(this).animate({opacity:0.8});
    }).blur(function(){
        $(this).animate({opacity:1});
    });
});
Sometimes when I load the page the element is not hidden, sometimes it is; the input fields will animate, sometimes not, and the two don't necessarily fail together. If I refresh the page a few times, it will work.
I always thought that domready would execute as soon as the DOM is ready, and window.load would wait until everything on the page has fully loaded? Or is this more buggy HTML5 behaviour?
Question is: am I missing something or just misunderstanding something?
Edit: Notably Chromium. I am on Ubuntu, so I would not be surprised if it was a chromium bug.
Be aware that if you have a very complex HTML structure, it may delay the point at which the DOM becomes ready. The browser tries to render the page as quickly as it possibly can, so with a really complex page it's possible that rendering begins and the domready event fires, but the browser paints content before the code you registered actually runs.
A block in jQuery domready happens as fast as it can, but if you put, say:
setTimeout(function(){ $().ready(function(){alert('finally');});}, 9000);
That "as fast as it can" is still going to be limited by where the code occurs, in this case after a 9 second timeout.
I'm probably missing something really obvious here...
I'm showing a dialog box with progress bar during page load. The dialog and progress bar are both jQueryUI widgets. There are a couple of phases of loading - the page makes a load of jQuery $.get() requests to load resources, then on the $(document).ajaxStop() event, does things with those resources. I'm updating the progress bar and some status text throughout this process.
The issue is that as soon as the ajaxStop event fires, updates stop. The code works nicely during resource loading, but then freezes and I don't see any of the updates during processing. If I put a breakpoint on a post-ajaxStop update in Chrome and step through the code, the screen updates correctly so I know that the code works.
Can anyone explain why everything updates nicely during my AJAX loading phase, but then stops on the ajaxStop event? Is there an easy way to make updates continue afterwards?
Thanks!
Several hours of searching later, the following blog pointed me in the right direction:
There's a jQuery extension described in the entry which allows you to define two functions, one to compute and one to update the UI. It schedules them alternately using the setTimeout function.
I've had to rewrite my code in something akin to continuation-passing style, so that each function schedules its continuation to run using setTimeout. This returns control to the browser for long enough for the screen to be updated.
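The rewrite boils down to something like this sketch (`chunkify` and `processInChunks` are illustrative names, not from the blog entry):

```javascript
// Split an array into fixed-size chunks (pure, so it is easy to test).
function chunkify(items, size) {
  var chunks = [];
  for (var i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Process one chunk per timeout so control returns to the browser between
// chunks, giving it a chance to repaint the progress bar.
function processInChunks(items, size, work, done) {
  var chunks = chunkify(items, size);
  function next() {
    if (chunks.length === 0) { done(); return; }
    chunks.shift().forEach(work);
    setTimeout(next, 0); // yield to the browser before the next chunk
  }
  next();
}
```

The chunk size trades throughput against UI responsiveness: bigger chunks finish sooner but block the screen for longer between repaints.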
This feels like a bit of a hack though to get round browser/Javascript limitations. Anyone know of a better way?
How can I have my javascript code constantly run? The situation is that I want some page elements to be resized when the page resizes. I'm thinking that the way to do this would be to have some javascript code that constantly runs, and whenever the page size changes, it resizes that element.
I tried doing this with setTimeout() and a really small value, but it didn't seem to work.
JavaScript is an event-based language: you add event listeners to things, and a function is called when that event occurs. This saves you from having a loop run continuously to check the state of an item.
The window supports onResize in JavaScript, so for example:
window.addEventListener("resize", function(event){
    alert("You just resized the window. If you inspect the event variable you will find some useful details.");
}, false);
http://www.quirksmode.org/dom/events/index.html#t020
You should hook your script to the resize event.
I would look at a framework like jQuery, where you can register a function with the window's resize event (note that resize fires on window, not on body):
$(window).resize(function() { ... });
By running the javascript all the time, you run the risk of really bogging down a cpu (especially on single core systems) and really slowing down the browser.