W3C specifies a list of events and their corresponding timings that user agents must expose if they want to support the Navigation Timing API.
A list you can see here: http://www.w3.org/TR/navigation-timing/#process
Understanding which process relates to which event is straightforward in most cases. But one thing that eludes me is what goes on between domContentLoadedEventStart and domContentLoadedEventEnd.
Here is what I have understood so far and base my reflections on:
domLoading         // The UA starts parsing the document.
domInteractive     // The UA has finished parsing the document. Users can interact with the page.
domContentLoaded   // The document has been completely loaded and parsed and deferred scripts, if any, have executed. (Async scripts, if any, might or might not have executed???)
domComplete        // The DOM tree is completely built. Async scripts, if any, have executed.
loadEventEnd       // The UA has a fully completed page. All resources, like images, swf, etc., have loaded.
One should be able to deduce what happens after phase #3 (domContentLoaded) by understanding what triggered event #4 (domComplete) but did not trigger previous events.
So one would think that “Async scripts, if any, have executed” means that asynchronous scripts get executed after phase #3 but before event #4. But according to my tests, this is not what happens, unless my test is wrong. (I tried to replicate my test on JSFiddle, but I can’t make the deferred/async script work since there is no way to add attributes to external scripts.)
So my question is: What process(es) takes place between domContentLoadedEventStart and domContentLoadedEventEnd?
Those timings have to do with the DOMContentLoaded event. It's analogous to the load event, which has loadEventStart and loadEventEnd; here the event in question is DOMContentLoaded.
For example, adding a DOMContentLoaded listener that runs some code should give you different start and end times.
document.addEventListener("DOMContentLoaded", function(event) {
  var j = 0;
  for (var i = 0; i < 10000000; i++) {
    j = i;
  }
});
Once that event has run, the Navigation Timing API will report different start and end timestamps, with the gap depending on how long your listener(s) take to run.
From the W3C documentation you pointed out, I believe there are no other processes going on with these timings.
domContentLoadedEventStart attribute
This attribute must return the time immediately before the user agent fires the DOMContentLoaded event at the Document.
domContentLoadedEventEnd attribute
This attribute must return the time immediately after the document's DOMContentLoaded event completes.
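Putting the two attributes together, the time spent in DOMContentLoaded handlers is simply the difference between them. A minimal sketch (dclHandlerDuration is a name I've made up; it accepts any object shaped like PerformanceTiming):

```javascript
// Hypothetical helper: given a PerformanceTiming-like object, report how
// long the DOMContentLoaded handlers took, in milliseconds.
function dclHandlerDuration(timing) {
  return timing.domContentLoadedEventEnd - timing.domContentLoadedEventStart;
}

// In a browser you would pass the real timing object:
// dclHandlerDuration(performance.timing);
```

If that number is large, it points at slow DOMContentLoaded listeners rather than at parsing or resource loading.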
Related
I've inherited a codebase where the order in which JS executes is not clear, since there are a lot of setTimeout calls, globals, and broken Promise chains. Rather than manually trace every execution path, I'd like to capture what JS gets scheduled for execution on the browser's message queue over a time period, or in response to an event.
I can see Event Listeners and trace from when one fires, but this is proving too slow in my case. A single click can sprawl out into several scheduled scripts that each mutate a shared state. This is why I am not considering tracing from event handlers and am instead looking for an overarching timeline for all JS in the application.
Given that JS scripts are scheduled for execution, how can I see the order in which JS gets queued?
I've started with something like this, but this doesn't give me a fully reliable timeline.
const {
  setTimeout,
  setInterval,
} = window;

window._jsq = [];
window._record = f => {
  window._jsq.push([f, new Error().stack]);
};
window.setTimeout = (...a) => {
  window._record(a[0]);
  return setTimeout.apply(window, a);
};
window.setInterval = (...a) => {
  window._record(a[0]);
  return setInterval.apply(window, a);
};
I'll take a crack at my own question from the angle of the OP snippet. Corrections appreciated.
Assuming you cannot see the message queue (or at least the scripts queued), you can still see the code that is scheduling other JS and the code that is scheduled to run. So, tracking both independently is possible.
This is not all good news, because you still have legwork to do to 1) adapt that tracking to the various ways JS can get scheduled, and 2) make sense of what you capture.
In the setTimeout case, something quick and dirty like this can at least provide a sense of a scheduling timeline and when things actually happened. That's just a matter of wrapping functions.
const { setTimeout } = window;

// For visibility in DevTools console
window._schedulers = [];
window._calls = [];

const wrap = f => {
  const { stack } = new Error();
  window._schedulers.push([stack, f]);
  return (...a) => {
    window._calls.push([stack, f, a]);
    return f(...a);
  };
};

window.setTimeout = (f, delay, ...a) => {
  return setTimeout.apply(window, [wrap(f), delay].concat(a));
};
Still, that's just one case; it says nothing about when to start/stop monitoring, or about the potential trigger points where traceability is a concern, as Mosè Raguzzini mentioned. For Promises, this answer calls out Bluebird's checking facilities.
It seems that until more native tools come out that visualize queued scripts and related info, you are stuck collecting and analyzing the data by hand.
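In the meantime, the same wrapping idea can be extended to Promise callbacks. A sketch, assuming patching the global Promise.prototype is acceptable in your app (microtaskLog and originalThen are names I've made up):

```javascript
// Record every callback scheduled via .then(), alongside the stack that
// scheduled it, mirroring the question's setTimeout-recording snippet.
const originalThen = Promise.prototype.then;
const microtaskLog = []; // entries: [scheduling stack, scheduled callback]

Promise.prototype.then = function (onFulfilled, onRejected) {
  microtaskLog.push([new Error().stack, onFulfilled]);
  return originalThen.call(this, onFulfilled, onRejected);
};
```

Explicit .then() and .catch() calls funnel through this patch (the spec defines catch in terms of then), but engine-internal paths such as await on native promises may bypass it, so treat the log as a partial picture.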
There is no built-in automatic debugging tool for monitoring your browser event loop.
In order to monitor the browser's event loop you have to explicitly monitor the events you are interested in and pass them to the (in this case Chrome's) DevTools:
monitorEvents(document.body, "click");
More info about monitoring events in Chrome Dev Tools
Note #1: You don't know how custom events are called. They may not dispatch an event into the DOM (e.g. some libraries implement their own event registration and handling systems) so there is no general way of knowing when event listeners are being called, even if you can track the dispatch of the event.
Some libraries also simulate event bubbling, but again, unless you know the type of event, you can't listen for it.
However, you could implement your own event management system and implement a function to listen for all events for which listeners are set or events dispatched using your system.
Ref: How can I monitor all custom events emitted in the browser?
Note #2: a modern JS approach to events (e.g. React/Redux) involves dispatching actions instead of events. As actions are often logged for time-travel debugging, monitoring events in that case is unnecessary.
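For standard DOM events, though, one blunt instrument is to wrap EventTarget.prototype.dispatchEvent. A sketch (dispatchLog is a made-up name; as Note #1 says, libraries with their own pub/sub systems will bypass this entirely):

```javascript
// Log everything dispatched through the standard EventTarget interface.
const originalDispatch = EventTarget.prototype.dispatchEvent;
const dispatchLog = [];

EventTarget.prototype.dispatchEvent = function (event) {
  dispatchLog.push({ type: event.type, target: this });
  return originalDispatch.call(this, event);
};
```

This catches programmatic dispatches (element.dispatchEvent(new CustomEvent('foo'))) but not native user-initiated events like real clicks, which the browser delivers without going through dispatchEvent.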
I was under the impression that all DOM manipulations were synchronous.
However, this code is not running as I expect it to.
RecordManager.prototype._instantiateNewRecord = function(node) {
  this.beginLoad();
  var new_record = new Record(node.data.fields, this);
  this.endLoad();
};

RecordManager.prototype.beginLoad = function() {
  $(this.loader).removeClass('hidden');
};

RecordManager.prototype.endLoad = function() {
  $(this.loader).addClass('hidden');
};
The Record constructor function is very large and it involves instantiating a whole bunch of Field objects, each of which instantiates some other objects of their own.
This results in a 1-2 second delay and I want to have a loading icon during this delay, so it doesn't just look like the page froze.
I expect the flow of events to be:
show loading icon
perform record instantiation operation
hide loading icon
Except the flow ends up being:
perform record instantiation operation
show loading icon
hide loading icon
So you never even see the loading icon at all; I only know it's loading briefly because the updates in the Chrome DevTools DOM viewer lag behind a little bit.
Should I be expecting this behavior from my code? If so, why?
Yes, this is to be expected. Although the DOM may have been updated, you won't see anything until the browser has a chance to repaint. The repaint gets queued the same way as everything else in the browser (i.e. it won't happen until the current block of JavaScript has finished executing), though pausing in a debugger will generally allow it to happen.
In your case, you can fix it using setTimeout with an immediate timeout:
RecordManager.prototype._instantiateNewRecord = function(node) {
  var self = this; // capture `this`; a plain function passed to setTimeout loses it
  this.beginLoad();
  setTimeout(function() {
    var new_record = new Record(node.data.fields, self);
    self.endLoad();
  }, 0);
};
This will allow the repaint to happen before executing the next part of your code.
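The same idea can be packaged as a tiny helper so the calling code reads top to bottom (a sketch; yieldToBrowser is a made-up name):

```javascript
// Resolve on the next macrotask, after the browser has had a chance to
// run the queued repaint.
function yieldToBrowser() {
  return new Promise(function (resolve) {
    setTimeout(resolve, 0);
  });
}

// Hypothetical usage inside the method from the question:
// RecordManager.prototype._instantiateNewRecord = async function (node) {
//   this.beginLoad();
//   await yieldToBrowser();               // loader is painted here
//   var new_record = new Record(node.data.fields, this);
//   this.endLoad();
// };
```

A nice side effect of the async/await form is that `this` is preserved across the await, so the binding pitfall of a plain setTimeout callback doesn't arise.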
JavaScript is single-threaded. It mimics multi-threaded behavior when it comes to Ajax calls and timers, but when a callback eventually runs, it blocks as usual.
That said, you most likely have a setTimeout in that constructor somewhere (or a method you're using does), even if it's setTimeout(fn, 0).
This is a very simple use case: show an element (a loader), run some heavy calculations that tie up the thread, and hide the loader when done. I am unable to get the loader to actually show up before the long-running process starts; it ends up showing and hiding after the long-running process. Is adding CSS classes an asynchronous operation?
See my jsbin here:
http://jsbin.com/voreximapewo/12/edit?html,css,js,output
To explain what a few others have pointed out: This is due to how the browser queues the things that it needs to do (i.e. run JS, respond to UI events, update/repaint how the page looks etc.). When a JS function runs, it prevents all those other things from happening until the function returns.
Take for example:
function work() {
  var arr = [];
  for (var i = 0; i < 10000; i++) {
    arr.push(i);
    arr.join(',');
  }
  document.getElementsByTagName('div')[0].innerHTML = "done";
}

document.getElementsByTagName('button')[0].onclick = function() {
  document.getElementsByTagName('div')[0].innerHTML = "thinking...";
  work();
};
(http://jsfiddle.net/7bpzuLmp/)
Clicking the button here will change the innerHTML of the div, and then call work, which should take a second or two. And although the div's innerHTML has changed, the browser doesn't get a chance to update how the actual page looks until the event handler has returned, which means waiting for work to finish. But by that time, the div's innerHTML has changed again, so when the browser does get a chance to repaint the page, it simply displays 'done' without ever displaying 'thinking...'.
We can, however, do this:
document.getElementsByTagName('button')[0].onclick = function() {
  document.getElementsByTagName('div')[0].innerHTML = "thinking...";
  setTimeout(work, 1);
};
(http://jsfiddle.net/7bpzuLmp/1/)
setTimeout works by putting a call to a given function at the back of the browser's queue after the given time has elapsed. The fact that it's placed at the back of the queue means that it'll be called after the browser has repainted the page (since the previous HTML changing statement would've queued up a repaint before setTimeout added work to the queue), and therefore the browser has had chance to display 'thinking...' before starting the time consuming work.
So, basically, use setTimeout.
Let the current frame render and start the process after setTimeout(1).
Alternatively, you could read a layout property to force the browser to flush pending layout, like this: element.clientWidth.
More of a "what is possible" answer: you can run your calculations on a separate thread using HTML5 Web Workers.
This will not only let your loading icon appear, it will also keep animating while the work runs.
More info about web workers : http://www.html5rocks.com/en/tutorials/workers/basics/
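A minimal inline-worker sketch along those lines (assumes a browser environment with Worker, Blob, and URL.createObjectURL; the loop and message shape are placeholders for your real calculation):

```javascript
// Worker body as a string so no separate file is needed: it receives n
// and posts back the sum 0 + 1 + ... + (n - 1).
var workerSource =
  'onmessage = function (e) {' +
  '  var total = 0;' +
  '  for (var i = 0; i < e.data; i++) total += i;' +
  '  postMessage(total);' +
  '};';

// The same computation, for environments without Web Workers.
function sumUpTo(n) {
  var total = 0;
  for (var i = 0; i < n; i++) total += i;
  return total;
}

if (typeof Worker !== 'undefined') {
  var blob = new Blob([workerSource], { type: 'application/javascript' });
  var worker = new Worker(URL.createObjectURL(blob));
  worker.onmessage = function (e) {
    // Hide the loader here; e.data holds the result.
  };
  worker.postMessage(100000000); // heavy work starts off the main thread
}
```

Because the loop runs in the worker, the main thread stays free to paint and animate the loader; the only coordination cost is the postMessage round trip.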
I will try to explain my actual setup, the idea behind it, what breaks, what I've tried around it.
The context
I have a PHP5.3 backend feeding "events" (an event being a standard array containing some data, among which a unique sequential number) to Javascript (with jQuery 1.7.x). The events are retrieved using jsonp (on a subdomain) and long-polling on the server side. The first event has the id 1, and then it increments with each new event. The client keeps track of the "last retrieved event id", and that value starts at 0. With each long-polling request, it provides that id so the backend only returns events that occurred after that one.
Events are processed in the following manner: upon being received (through the jsonp callback), they are stored in an eventQueue variable, and the "last retrieved event id" is updated to that of the last event received and stored in the queue. Then a function is called that processes the next queued event.

That function checks whether an event is already being processed (by means of another variable that is set whenever an event starts being processed). If one is, it does nothing, and the call stack brings us back to the jsonp callback, where a new long-polling request is emitted. (That will repeat the process of queueing new events while the others are processed.) If no event is currently being processed, it verifies whether there are events left in the queue, and if so it processes the first one (the one with the lowest id). "Processing an event" can mean various tasks pertinent to my application, but not to the problem I have or to this context: for example, updating a variable, a message on the page, etc.

Once an event is deemed done being processed (some events make an ajax call to get or send data, in which case this happens in their ajax success callback), a call to another function called eventComplete is made. That function deletes the processed event from the event queue, makes sure the variable that tracks whether an event is being processed is set back to false, and then calls the function that processes the event queue (so it processes the next, lowest-id, event).
The problem
This works really well, on all tested major browsers too (Internet Explorer 8 and 9, Chrome, Opera, Firefox). It is also very snappy thanks to the long polling, and it's really nice to get the whole "history" (most events generate textual data that gets appended to a sort of console on the page) of what has happened and end up in the exact same application state, even after reloading the page.

However, this becomes problematic when the number of events gets high. Based on estimates, I would need to handle as many as 30,000 events. In my tests, things start to go awry well before that: Internet Explorer 8 stack overflows around 400 events; Chrome doesn't load all 7,000 events, but gets close (and breaks, though not always at the same point, unlike IE8); IE9 and Firefox handle everything well and hang 2-3 seconds while all events are processed, which is tolerable. I'm thinking, however, that it might just be a matter of a few more events before they break as well.

Am I being too demanding of current web browsers, or is there something I got wrong? Is there a way around it? Is my whole model just wrong?
Possible solutions
I fiddled around with some ideas, none of which really worked. I tried forcing the backend to not output more than 200 events at a time and issuing the new poll request only after the whole current queue was done processing; still got a stack overflow. I also tried deleting the eventQueue object after it's done processing (even though it is empty then) and recreating it, in the hope that it would free some underlying memory or something. I'm short on ideas, so any idea, pointer, or general advice would be really appreciated.
Edit:
I had an enlightenment! I think I know exactly why all of this is happening (but I'm still unsure on how to approach it and fix it), I will provide some basic code excerpts too.
var eventQueue = new Object();
var processingEvent = false;
var lastRetrievedEventId = 0;
var currentEventId = 0;

function sendPoll() {
  // Standard jsonp request (to an intentionally slow backend, i.e. long-polling),
  // callback set to pollCallback(). Provide currentEventId to the server to only get
  // the events starting from that point.
}

function pollCallback( data ) {
  // Make sure the data isn't empty; this happens if the jsonp request
  // expires (30s in my case) without getting any new data.
  if( !jQuery.isEmptyObject( data ) )
  {
    // Add each new event to the event queue.
    $.each(data.events, function() {
      eventQueue[ this.id ] = this;
      lastRetrievedEventId = this.id; // Since we just put the event in the queue, we know it is officially the last one "retrieved".
    });
    // Process the next event, we know there has to be at least one in queue!
    processNextEvent();
  }
  // Go look for new events!
  sendPoll();
}

function processNextEvent() {
  // Do not process events if one is currently being processed; that would happen
  // when an event contains an asynchronous function, like an AJAX call.
  if( !processingEvent )
  {
    var nextEventId = currentEventId + 1;
    // Before accessing it directly, make sure the "next event" is in the queue.
    if( Object.prototype.hasOwnProperty.call(eventQueue, nextEventId) )
    {
      processingEvent = true;
      processEvent( eventQueue[ nextEventId ] );
    }
  }
}

function processEvent( event ) {
  // Do different actions based on the event type.
  switch( event.eventType ) {
    case SOME_TYPE:
      // Do stuff pertaining to SOME_TYPE.
      eventComplete( event );
      break;
    case SOME_OTHER_TYPE:
      // Do stuff pertaining to SOME_OTHER_TYPE.
      eventComplete( event );
      break;
    // Etc. Many more cases here. If there is an AJAX call,
    // the eventComplete( event ) is placed in the success: callback
    // of that AJAX call; I do not want events to be processed in the wrong order.
  }
}

function eventComplete( event ) {
  // The event has completed, time to process the event after it.
  currentEventId = event.id; // Since it was fully processed, it is now the most current event.
  delete eventQueue[ event.id ]; // It was fully processed, we don't need it anymore.
  processingEvent = false;
  processNextEvent(); // Process the next event in queue. Most likely the source of all my woes.
}

function myApplicationIsReady() {
  // The DOM is fully loaded, my application has initiated all its data and variables;
  // start the long polling.
  sendPoll();
}

$(function() {
  // Initializing my application.
  myApplicationIsReady();
});
After looking at things, I understood why the call stack gets full with many events. For example (where "->" means "calls"):
myApplicationIsReady() -> sendPoll()
And then when getting the data:
pollCallback() -> [ processNextEvent() -> processEvent() -> eventComplete() -> processNextEvent() ]
The part in brackets is the one that loops and causes the call-stack overflow. It doesn't happen with a small number of events, because then it does this:
pollCallback() -> processNextEvent() -> processEvent() -> eventComplete() -> sendPoll()
That would be with two events, where the first one contains an asynchronous call. (So it gets to the second event, which doesn't get processed because the first one isn't done processing; instead it calls the polling function, which frees the whole call stack, and eventually the callback from that request resumes the activity.)
Now it is not easy to fix and it was designed like that in the first place, because:
I do not want to lose events (As in, I want to make sure all events are processed).
I do not want to hang the browser (I can't use synchronous AJAX calls or an empty loop waiting for something to finish).
I absolutely want events to get processed in the right order.
I do not want for events to get stuck in the queue and the application not processing them anymore.
That is where I need help now! To do what I want, it sounds like I need to use chaining, but that is exactly what is causing my call-stack issues. Perhaps there is a better chaining structure that lets me do all that without going infinitely deep in the call stack, and I might have overlooked it. Thank you again in advance; I feel like I'm making progress!
How about instead of calling functions recursively, use setTimeout(func, 0)?
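Applied to the question's structure, that change is small: only the recursive call inside eventComplete moves into a zero-delay setTimeout, so each event starts on a fresh, unwound call stack. A simplified sketch (queue/processed/processNext are stand-in names for the question's eventQueue machinery):

```javascript
var queue = ['a', 'b', 'c'];
var processed = [];

function processNext() {
  if (queue.length === 0) return;
  processed.push(queue.shift()); // "process" the event

  // Instead of calling processNext() directly (which grows the stack with
  // every event), schedule it: the current stack fully unwinds before the
  // next event runs, so 30,000 events never overflow.
  setTimeout(processNext, 0);
}

processNext(); // handles 'a' immediately; 'b' and 'c' run on later ticks
```

Ordering is preserved because only one continuation is ever scheduled at a time; the trade-off is a small per-event delay imposed by the browser's minimum timeout clamping.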
When looking to improve a page's performance, one technique I haven't heard mentioned before is using setTimeout to prevent javascript from holding up the rendering of a page.
For example, imagine we have a particularly time-consuming piece of jQuery inline with the html:
$('input').click(function () {
  // Do stuff
});
If this code is inline, we are holding up the perceived completion of the page while this piece of jQuery is busy attaching a click handler to every input on the page.
Would it be wise to spawn a new thread instead:
setTimeout(function() {
  $('input').click(function () {
    // Do stuff
  });
}, 100);
The only downside I can see is that there is now a greater chance the user clicks on an element before the click handler is attached. However, this risk may be acceptable and we have a degree of this risk anyway, even without setTimeout.
Am I right, or am I wrong?
The actual technique is to use setTimeout with a time of 0.
This works because JavaScript is single-threaded. A timeout doesn't cause the browser to spawn another thread, nor does it guarantee that the code will execute in the specified time. However, the code will be executed when both:
The specified time has elapsed.
Execution control is handed back to the browser.
Therefore calling setTimeout with a time of 0 can be considered as temporarily yielding to the browser.
This means if you have long running code, you can simulate multi-threading by regularly yielding with a setTimeout. Your code may look something like this:
var batches = [...]; // Some array
var currentBatch = 0;

// Start long-running code, whenever browser is ready
setTimeout(doBatch, 0);

function doBatch() {
  if (currentBatch < batches.length) {
    // Do stuff with batches[currentBatch]
    currentBatch++;
    setTimeout(doBatch, 0);
  }
}
Note: While it's useful to know this technique in some scenarios, I highly doubt you will need it in the situation you describe (assigning event handlers on DOM ready). If performance is indeed an issue, I would suggest looking into ways of improving the real performance by tweaking the selector.
For example if you only have one form on the page which contains <input>s, then give the <form> an ID, and use $('#someId input').
setTimeout() can be used to improve the "perceived" load time, but not the way you've shown it. Using setTimeout() does not cause your code to run in a separate thread. Instead setTimeout() simply yields the thread back to the browser for (approximately) the specified amount of time. When it's time for your function to run, the browser will yield the thread back to the JavaScript engine. In JavaScript there is never more than one thread (unless you're using something like Web Workers).
So, if you want to use setTimeout() to improve performance during a computation-intensive task, you must break that task into smaller chunks, and execute them in-order, chaining them together using setTimeout(). Something like this works well:
function runTasks( tasks, idx ) {
  idx = idx || 0;
  tasks[idx++]();
  if( idx < tasks.length ) {
    setTimeout( function(){ runTasks(tasks, idx); }, 1 );
  }
}

runTasks([
  function() {
    /* do first part */
  },
  function() {
    /* do next part */
  },
  function() {
    /* do final part */
  }
]);
Note:
The functions are executed in order. There can be as many as you need.
When the first function returns, the next one is called via setTimeout().
The timeout value I've used is 1. This is sufficient to cause a yield, and the browser will take the thread if it needs it, or allow the next task to proceed if there's time. You can experiment with other values if you feel the need, but usually 1 is what you want for these purposes.
You are correct, there is a greater chance of a "missed" click, but with a low timeout value it's pretty unlikely.