I'd like to be informed when a MediaStreamTrack ends. According to MDN, an ended event is
Sent when playback of the track ends (when the value of readyState changes to ended).
Also available using the onended event handler property.
So I should be able to set up my callback(s) like this:
const [track] = stream.getVideoTracks();
track.addEventListener('ended', () => console.log('track ended'));
track.onended = () => console.log('track onended');
and I expect those to be invoked once I stop the track via:
tracks.forEach(track => track.stop());
// for good measure? See
// https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamTrack/stop#Stopping_a_video_stream
videoElem.srcObject = null;
The problem I'm having is that the callbacks are not invoked. I built the following JSFiddle, where 3 MediaStreams are created in 3 different ways:
getUserMedia
getDisplayMedia
captureStream (canvas element)
I also have 3 buttons which stop all tracks for the respective MediaStream. The behaviour is as follows:
All 3 streams become inactive, and the MediaStream's oninactive callback is triggered (in Chrome; Firefox doesn't seem to support this).
All tracks have a readyState of ended after being stopped.
If I stop the screen stream (2. getDisplayMedia) via Chrome's UI, the track's ended callback(s) are invoked.
I know that you have to watch out for a source being shared by multiple tracks, but that shouldn't be the case here, right? Am I missing something obvious?
Since multiple tracks may use the same source (for example, if two tabs are using the device's microphone), the source itself isn't necessarily immediately stopped. It is instead disassociated from the track and the track object is stopped. Once no media tracks are using the source, the source may actually be completely stopped.
Why is the 'ended' event not firing for this MediaStreamTrack?
Because ended is explicitly not fired when you call track.stop() yourself. It only fires when a track ends for other reasons. From the spec:
Fired when...
The MediaStreamTrack object's source will no longer provide any data, either because the user revoked the permissions, or because the source device has been ejected, or because the remote peer permanently stopped sending data.
This is by design. The thinking was you don't need an event when you stop it yourself. To work around it do:
track.stop();
track.dispatchEvent(new Event("ended"));
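If you stop tracks in more than one place, it can be convenient to wrap both calls in a small helper. A minimal sketch (the helper name is mine):

function stopStreamTracks(stream) {
  for (const track of stream.getTracks()) {
    track.stop();
    // stop() alone won't fire 'ended', so dispatch it manually
    track.dispatchEvent(new Event("ended"));
  }
}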
the MediaStream's oninactive callback is triggered (in Chrome; Firefox doesn't seem to support this).
stream.oninactive and the inactive event are deprecated, and no longer in the spec.
As a workaround for that, you can use the similar ended event on a media element:
video.srcObject = stream;
await new Promise(resolve => video.onloadedmetadata = resolve);
video.addEventListener("ended", () => console.log("inactive!"));
Alas, that does not appear to work in Chrome yet, but it works in Firefox.
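If you need a cross-browser signal without involving a media element, you can also emulate inactive yourself. A rough sketch; it relies on ended actually firing, so pair it with the dispatchEvent workaround above when you stop tracks yourself:

function onStreamInactive(stream, callback) {
  let done = false;
  const check = () => {
    // Fire the callback once every track on the stream has ended
    if (!done && stream.getTracks().every(t => t.readyState === "ended")) {
      done = true;
      callback();
    }
  };
  stream.getTracks().forEach(t => t.addEventListener("ended", check));
}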
Related
I am making a game with JS and pixi.js, and I am having trouble passing parameters to a function. Here is the code:
newGame()
{
// Some code before, then I get the audio which I want to play
let audio = this.soundsArray[this.shuffleQuestionsInLevel[this.rightAnswer].sound];
// Auto-play the audio at the beginning of the game
this.playSound(audio);
// Click to repeat the sound
this.soundBtn.on('pointerdown', this.playSound.bind(this, audio));
}
// Play audio after 5 seconds
playSound(audio)
{
setTimeout(() => audio.play(), 5000);
}
In the first game everything works perfectly: exactly the right sound is played. However, from the second game onward, the click handler registered via this.soundBtn.on('pointerdown', this.playSound.bind(this, audio)); plays all the sounds from previous games as well: in the 2nd game 2 sounds are played, in the 3rd game 3 sounds are played, and so on.
The code that auto-plays the audio at the beginning, this.playSound(audio), works correctly every time; only the current game's sound is played.
I do not understand why calling the same function with the same parameter works for the auto-play but not for the click event. I want the click event to work exactly like the auto-play. Does anyone know what the problem is? Thank you.
It looks like you are attaching the event handler when you start a game (when you call newGame()), but you are never detaching it:
// This line attaches the handler, but it attaches a new handler every time!
this.soundBtn.on('pointerdown', this.playSound.bind(this, audio));
// To understand why, let's write it more explicitly
//
// First a new "listener" function is created from this.playSound by calling bind
const listener = this.playSound.bind(this, audio);
// Then this function is attached as an event handler
this.soundBtn.on('pointerdown', listener);
// But since listener is not the same function as this.playSound anymore
// (because .bind produces a new function) the following line will not work
// and listener will stay attached
this.soundBtn.off('pointerdown', this.playSound);
In order to fix the problem you will most probably need to store the listener function somewhere so that you can detach it later:
newGame() {
// ...
this.__playSound = this.playSound.bind(this, audio);
this.soundBtn.on('pointerdown', this.__playSound);
}
// And then when the game is over
this.soundBtn.off('pointerdown', this.__playSound);
Or, if the soundBtn supports it, just detach all the pointerdown handlers when a game is over:
this.soundBtn.off('pointerdown');
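Putting it together, the lifecycle could look like this (endGame is a hypothetical hook; put the off call wherever your game actually ends):

newGame() {
  // ... pick the audio as before ...
  this.__playSound = this.playSound.bind(this, audio);
  this.soundBtn.on('pointerdown', this.__playSound);
}

endGame() {
  // Detach before the next newGame() runs, so handlers never accumulate
  this.soundBtn.off('pointerdown', this.__playSound);
  this.__playSound = null;
}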
I've inherited a codebase where the order in which JS executes is not clear since there's a lot of setTimeout calls, globals, and broken Promise chains. Rather than manually trace every execution path I'd like to capture what JS gets scheduled for execution on the browser's message queue over a time period, or in response to an event.
I can see Event Listeners and trace from when one fires, but this is proving too slow in my case. A single click can sprawl out into several scheduled scripts that each mutate a shared state. This is why I am not considering tracing from event handlers and am instead looking for an overarching timeline for all JS in the application.
Given that JS scripts are scheduled for execution, how can I see the order in which JS gets queued?
I've started with something like this, but this doesn't give me a fully reliable timeline.
const {
setTimeout,
setInterval,
} = window;
window._jsq = [];
window._record = f => {
window._jsq.push([f, new Error().stack]);
};
window.setTimeout = (...a) => {
window._record(a[0]);
return setTimeout.apply(window, a);
};
window.setInterval = (...a) => {
window._record(a[0]);
return setInterval.apply(window, a);
};
I'll take a crack at my own question from the angle of the OP snippet. Corrections appreciated.
Assuming you cannot see the message queue (or at least the scripts queued), you can still see the code that is scheduling other JS and the code that is scheduled to run. So, tracking both independently is possible.
This is not all good news because you still have to do legwork to 1) adapt that tracking to the various ways JS can get scheduled, and 2) make sense of what you capture.
In the setTimeout case, something quick and dirty like this can at least provide a sense of a scheduling timeline and when things actually happened. That's just a matter of wrapping functions.
const { setTimeout } = window;
// For visibility in DevTools console
window._schedulers = [];
window._calls = [];
const wrap = f => {
const { stack } = new Error();
window._schedulers.push([stack, f]);
return (...a) => {
window._calls.push([stack, f, a]);
return f(...a);
};
};
window.setTimeout = (f, delay, ...a) => {
return setTimeout.apply(window, [wrap(f), delay].concat(a));
};
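For example, after letting the page run for a bit you can poke at the captured data in the console:

setTimeout(() => console.log("tick"), 500);
// ...later, in the DevTools console:
window._schedulers.length;  // how many callbacks were scheduled
window._calls.length;       // how many actually ran
window._schedulers[0][0];   // stack trace of the first scheduling site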
Still, that's just one case; it says nothing about when to start/stop monitoring, and there are other trigger points where traceability is a concern, as Mosè Raguzzini mentioned. In the case of Promises, this answer calls out Bluebird's checking facilities.
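The same wrapping idea can be extended to promise reactions. A sketch only: patching Promise.prototype.then is intrusive and should be limited to debugging sessions, and note that native await bypasses the patched then entirely:

const originalThen = Promise.prototype.then;
Promise.prototype.then = function (onFulfilled, onRejected) {
  const { stack } = new Error();
  // Wrap each reaction so its invocation is recorded alongside timer callbacks
  const record = f =>
    typeof f === "function"
      ? (...a) => { window._calls.push([stack, f, a]); return f(...a); }
      : f;
  return originalThen.call(this, record(onFulfilled), record(onRejected));
};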
It seems that until more native tools come out that visualize queued scripts and related info, you are stuck collecting and analyzing the data by hand.
There is no built-in automatic debugging tool for monitoring your browser event loop.
In order to monitor the browser's event loop you have to explicitly monitor the events you are interested in and pass them to the (in this case Chrome's) DevTools:
monitorEvents(document.body, "click");
More info about monitoring events in Chrome Dev Tools
Note #1: You don't know how custom events are called. They may not dispatch an event into the DOM (e.g. some libraries implement their own event registration and handling systems) so there is no general way of knowing when event listeners are being called, even if you can track the dispatch of the event.
Some libraries also simulate event bubbling, but again, unless you know the type of event, you can't listen for it.
However, you could implement your own event management system and implement a function to listen for all events for which listeners are set or events dispatched using your system.
Ref: How can I monitor all custom events emitted in the browser?
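If you only care about DOM events, one blunt instrument is a capture-phase listener on the document for each event type you want to watch (you have to enumerate the types yourself; there is no wildcard):

// Log every occurrence of the listed event types; the capture phase
// sees them before any handler lower in the tree can stop propagation
["click", "keydown", "submit", "focusin"].forEach(type =>
  document.addEventListener(type, e => console.log(type, "on", e.target), true)
);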
Note #2: a modern JS approach to events (i.e. React/Redux) involves dispatching actions instead of events. As actions are often logged for time-travel debugging purposes, monitoring events in this case is unnecessary.
iOS doesn't allow Web Audio to be used without a user input to trigger it. To get around this, we have a touchend listener that does this:
initAudio: function () {
// create a blank buffer
var buffer = this._atx.createBuffer(1, 1, 22050); // this._atx is the AudioContext
var node = this._atx.createBufferSource();
node.buffer = buffer;
node.start(0); // or noteOn if the browser doesn't support this, removed that check/code for brevity
}
This is working fine in most cases. We have an overlay over the game, which intercepts the first click, and calls the above function. At the end of the function, this._atx.state is "running" (it is "suspended" before the node.start(0) call).
However, if the callback is triggered by a quick swipe on the screen, rather than a tap, it goes through this code, and at the end of the function, the state is still "suspended". It does exactly the same code both times, the only difference is the nature of the user input.
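For comparison, the explicit way to do the same unlock is to call resume() on the context from inside the gesture handler. A sketch in the same style as initAudio above (whether it behaves differently for swipes on iOS is exactly what is in question here):

initAudio: function () {
    // resume() returns a Promise and must be called from a user gesture
    if (this._atx.state === "suspended") {
        this._atx.resume().then(function () {
            console.log("state after resume:", this._atx.state);
        }.bind(this));
    }
}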
This is the code that adds the listeners:
this._boundSoundTapped = this._onSoundTapped.bind(this);
this._confirmPopup.addEventListener("touchend", this._boundSoundTapped);
this._confirmPopup.addEventListener("click", this._boundSoundTapped);
And the onSoundTapped function:
_onSoundTapped: function(e){
e.stopPropagation();
e.preventDefault();
if(this._soundsPressed === false) {
this._confirmPopup.removeEventListener("touchend", this._boundSoundTapped);
this._confirmPopup.removeEventListener("click", this._boundSoundTapped);
this._soundsPressed = true;
this._player.initAudio();
}
},
I really can't see why the swipe rather than the click would have a different effect, they both trigger touchend, and the same code gets executed either way.
I have come across an issue, specifically in IE11, in the code base I am working with.
Other versions of IE allow me to listen for a specific event called 'OpenStateChanged', which is fired by Windows Media Player, like so:
document.getElementById('video-player').attachEvent("OpenStateChanged",
(newState) =>
)
When running the code in IE11, I am getting the error Object doesn't support property or method 'attachEvent' which I understand as it is no longer supported. So I have modified my code to check if you can use the newer addEventListener.
if (document.getElementById('video-player').addEventListener)
document.getElementById('video-player').addEventListener("OpenStateChanged",
(newState) =>
console.log newState
#MediaOpen state
if newState == 13
# do stuff
)
else
document.getElementById('video-player').attachEvent("OpenStateChanged",
(newState) =>
#MediaOpen state
if newState == 13
# do stuff
)
This is alright and runs without throwing errors; however, the event handler is never fired. I have noticed that names like onclick get changed to click, but I have seen no documentation on event names for Windows Media Player.
The syntax is CoffeeScript; I can provide a JS alternative if required.
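For reference, a direct JS translation of the CoffeeScript above:

var player = document.getElementById('video-player');
if (player.addEventListener) {
  player.addEventListener("OpenStateChanged", function (newState) {
    console.log(newState);
    // MediaOpen state
    if (newState === 13) {
      // do stuff
    }
  });
} else {
  player.attachEvent("OpenStateChanged", function (newState) {
    // MediaOpen state
    if (newState === 13) {
      // do stuff
    }
  });
}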
The W3C specifies a list of events and their corresponding timings that user agents must return if they want to support the Navigation Timing API.
A list you can see here: http://www.w3.org/TR/navigation-timing/#process
Understanding which process relates to which event is pretty straightforward in most cases. But one thing that eludes me is what goes on between domContentLoadedEventStart and domContentLoadedEventEnd.
Here is what I have understood so far and base my reflections on:
1. domLoading // The UA starts parsing the document.
2. domInteractive // The UA has finished parsing the document. Users can interact with the page.
3. domContentLoaded // The document has been completely loaded and parsed, and deferred scripts, if any, have executed. (Async scripts, if any, might or might not have executed?)
4. domComplete // The DOM tree is completely built. Async scripts, if any, have executed.
5. loadEventEnd // The UA has a fully completed page. All resources, like images, swf, etc., have loaded.
One should be able to deduce what happens after phase #3 (domContentLoaded) by understanding what triggered event #4 (domComplete) but did not trigger previous events.
So one would think that “Async scripts, if any, have executed” means that asynchronous scripts get executed after phase #3 but before event #4. But according to my tests, this is not what happens, unless my test is wrong. (I tried to replicate my test on JSFiddle, but I can’t make the deferred/async script work since there is no way to add attributes to external scripts there.)
So my question is: What process(es) takes place between domContentLoadedEventStart and domContentLoadedEventEnd?
Those timings have to do with the DOMContentLoaded event. It's similar to the load event with loadEventStart and loadEventEnd; instead of load, you use DOMContentLoaded.
For example, adding a DOMContentLoaded listener and running some code in it should give you different start and end times.
document.addEventListener("DOMContentLoaded", function(event) {
var j = 0;
for (var i = 0; i < 10000000; i++) {
j = i;
}
});
Once that event has run, the Navigation Timing API will return different timestamps for the start and end, depending on how long your event handler(s) take to run.
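You can verify this by reading the two attributes after the page has loaded:

// performance.timing is the (Level 1) Navigation Timing interface
window.addEventListener("load", function () {
  var t = performance.timing;
  console.log("DOMContentLoaded handlers took",
    t.domContentLoadedEventEnd - t.domContentLoadedEventStart, "ms");
});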
From the W3C documentation you pointed out, I believe there are no other processes going on with these timings.
domContentLoadedEventStart attribute
This attribute must return the time immediately before the user agent fires the DOMContentLoaded event at the Document.
domContentLoadedEventEnd attribute
This attribute must return the time immediately after the document's DOMContentLoaded event completes.