I have this in my page:
<video src="blob://some-live-stream" autoplay></video>
<div id="hideMePlease"> hide 1 sec before video ends</div>
I would like to hide the div one second before the video ends. How can I do that?
N.B.: I can't know the video duration; it's a live stream, and the video stops by itself, so I have no way to stop it myself.
If, as you state, you cannot know the length of the video because it's streaming, it will be impossible (relativistic time travel notwithstanding) for you to schedule an event one second before it finishes.
However, I would at least first try to use the duration property of the video, it may be that metadata is made available as part of the stream early on. If it is, you can use that to schedule the hiding of your div.
As an aside, if you visit the page http://www.w3.org/2010/05/video/mediaevents.html, you'll find that the duration is set correctly as soon as you start playing the video, despite the fact that it seems to be streaming from an MP4, OGG or WebM file. So it's at least possible, though it may depend on the data stream itself (whether the metadata comes before or after the actual video data).
If the data is not available (I believe you get Infinity for the duration in that case), then you're just going to have to hide the div at the earliest possible time after that.
That would presumably be when it finishes (fires the onended event).
So, in short: if you can get the duration (or, perhaps better, periodically compute the time remaining), use that. Otherwise, fall back to hiding the div at the end, and start hassling the W3C to provide the functionality you want in HTML6.
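To make that concrete, here is a minimal sketch combining both approaches. It assumes the `<video>` element and `#hideMePlease` div from the question, and pulls the one-second check into a small helper so the fallback path stays obvious:

```javascript
// Core check: are we within one second of the end, given a finite duration?
function nearEnd(duration, currentTime) {
  return isFinite(duration) && duration - currentTime <= 1;
}

if (typeof document !== 'undefined') { // browser-only wiring
  var video = document.querySelector('video');
  var div = document.getElementById('hideMePlease');

  // If the stream's metadata gives us a real duration, timeupdate fires
  // several times a second, so polling it is good enough here.
  video.addEventListener('timeupdate', function () {
    if (nearEnd(video.duration, video.currentTime)) {
      div.style.display = 'none';
    }
  });

  // Fallback: duration stayed Infinity, so hide as soon as the stream ends.
  video.addEventListener('ended', function () {
    div.style.display = 'none';
  });
}
```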
Related
Hi, I am calling two different web services from my Angular script. The first service returns text, which I use as input to a second text-to-audio service. In some cases the text is a long string and the audio file takes time to load, but meanwhile the text output is already printed on screen. My requirement is to print the text output on screen only once audio playback is available. I am using the HTML5 audio tag for playback. Please let me know if there is any other way to do this.
You need to add an event listener for the 'canplaythrough' event on the audio element. Register a callback that displays the text and starts the audio once that event fires.
There isn't actually an event that fires when the media has loaded completely; canplaythrough is just the browser's estimate that playback can continue to the end without interruption, based on the amount buffered and the current download speed. This isn't exactly foolproof, but it's the best option available.
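A minimal sketch of that wiring. The names are illustrative assumptions, not part of your code: `audioUrl` stands for the file from the text-to-audio service, `textOutput` for the first service's response, and `output` for whatever element you print into:

```javascript
// Hold the text until the browser estimates the audio can play through.
// The listener removes itself so the callback only fires once.
function showWhenPlayable(audio, onReady) {
  audio.addEventListener('canplaythrough', function handler() {
    audio.removeEventListener('canplaythrough', handler);
    onReady();
  });
}

if (typeof document !== 'undefined') { // browser-only wiring
  var audio = new Audio(audioUrl);           // audioUrl: assumed variable
  showWhenPlayable(audio, function () {
    // Only now print the text and start playback together.
    document.getElementById('output').textContent = textOutput;
    audio.play();
  });
}
```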
Is there any way to detect changes in the value returned by getCurrentTime() in the YouTube API?
Purpose: to play a single video for a limited time (< 1 min) on my page and then move on to the next video.
I was thinking of using Object.watch, but it only works for properties, not functions.
I also tried to hook into the original source code at https://s.ytimg.com/yts/jsbin/www-widgetapi-vfl9twtxR.js, but it was too complicated.
Because the YouTube player is wrapped in an iFrame, you are dependent on what the player exposes, and while they've exposed the current time via the getCurrentTime() function, they haven't exposed any events raised whenever the time might get updated (in fact, the time is only updated to the exposed function 6 or 7 times per second, and it isn't always consistent).
So your only option is to set up a JavaScript timer. There are lots of ways to do that; a simple one would be:
setInterval(function () {
    // here you'd raise some sort of event based on the value of getCurrentTime()
}, 100); // polling 10 times a second, to make sure you catch every change
As the postMessage from the iFrame isn't always perfectly consistent, you could have this interval timer be greater. Or you could use something like requestAnimationFrame to poll 60 times a second.
The first example at
https://developers.google.com/youtube/iframe_api_reference#Getting_Started
gives you almost exactly what you need to get started. The code there starts a timer once the video starts playing, then waits 6 seconds and stops the video. What you'd want to do instead of stopping the video after 6 seconds is to fire up a new YT.Player instance on the same div after 60 seconds.
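A sketch of that idea following the Getting Started example. The video IDs and the 'player' div id are placeholders, and it assumes the iframe API script is already loaded; as a variation, it reuses one player via the documented loadVideoById() call rather than constructing a fresh YT.Player each time (either works):

```javascript
var playlist = ['VIDEO_ID_1', 'VIDEO_ID_2']; // placeholder IDs
var index = 0;
var player;
var timer;

// Pure helper: which video comes next (wraps around at the end).
function nextIndex(i, length) {
  return (i + 1) % length;
}

// Called automatically by the iframe API once it has loaded.
function onYouTubeIframeAPIReady() {
  player = new YT.Player('player', {
    videoId: playlist[index],
    events: { onStateChange: onPlayerStateChange }
  });
}

function onPlayerStateChange(event) {
  if (event.data === YT.PlayerState.PLAYING) {
    clearTimeout(timer);                 // avoid duplicate timers after buffering
    timer = setTimeout(playNext, 60000); // 60 seconds per video
  }
}

function playNext() {
  index = nextIndex(index, playlist.length);
  player.loadVideoById(playlist[index]); // reuse the same player instance
}
```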
There is a web page with an HTML5 video in it. When the user clicks play, or navigates through the timeline, the video starts (either from the beginning or from the position he selected). But it does not always happen instantly. I want to measure how much time passes between the user's click event and the moment the user receives the first bytes of the video.
Getting the time of the user click is not a problem, but looking through the HTML5 video API, I was not able to find any event close to what I am looking for.
Is it possible to track such an event?
The event(s) you listen for after you receive the click (or "play" or "seeking") event depends on the state of the video before the time of the click.
If you have a fresh, unplayed video element with the preload attribute set to "none", then the first data you're going to receive from the network is the metadata, so you can listen for the "loadedmetadata" event.
If preload is set to "metadata", you might have already loaded metadata, depending on the browser and platform. (e.g., Safari on iPad will not load metadata or anything else until the first user interaction.) In that case, you want to listen for either "loadedmetadata" or "progress". It couldn't hurt to listen for "loadeddata" as well, but I think "progress" fires first.
If preload is set to "auto" or if you've already played some of the video, you might have some actual video data. And while you're likely to have data at the current point on the timeline, you may or may not have it at the seek destination. It depends at least on how far ahead (or behind) you're seeking, how fast data is coming in and how much spare room the browser has in the media cache.
If there is no data at the destination time (you can check this in advance if you want with the .buffered property, see TimeRanges), then the next event you see will be either "loadeddata" or "progress", probably followed by "canplay". If there is enough data buffered at the target time of the seek, then the question doesn't really apply because nothing else will be transferred.
However, in any of the above cases, once there is enough data to display the frame at the new point on that timeline and that data has been decoded, the "seeked" event will fire. So if you were to only pick one (no reason you can't use more), this is the one to pick.
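If you want to put numbers on it, here is a small sketch that times the gap between "seeking" and "seeked". The clock is injected so the helper stays testable; the element lookup assumes a single video on the page:

```javascript
// A tiny stopwatch; `now` is injectable (e.g. performance.now in the browser).
function makeTimer(now) {
  var start = null;
  return {
    begin: function () { start = now(); },
    elapsed: function () { return start === null ? null : now() - start; }
  };
}

if (typeof document !== 'undefined') { // browser-only wiring
  var video = document.querySelector('video');
  var timer = makeTimer(function () { return performance.now(); });

  // "seeking" fires when the user starts the seek, "seeked" once the frame
  // at the destination has been fetched and decoded.
  video.addEventListener('seeking', function () { timer.begin(); });
  video.addEventListener('seeked', function () {
    console.log('seek completed in ' + timer.elapsed() + ' ms');
  });
}
```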
I'm developing a Javascript music app.
Offline rendering works fine, i.e. generate a buffer, play it at a given time using an AudioBufferSourceNode. Timing is almost perfect.
But I need to generate tones that cannot be created using the default nodes of the API. So I put my sound generating logic inside the callback of a ScriptProcessorNode.
I'm fairly certain that the code in my ScriptProcessorNode is fast enough, because once started the sound plays without a glitch for any number of buffer periods that I want - so filling the buffer in time is probably not the issue here. From my experiments I figured out that the onaudioprocess event of the ScriptProcessorNode is fired at regular intervals, not depending on when the processor node was created. This creates unpredictable latency in the app: if the user presses a key right after a callback has started then it waits till the next period to play.
I've created this fiddle to demonstrate it. There is a simple instrument and two buttons to control it. One plays a pre-recorded buffer:
function playBuffer() {
    source = ac.createBufferSource();
    source.buffer = buffer;
    source.connect(ac.destination);
    source.start(0);
}
and the other one plays the same sound but live:
function playLive() {
    processor = ac.createScriptProcessor(4096, 0, 1);
    processor.onaudioprocess = function (e) {
        sineStrument.fillBuffer(e.outputBuffer.getChannelData(0), e.outputBuffer.length);
    };
    processor.connect(ac.destination);
}
Using the first button you can generate a rhythm and hear that it works flawlessly. Using the second button you can't because it takes around 50+ms for the sound to start.
Now notice that the instrument is really simple, I don't think I have a speed of computation issue here. Plus, if you time it right, you can get the live processed sound to play in sync with your clicks - I figure you "only" need to click just before the onaudioprocess callback is called.
The facts that 1) the playBuffer function plays immediately and 2) it is possible to get the correct timing with the playLive function tell me that there should be a technical way to get a ScriptProcessorNode timed right.
How can I do it? How come playing a buffer doesn't have fixed starting times?
I've tried reducing the buffer size of the ScriptProcessorNode too, but sound gets distorted very quickly. Why is it so? If the code in the callback wasn't fast enough wouldn't the sound have glitches after a while? It does not!
Connect your ScriptProcessorNode to a GainNode with gain value initialized to 0. Make the "Play Live" button set the gain value to 1 when pressed, 0 when released.
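A sketch of that wiring, with the setup pulled into a function so the gating logic is reusable. The button id is hypothetical, and `ac` / `sineStrument` are assumed to be the objects from your fiddle; the point is that the processor runs continuously, so pressing the button only opens a gate and pays no startup latency:

```javascript
// Wire processor -> gain(0) -> destination; the button toggles the gain.
function wireGate(ac, fillBuffer, button) {
  var processor = ac.createScriptProcessor(4096, 0, 1);
  var gate = ac.createGain();
  gate.gain.value = 0; // silent until the button is pressed

  processor.onaudioprocess = function (e) {
    // Keep generating sound even while the gate is closed.
    fillBuffer(e.outputBuffer.getChannelData(0), e.outputBuffer.length);
  };

  processor.connect(gate);
  gate.connect(ac.destination);

  button.addEventListener('mousedown', function () { gate.gain.value = 1; });
  button.addEventListener('mouseup',   function () { gate.gain.value = 0; });
  return gate;
}

// In the page (assumes `ac` and `sineStrument` from the question):
// wireGate(ac, sineStrument.fillBuffer.bind(sineStrument),
//          document.getElementById('playLive')); // hypothetical button id
```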
I don't know how this happens and I can't see any errors.
I can't seem to navigate through the video the second time I open my page.
See screenshot here:
I have found this error; it says:
TypeError: Floating-point value is not finite.
"Video is not ready. (Video.js)"
Help would be really appreciated.
Thanks
When you say "I can't seem to navigate through the video the second time I open my page", do you mean you aren't able to play the video at all, or that you aren't able to fast-forward and rewind within the playing video?
That you are getting a TypeError: Floating-point value is not finite means that a certain parameter you're supplying to video.js is of the wrong type, i.e. you're probably supplying video.js with a string when it wants a number (or something similar).
Because it works the first time you load the page: are you trying to resume playback where you left off when you navigated away from the page?
If that's the case, you are probably storing the currentTime value in a cookie or localStorage as a string (with jQuery cookies, for example, values usually get automatically stringified) and forgetting to convert it back to a number before video.js reads it back. What I notice about your screenshot is that video.js doesn't seem to know the duration of your video (the seek time says 0:31 / 0:00).
Here's what you should do:
Clear your cache to get a working first-time player load, then:
Before playback starts, after playback has finished, and during playback, log the current playback time signature, e.g.: console.log(videojs("id-of-your-video").currentTime())
Adding time signature console.logs() to these video.js event callbacks should help you:
durationchange (fired if the total duration of your video changes)
duration() (the method that tells you the duration of your video; try logging it while things work and again after they stop working, and see what's changed in the value)
If you're still having trouble, try logging values in the video.js timeupdate callback. This event fires when the current playback position changes (possibly several times a second). If all else fails, this callback might give you some insight into the exact moment you get the invalid value, but it won't help you if your problem is resuming playback from a .currentTime() value stored with the wrong type in a cookie / sessionStorage / localStorage etc.
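If the stored-as-string theory is right, the fix is just to coerce the value before seeking. A sketch, where the localStorage key name is purely illustrative:

```javascript
// Coerce a stored playback position to a number before handing it to video.js.
// Returns the parsed value so callers can inspect it.
function restorePosition(player, stored) {
  var t = parseFloat(stored);      // cookie/localStorage values come back as strings
  if (isFinite(t) && t >= 0) {     // guard against NaN / "Infinity" / garbage
    player.currentTime(t);
  }
  return t;
}

if (typeof document !== 'undefined') { // browser-only wiring
  restorePosition(videojs('id-of-your-video'),
                  localStorage.getItem('resume-time')); // hypothetical key
}
```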
Are you using a server to run it, or are you running it locally? I had similar issues when I ran it locally, but when I put the files on a server (Tomcat, IIS, or whatever) it worked. Not sure about the issue; just a guess.