I have just started using the audio element's buffered attribute. I have only tested it in Google Chrome (the Canary build). So far, the data in the buffered object has been different from what the default audio interface displays.
I have made two jsfiddles showing the two ways I have implemented it.
The first is the way I would imagine it working: http://jsfiddle.net/VB7Z8/20/
The second is the way it actually seems to work: http://jsfiddle.net/VB7Z8/18/
To see what I mean, go to each fiddle (making sure your cache is cleared so there is something to buffer) and seek to the right side of the player, leaving a gap for it to buffer. I would expect that if you then go back to the hole in the middle, the audio would not play; in fact, it has been buffered, but the buffered object has not been updated.
I have tested it with all the events and I still get the same result. Have I done something wrong, or is this a problem with Google Chrome? I realise this is still at the draft stage of the standard.
This appears to have been changed at some point in Chrome's development, and it now works as the standard describes.
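For anyone else poking at this: the buffered attribute exposes a TimeRanges object, which has to be read through its start(i)/end(i) methods. A small helper (my own, not part of any API) makes it easy to log its contents on each progress event:

```javascript
// Convert a TimeRanges-like object into a plain array of [start, end]
// pairs so its contents can be logged or compared between events.
function rangesToArray(buffered) {
  var ranges = [];
  for (var i = 0; i < buffered.length; i++) {
    ranges.push([buffered.start(i), buffered.end(i)]);
  }
  return ranges;
}

// In the page:
// audio.addEventListener('progress', function () {
//   console.log(rangesToArray(audio.buffered));
// });
```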
Related
I'm using Chrome's desktopCapture in an extension, and I have an issue that I'm attempting to work around. Any help would be greatly appreciated. I cannot post any source, but the Chrome extension itself is commonly available and used on the web.
Issue
The issue is with resize / dimension changes that may occur while desktopCapture is capturing / streaming to the server. These changes can often occur seemingly too fast for my client to handle, causing the client application to crash.
Solution
I'd like to get some event or notification when the capturing end detects a resize of the area being captured; for instance, a window that has been clicked and is being dragged to a new size.
An alternative would be if the event.data can be queried for width / height.
Research
I've Googled and searched the Chrome/WebRTC issue trackers; I've come up empty so far. There really isn't much good implementation information available, from what I've found.
Going through the Chromium codebase is not an option for me; I am not a C/C++ developer.
What I would like from You
If you have experience with the desktopCapture offering, please share what you know. If you don't have any idea what I'm asking or have nothing constructive to add, please ignore this and move on.
Commentary
As of July 17th, 2015, it would appear that there is a bug or missing support for resize events in Chrome's desktopCapture extension API. I will file an enhancement request with them and see where that goes. It probably doesn't help that "normal" WebRTC streams aren't "expected" to change dimensions during streaming, and thus it is not handled.
Attach the captured stream to a video element and listen for its resize event. This should also work for a hidden element if you don't want to display anything at the capturing end.
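A rough sketch of that approach; the makeSizeTracker helper is my own invention (not part of any Chrome API) to filter out events where the dimensions didn't actually change:

```javascript
// Remember the last known dimensions and report whether a new
// width/height pair is actually a change.
function makeSizeTracker() {
  var last = { width: 0, height: 0 };
  return function (width, height) {
    var changed = width !== last.width || height !== last.height;
    last.width = width;
    last.height = height;
    return changed;
  };
}

// In the extension (sketch):
// var track = makeSizeTracker();
// video.srcObject = stream; // the desktopCapture MediaStream
// video.addEventListener('resize', function () {
//   if (track(video.videoWidth, video.videoHeight)) {
//     // notify the client that the capture dimensions changed
//   }
// });
```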
I'm working on a project right now and I've written some code using Pixi.js that produces strange results in Google Chrome. Specifically, drawing a sprite with a texture seems to be creating a strange issue where multiple loaded textures are drawn on top of each other, when only one was requested. (e.g. I say "load a cat, load a dog, draw a cat" and for some reason I see a cat on top of a dog.)
I don't see this issue in Firefox or in Safari. I'm not sure if this is my own bug, a bug in Pixi.js, or a bug in the browser. It doesn't really matter, because that's not really what this question is about-- I'm just telling this story for context.
My question is: what is the general workflow for determining whether or not a bug is my own, or a problem with the browser? Is there some standard process for debugging browsers?
I'm not sure if there's a standard process, but from my experience with PIXI, I've found that when the bug is in my code it generally shows up in all the browsers.
Browsers often differ in how they handle HTML/CSS, but they seem to render the canvas the same way. So if the issue is with the general layout of the canvas or other DOM elements, I would suspect a browser issue.
But if the problem is with rendering a PIXI component on the stage, it is more likely either a PIXI bug or a bug in your code. Keep in mind PIXI.js renders using WebGL if available and otherwise falls back to the HTML5 canvas, so I would check that first by turning WebGL on and off within the same browser and seeing if it makes a difference.
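For example, something along these lines; createRenderer is my own debugging helper, and the constructor names match the Pixi versions of that era, so check them against yours:

```javascript
// Hypothetical debugging switch: force Pixi to use the canvas renderer
// instead of WebGL so the two rendering paths can be compared directly.
function createRenderer(width, height, forceCanvas, PIXI) {
  if (forceCanvas) {
    return new PIXI.CanvasRenderer(width, height); // HTML5 canvas path
  }
  return PIXI.autoDetectRenderer(width, height); // WebGL when available
}

// Flip forceCanvas between true and false and see whether the
// dog-under-the-cat artifact appears in both modes.
```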
If you're curious to know why your dog didn't render correctly, go ahead and post some code :)
I'm trying to create an audio playlist with jPlayer that will preload the next song in the list for seamless transitions, but I'm having some trouble with the looping feature.
I've got my code set up here: http://jsfiddle.net/Mz74e/
It uses 2 different players in order to preload the clips. The graphic UI is just shown for debugging, since I don't plan on using it in my final setup.
The current setup seems to loop properly in Firefox 12 and the Flash player, but not in Google Chrome 18 (it just goes to the beginning of the clip and sits there).
If anyone can look at how I'm using it and figure out either a workaround or a better implementation, that would be great!
Turns out that the problem was not where I originally thought it was.
This exact same code works fine if the audio file is statically served by the web server.
I found this code here for the rangeDownload function:
http://mobiforge.com/developing/story/content-delivery-mobile-devices
Using this, the files are served correctly and can be played by HTML5 audio.
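The essential part of that rangeDownload function, for this problem, is honoring the Range request header so the server can reply with 206 Partial Content; Chrome's audio element relies on byte-range support to seek and loop. A minimal sketch of the header-parsing step (my own illustration, not the mobiforge code):

```javascript
// Parse an HTTP Range header like "bytes=0-499", "bytes=500-" or
// "bytes=-200" into absolute start/end offsets, or null if unusable.
function parseRange(header, fileSize) {
  var m = /^bytes=(\d*)-(\d*)$/.exec(header || '');
  if (!m || (m[1] === '' && m[2] === '')) return null;
  var start = m[1] === '' ? fileSize - Number(m[2]) : Number(m[1]);
  var end = m[2] === '' || m[1] === '' ? fileSize - 1 : Number(m[2]);
  if (start < 0 || start > end || end >= fileSize) return null;
  return { start: start, end: end };
}

// A server would then respond with status 206, a Content-Range header
// ("bytes start-end/fileSize"), and only that slice of the file.
```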
I am trying to create a feature where a user can change (back and forth) between multiple videos while maintaining a single consistent audio. Think of being able to watch a concert from multiple angles but listening to a single audio. The trouble I am having with this feature is that there can not be a lag between the changes in video or the audio will no longer sync with the videos (especially true after multiple changes).
I have tried two methods, both using html5 only (I would prefer not use flash although I will eventually have a fallback) that have not worked seamlessly, although depending on the browser and hardware, it can come very close.
Basic Methods:
Method 1: Preloading all videos and changing the video src path on each click using javascript
Method 2: Again preloading video and using multiple tags and changing between them using javascript on each click.
Is there any way to get either of these two methods to work seamlessly, without a gap? Should I be using a sleight-of-hand trick, like playing both videos concurrently for a second before revealing the second and stopping the first? Can this just not be done with HTML5 players? Can it be done with Flash?
I have seen this type of question a couple of times with both video and audio with no clear solution, but they were a couple of months old and I was hoping there is now a solution. Thanks for the help.
Worth adding that it is possible with the MediaSource API proposed by Google. This API lets you feed arbitrary binary data to a single video element, so if you have your video split into chunks, you can fetch those chunks via XHR and append them to your video element; they'll be played without gaps.
Currently it's implemented only in Chrome, and you need to enable "Enable Media Source API on <video> elements" in chrome://flags to use it. Also, only the WebM container is currently supported.
Here is an article on HTML5Rocks that demonstrates how the API works: "Stream" video using the MediaSource API.
Another useful article that talks about chunked playlist: Segmenting WebM Video and the MediaSource API.
I hope this implementation gets adopted and gets wider media container support.
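To give a rough idea of the flow: a SourceBuffer accepts only one appendBuffer at a time, so fetched chunks have to be queued and fed in on each updateend event. A sketch, where makeAppendQueue is a hypothetical helper of mine:

```javascript
// Queue chunks so only one appendBuffer call is in flight at a time.
function makeAppendQueue(appendFn) {
  var queue = [];
  var busy = false;
  function next() {
    if (busy || queue.length === 0) return;
    busy = true;
    appendFn(queue.shift());
  }
  return {
    push: function (chunk) { queue.push(chunk); next(); }, // new XHR result
    done: function () { busy = false; next(); }            // from 'updateend'
  };
}

// Wiring (sketch): on the MediaSource 'sourceopen' event, create the
// SourceBuffer, push each XHR-fetched ArrayBuffer with q.push(data),
// call q.done() from the SourceBuffer's 'updateend' handler, and call
// mediaSource.endOfStream() once the last chunk has been appended.
```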
UPDATE JUN 2014: Browser support is slowly getting better (thanks @Hugh Guiney for the tip):
Chrome Stable
FF 25+ has a flag media.mediasource.enabled [MDN]
IE 11+ on Windows 8.1 [MSDN]
Did you find a better way to do that?
I implemented a double-buffered playback using two video tags.
One is used for the current playback, and the second for preloading the next video.
When the video ends I "swap" the tags:
function visualSwap() {
  // Hand the audible volume over to the element being revealed
  // and mute the other one.
  video.volume = video2.volume;
  video2.volume = 0;
  // Reveal one element and collapse the other instead of changing src,
  // so the hidden element can keep buffering.
  video.style.width = '300px';
  video2.style.width = '0px';
}
It has some non-deterministic behavior, so I am not 100% satisfied, but it's worth trying...
Changing the src attribute is fast, but not gapless. I'm trying to find the best method for a media player I'm creating, and preloading the next track then switching the src on "ended" leaves a gap of about 10-20 ms, which may sound tiny, but it's enough to be noticeable, especially with music.
I've just tested using a second audio element that starts as soon as the first one fires its 'ended' event, and that incurred the same tiny gap.
Looks like (without resorting to elaborate hacks) there isn't a simple(ish) way of achieving gapless playback, at least right now.
It is possible. You can check this out: http://evelyn-interactive.searchingforabby.com/ It's all done in HTML5. They preload all the videos at the beginning and start them at the same time. I haven't had time yet to check exactly how they're doing it, but maybe it helps if you look at their scripts via Firebug.
After many attempts, I did end up using something similar to Method 2. I found the site http://switchcam.com and basically copied their approach. I pre-buffered as the video start time approached and then auto-played as the video's starting point hit. I had all the videos playing simultaneously (in a little div, as a UI bonus), and users could toggle between them, switching the "main screen" view. Since all the videos were playing at once, you could choose the audio, and the gap didn't end up being an issue.
Unfortunately, I never ended up solving my problem exactly and I am not sure my solution has the best performance, but it works okay with a fast connection.
Thanks for everyone's help!
For example, I want the page to play an audio file while at the same time have some bullets slide into view at just the right moment that said bullet is talked about in the audio file. A similar effect would also be used for closed captioning. When I say reliable I mean specifically that the timing will be consistent across many common platforms (browser/OS/CPU/etc) as well as consistent in different sessions on the same platform (they hit refresh, it works again just as it did before, etc).
NOTE: It's OK if the answer is 'NO', but please include at least a little quip about why that is.
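To make it concrete, here is roughly what I have in mind, driving the bullets from the audio element's currentTime; the cue times and ids below are made up:

```javascript
// Hypothetical cue table: when (in seconds) each bullet should appear.
var cues = [
  { time: 2.0, id: 'bullet-1' },
  { time: 5.5, id: 'bullet-2' },
  { time: 9.0, id: 'bullet-3' }
];

// Return the ids of all cues that should be visible at a given time.
// Deriving visibility from currentTime (rather than timers) means a
// refresh or seek lands in a consistent state.
function visibleCues(cues, currentTime) {
  return cues.filter(function (c) { return c.time <= currentTime; })
             .map(function (c) { return c.id; });
}

// In the page:
// audio.addEventListener('timeupdate', function () {
//   visibleCues(cues, audio.currentTime).forEach(showBullet);
// });
```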
Check out this animation, which synchronizes a 3D SVG effect to an audio file.
The technique is explained in a blog post at http://mrdoob.com/blog/page/3; look for the one entitled "svg tag + audio tag = 3D waveform". The key is to create a table of volume values corresponding to the audio file.
You'll obviously have some work to do in studying this example and the JavaScript it uses to adapt it to your scenario. And it will probably only work in browsers that support HTML5.
Given the current situation and HTML5 support, I would solve this using Flash.