HTML5 audio starts from the wrong position in Firefox - javascript

I'm trying to play an mp3 file and I want to jump to a specific location in the file. In Chrome 33 on Windows, the file jumps to the correct position (as compared with VLC playing the mp3 locally), but in Firefox 28 on Windows it plays too far forward, and in Internet Explorer 11 it plays too far behind.
It used to work correctly in Firefox 27 and earlier.
Is there a better way of doing this?
EDIT: The problem doesn't even require SoundManager2. You can replicate the same issue with just the <audio> tag in Firefox. These two lines are all the code you need to reproduce it:
<audio autoplay id="audio" src="http://ivdemo.chaseits.co.uk/enron/20050204-4026(7550490)a.mp3" controls preload></audio>
<button onclick="javascript:document.getElementById('audio').currentTime = 10;">Jump to 10 secs "...be with us in, er, 1 minute... ok" </button>
Try it here: http://jsfiddle.net/cpickard/29Gt3/
EDIT: Tried with Firefox Nightly, no improvement. I have reported it as bug 994561 in bugzilla. Still looking for a workaround for now.

The problem lies in the variable bit rate (VBR) encoding of the mp3.
Download that mp3 to disk and convert it to fixed bit rate, say with Audacity.
Run the example from disk:
<audio autoplay id="audio" src="./converted.mp3" controls preload></audio>
<button onclick="javascript:document.getElementById('audio').currentTime = 10;">
Jump to 10 secs "...be with us in, er, 1 minute... ok" </button>
and it works fine for me.
So my suggested workaround is to upload an alternative fixed bit rate mp3 file in place of the one you are using. Then it should work in current Firefox.

I work on SoundJS, and while implementing audio sprites I recently ran into similar issues. According to the spec, setting the position of the HTML audio playhead can be inaccurate by up to 300 ms, so that could explain some of the issues you are seeing.
Interestingly, your fiddle plays correctly for me in FF 28 on win 8.1 if I just let it play through from the start.
There are also some known issues with audio length accuracy that may have an effect, which you can read about here.
If you want precision, I would definitely recommend using Web Audio where possible or a library like SoundJS.
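If you do go down the Web Audio route, seeking is precise because you start a decoded buffer at an explicit offset. Here is a minimal sketch (my own illustration, not SoundJS code), assuming a browser with Web Audio support and that the mp3 from the question is served with CORS headers that allow it to be fetched:
var context = new (window.AudioContext || window.webkitAudioContext)();

fetch("http://ivdemo.chaseits.co.uk/enron/20050204-4026(7550490)a.mp3")
    .then(function (response) { return response.arrayBuffer(); })
    .then(function (data) { return context.decodeAudioData(data); })
    .then(function (buffer) {
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        // start(when, offset): begin playback 10 seconds into the buffer.
        source.start(0, 10);
    });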
Hope that helps.

I ran into the same issue and solved it by converting my MP3 file to CBR (constant bit rate). That resolves the mismatch between currentTime and the sound that is actually playing.
Steps to choose the CBR format in Audacity:
Download and install "Audacity" (it's free for any platform)
Open your MP3 file
Click [File] -> [Export] -> [Options] -> [Constant] (See: Converting MP3 to Constant Bit Rate)
Audacity will ask you to provide the LAME MP3 encoder (see: download and install the LAME MP3 encoder)
After converting, there is no inconsistency/asynchrony issue any more.
Also see:
HTML5 audio starts from the wrong position in Firefox
Inconsistent seeking in HTML5 Audio player

I just tried your code with another audio URL (below); it seemed to work, and I did not experience a delay of any sort in Firefox (v29), which I did previously.
<audio autoplay id="audio" src="http://mediaelementjs.com/media/AirReview-Landmarks-02-ChasingCorporate.mp3" controls preload></audio>
I guess that to jump around an audio file, your server must be configured properly.
The client sends byte-range requests to seek and play certain regions of a file, so the server must respond adequately:
In order to support seeking and playing back regions of the media that aren't yet downloaded, Gecko uses HTTP 1.1 byte-range requests to retrieve the media from the seek target position. In addition, if you don't serve X-Content-Duration headers, Gecko uses byte-range requests to seek to the end of the media (assuming you serve the Content-Length header) in order to determine the duration of the media.
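A quick way to check whether your server honors byte-range requests is to send one yourself and inspect the status code. A rough sketch (the URL is a placeholder; run it same-origin or from Node so CORS doesn't get in the way):
// 206 Partial Content means byte-range requests are supported and seeking should work;
// 200 OK means the server ignored the Range header.
fetch("http://example.com/audio.mp3", { headers: { Range: "bytes=0-1" } })
    .then(function (response) {
        console.log("Status:", response.status);
        console.log("Content-Range:", response.headers.get("Content-Range"));
    });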
Hope this helps.
You could also try looking into the Web Audio API for sound-effect-like playback, which gives you some guarantees about playback delays.

After testing the fiddle, it is noticeable that there is some issue with Firefox. After searching for some time, it appears the issue is due to performance lag, but the good news is that someone has found a solution to it. You may want to read this:
http://lowlag.alienbill.com/
A single script will solve it all.

Related

Displaying mp4 streamed video in browser

I'm trying to display a continuous video stream (live-stream) in a browser.
Description:
My client reported that a video stream doesn't work in the Chrome browser. I thought it would be an easy case; I even tried to write a demo to prove that streaming should be possible with just the native HTML5 video tag:
https://github.com/mishaszu/streaming-video-demo
There were no problems with a random video, but the particular video stream on the client side doesn't work.
With this HTML code:
<video id="video-block" width="320" height="200" autoplay>
    <source src="url/to/my/video" type="video/mp4">
</video>
it shows a loader for a while and then dies.
What I know about the stream:
1. Codec used: H264-MPEG-4 AVC (part 10) (avc1)
2. It's a live stream, not a file, so I can't run a tool like MP4Box on it from a terminal
3. Because it's a live stream, it probably doesn't have an "end of file"
4. I know it's not broken, because VLC is able to display it
5. I tried the native HTML5 video tag with all media type strings (just in case, to check all codecs for mp4)
As I mentioned, trying different MIME types didn't help. I also tried to use MediaSource, but I am really not sure how to use it with a live stream, as all the information I found assumed either:
a) waiting for a promise to resolve and then appending the buffer, or
b) adding an updateend event listener and appending the buffer there.
I think in the case of a live stream that won't work.
Conclusion:
I found a lot of information about how a streamed file might contain metadata (at the beginning of the file or at the end)... and I ended up concluding that maybe I do not fully understand what's going on.
Questions:
What's the proper way to handle an mp4 live stream?
If the native HTML video tag should support the live stream, how do I debug it?
Should I maybe look for something like HLS, but for the mp4 format?
I've gone through the same thing - I needed to mux an incoming live stream from RTSP to HTML5 video, and sadly this can become non-trivial.
So, for a live stream you need fragmented mp4 (check this SO question if you do not know what that is). There is the ISOBMFF specification, which sets rules on what boxes should be present in the stream. In my experience, though, browsers have their own quirks when it comes to a live stream (I had to debug Chrome/Firefox to find them). Chrome has a chrome://media-internals/ tab, which shows the errors for all loaded players - this can help with debugging as well.
So my shortlist to solve this would be:
1) If you say that VLC plays the stream, open the Messages window in VLC (Tools -> Messages), set severity to debug, and you should see the mp4 box info in there as the stream comes in; verify that moof boxes are present
2a) Load the stream in chrome, open chrome://media-internals/ in a new tab and inspect errors
2b) Chrome uses ffmpeg underneath, so you could try playing the stream with ffplay as well and check for any errors.
2c) You are actually incorrect about mp4box - you could simply load a number of starting bytes from the stream, save to a file and use mp4box or other tools on that (at worst it should complain about some corrupted boxes at the end if you cut a box short)
If none of 2a/2b/2c provide any relevant error info that you can fix yourself, update the question with the outputs from these, so that others have more info.
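If the stream does turn out to be valid fragmented mp4, a rough MediaSource sketch for feeding a live HTTP stream into the question's video element could look like the following. This is only an illustration: the codec string is an assumption and must match what the stream actually contains, and error handling is omitted.
var video = document.getElementById("video-block");
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    var queue = [];

    sourceBuffer.addEventListener("updateend", function () {
        // Appends must be serialized: only feed the next chunk when the previous one is done.
        if (queue.length > 0 && !sourceBuffer.updating) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });

    fetch("url/to/my/video").then(function (response) {
        var reader = response.body.getReader();
        function pump() {
            return reader.read().then(function (result) {
                if (result.done) { return; }
                if (sourceBuffer.updating) {
                    queue.push(result.value);
                } else {
                    sourceBuffer.appendBuffer(result.value);
                }
                return pump();
            });
        }
        return pump();
    });
});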

Seeking in HTML5 video with Chrome

I'm having issues seeking in videos using Chrome.
For some reason, no matter what I do, video.seekable.end(0) is always 0.
When I call video.currentTime = 5, followed by console.log(video.currentTime), I see it is always 0, which seems to reset the video.
I've tried both MP4 and VP9-based webm formats, but both gave the same results.
What's more annoying is that Firefox runs everything perfectly. Is there something special about Chrome that I should know about?
Here's my code (which only works in Firefox):
<div class="myvideo">
    <video width="500" height="300" id="video1" preload="auto">
        <source src="data/video1.webm" type="video/webm"/>
        Your browser does not support videos.
    </video>
</div>
And here's the javascript:
var videoDiv = $(".myvideo").children().get(0);
videoDiv.load();
videoDiv.addEventListener("loadeddata", function () {
    console.log("loaded");
    console.log(videoDiv.seekable.end(0)); // Why is this always 0 in Chrome, but not Firefox?
    videoDiv.currentTime = 5;
    console.log(videoDiv.currentTime); // Why is this also always 0 in Chrome, but not Firefox?
});
Note that simply calling videoDiv.play() does actually correctly play the video in both browsers.
Also, after the movie file is fully loaded, the videoDiv.buffered.end(0) also gives correct values in both browsers.
It took me a while to figure it out...
The problem turned out to be server-side. I was using a version of Jetty to serve all my video files, and its simple configuration did not support byte serving.
The difference between Firefox and Chrome is that Firefox will download the entire video file so that you can seek through it, even if the server does not support HTTP code 206 (Partial Content). Chrome, on the other hand, refuses to download the entire file (unless it is really small, around 2-3 MB).
So to get the currentTime property of HTML5 video working in Chrome, you need a server that supports HTTP code 206.
For anyone else having this problem, you can double check your server config with curl:
curl -H Range:bytes=16- -I http://localhost:8080/GOPR0001.mp4
This should return code 206. If it returns code 200, Chrome will not be able to seek the video, but Firefox will, due to a workaround in the browser.
And a final tip: You can use npm http-server to get a simple http-server for a local folder that supports partial content:
npm install http-server -g
And run it to serve a local folder:
http-server -p 8000
A workaround if modifying the server code is unfeasible: make an API call for your video, then pass the blob to URL.createObjectURL and feed the result into the src attribute of your video HTML tag. This loads the entire file, so Chrome knows the size of the file, which allows seeking to work.
axios.get(`${url}`, {
    responseType: "blob"
})
    .then(function (response) {
        setVideo(URL.createObjectURL(response.data));
    })
    .catch(function (error) {
        // handle error
        console.log(error);
    });
If you wait for canplaythrough instead of loadeddata, it works.
See this codepen example.
You have three possibilities for the video tag: MP4, OGG, and WebM.
Not all formats work in all browsers.
Here, I'm thinking that your WebM file works in Firefox but not Chrome, so you should supply both MP4 and WebM alternatives by including a second source tag referring to an MP4 file.
E.g. src="data/video1.mp4" type="video/mp4"
The browser will automatically select a version it can play.
I had a similar problem. I was listening for the end event on a video and setting currentTime to the middle of the video to loop it continuously. It wasn't working in Safari or Chrome.
I think there may be a bug in Safari/Chrome where playhead position properties aren't available unless the media is currently playing.
My workaround was to start my loop just before the video end and not let it actually end.
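As a rough sketch of that workaround (an illustration, not my original code), you can watch timeupdate and jump back before the clip ever reaches its end; the 0.3 s of headroom is an arbitrary choice:
var video = document.getElementById("video1");
video.addEventListener("timeupdate", function () {
    // Jump back to the middle shortly before the end so the video never actually ends.
    if (video.duration && video.currentTime >= video.duration - 0.3) {
        video.currentTime = video.duration / 2;
        video.play();
    }
});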
Try testing yours by starting playback first and then running your function, to see if it works in Safari/Chrome.

javascript: reading id3 tags from mp3 stream

I'm writing a web radio player for Firefox OS (a mobile phone OS based on web standards). Now I want to add a feature that displays, for example, the current title, which some radio stations send and which VLC media player, for example, can display. All tested streams use MP3. I'm playing the audio via the HTML audio tag. So far I've tested https://github.com/aadsm/JavaScript-ID3-Reader and http://ericbidelman.tumblr.com/post/8343485440/reading-mp3-id3-tags-in-javascript. The JavaScript-ID3-Reader doesn't seem to be able to handle streams. The other approach never writes a log via "console.log(title);". Does anybody know a way to add this feature?
Thanks
I've found out that it is not possible this way, but there is a much easier way:
var meta = document.getElementById("audio").mozGetMetadata();
The id audio refers to an audio tag. You can access the title via meta.title. However, there is a bug in Firefox (Firefox OS, too) that keeps this from working with mp3 streams: https://bugzilla.mozilla.org/show_bug.cgi?id=908902
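For reference, a small sketch of how that call is typically used; note that mozGetMetadata() is Firefox-only, and the bug above may still prevent it from returning anything useful for mp3 streams:
var audio = document.getElementById("audio");
audio.addEventListener("loadedmetadata", function () {
    // mozGetMetadata() only exists in Firefox, so guard before calling it.
    if (typeof audio.mozGetMetadata === "function") {
        var meta = audio.mozGetMetadata();
        console.log(meta.title); // only present if the stream actually carries tags
    }
});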

HTML5 MP3 <audio> not loading correctly

Ok, my new website has just gone live, delivered through Google App Engine. On a lark, I decided to include a javascript / HTML5 Lunar Lander clone (Martian Lander) which I wrote as an exercise a while back. The game works fine when I open it locally, but when it's delivered through GAE, the sounds don't load on every system. In mobile Safari, none of them load. In Safari on the desktop, they all load reliably on my computer, but not on some other computers. In Chrome on the desktop it seems to work, but in Chrome on iOS, only one sound loads. On the desktop, it always seems to be the same sound that fails to load (explode1.mp3), which is the smallest of the sounds I'm loading. As you can see, if you click that link, the sound downloads fine from the server...
At first the problem seemed to be related to case sensitivity, so I switched the case in the filename, but that fix didn't keep working. This is a problem, as my loading bar is directly tied to how many resources have loaded, so it just sits there waiting for a GET request with no reply... Has anyone experienced anything like this, where a GET receives no reply on a specific resource, but loading the resource directly works fine?
I should say that I'm very new to most of these technologies, so it seems quite likely to me that I just made some novice mistake. Unfortunately, I'm not sure what those novice mistakes would be, seeing as I'm a novice!
Here's the code I use to load the sounds:
function loadSound(soundName) {
    var newElement = document.createElement("audio");
    newElement.addEventListener("canplaythrough", assetLoaded, false);
    document.body.appendChild(newElement);
    var audioType = supportedAudioFormat(newElement);
    if (audioType == "") {
        alert("no audio support");
        return;
    }
    newElement.setAttribute("src", "lander/sounds/" + soundName + "." + audioType);
    console.log("loading sound " + newElement.src + "...");
    return newElement;
}
and...
function assetLoaded() {
    var assetName = this.src;
    numAssetsLoaded++;
    console.log("loaded asset " + numAssetsLoaded + " (" + assetName + ")");
    if (numAssetsLoaded >= numAssetsToLoad) {
        shipSpriteSheet.removeEventListener("load", assetLoaded, false);
        pointImage.removeEventListener("load", assetLoaded, false);
        thrustAudioElement.removeEventListener("canplaythrough", assetLoaded, false);
        explosionAudioElement.removeEventListener("canplaythrough", assetLoaded, false);
        victoryAudioElement.removeEventListener("canplaythrough", assetLoaded, false);
        musicTrackElement.removeEventListener("canplaythrough", assetLoaded, false);
        gameState = GAME_STATE_INIT;
    }
}
If you take a look at the console output, you'll see that all of the sounds begin loading (particularly explode1.mp3) but don't necessarily finish and call assetLoaded...
UPDATE:
It seems the consensus is that I should not be using mp3 (incidentally, I'm already using mp3, AAC, AND Ogg, but defaulting to mp3), and also that I should use the Web Audio API. These are both welcome pieces of input, and I will make the necessary changes. However, I still don't have an answer to the original question, which is: "Why does one particular sound not load reliably on desktop while the others load with no problem?" Anybody want to take a crack at that one? Or is the answer going to be something like, "These things are highly unpredictable, and there's no way to fix it except by switching to a more dependable methodology, like the Web Audio API"?
UPDATE:
Here's an excerpt from my app.yaml file, which, I gather, helps GAE set up the server.
- url: /(.*\.(mp3|ogg|wav))
  static_files: \1
  upload: (.*\.(mp3|ogg|wav))
Some things to be aware of:
You shouldn't use MP3 for HTML5 games.
You will need to dual-encode all your sounds to both AAC (.m4a) and Ogg Vorbis (.ogg), since there is no single format that can be played everywhere.
You must ensure your server has the correct MIME types for the audio files. Some browsers will happily play audio if the server says it has the wrong MIME type; others will fail silently. For AAC and Ogg Vorbis the types are audio/mp4 and audio/ogg respectively.
Most mobile devices can only play one sound at a time, and iOS generally doesn't let you play audio unless it's in a user-initiated input event (such as touchstart).
You'll probably want to use the Web Audio API where supported (Chrome and iOS 6+) since playback is more reliable and polyphonic even on iOS - but note iOS still mutes the Web Audio API until a user input event.
This is not a direct answer to your question of why the sound is not being played, but rather a suggestion for what you should do with your game sound effects.
For game sound effects I suggest you use HTML5 Web Audio API which gives more control over how sounds are played (pitch of the sound effect, less delay in playback, etc):
http://www.html5rocks.com/en/tutorials/webaudio/intro/
iOS 6+ supports Web Audio https://developer.apple.com/technologies/ios6/
Web Audio is not supported in Firefox yet, but support is coming.
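As a rough sketch of that approach (not the original game's code; only the file path from the question is reused), loading and playing a sound effect with the Web Audio API looks roughly like this. decodeAudioData has an explicit error callback, which also makes a failed load easier to diagnose than a canplaythrough event that never fires:
var context = new (window.AudioContext || window.webkitAudioContext)();

function loadSoundBuffer(url, onDecoded) {
    var request = new XMLHttpRequest();
    request.open("GET", url, true);
    request.responseType = "arraybuffer";
    request.onload = function () {
        context.decodeAudioData(request.response, onDecoded, function () {
            console.log("decoding failed for " + url);
        });
    };
    request.send();
}

function playSound(buffer) {
    // A new source node is created for every playback; the decoded buffer is reusable.
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(0);
}

// Example: decode one of the game's sounds, then play it on the first touch,
// since iOS keeps the Web Audio context muted until a user-initiated event.
loadSoundBuffer("lander/sounds/explode1.mp3", function (buffer) {
    document.addEventListener("touchstart", function () { playSound(buffer); });
});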

Html5 Audio plays only once in my Javascript code

I have a dashboard web app that I want to play an alert sound when it's having problems connecting. The site's ajax code polls for data and throttles down its refresh rate if it can't connect. Once the server comes back up, the site continues working.
In the meantime I would like a sound to play each time it can't connect (so I know to check the server). Here is that code. This code works.
var error_audio = new Audio("audio/" + settings.refresh.error_audio);
error_audio.load();

// this gets called when there is a connection error
function onConnectionError() {
    error_audio.play();
}
However, the second time through the function the audio doesn't play. Digging around in Chrome's debugger, the 'played' attribute on the audio element gets set to true; setting it back to false has no effect. Any ideas?
I encountered this just today. After more searching, I found that you must set the src property on the audio element again to get it to restart. Don't worry: no network activity occurs, and the operation is heavily optimized.
var error_audio = new Audio("audio/" + settings.refresh.error_audio);
error_audio.load();

// this gets called when there is a connection error
function onConnectionError() {
    error_audio.src = "audio/" + settings.refresh.error_audio;
    error_audio.play();
}
This behavior occurs in Chrome 21. Firefox doesn't seem to mind setting the src twice either!
Try setting error_audio.currentTime to 0 before playing it. Maybe it doesn't automatically go back to the beginning.
You need to implement the Content-Range response headers, since Chrome requests the file in multiple parts via the Range HTTP header.
See here: HTML5 <audio> Safari live broadcast vs not
Once that has been implemented, both the play() function and setting the currentTime property should work.
Q: I've got an AudioBufferSourceNode that I just played back with noteOn(), and I want to play it again, but noteOn() doesn't do anything! Help!
A: Once a source node has finished playing back, it can’t play back more. To play back the underlying buffer again, you should create a new AudioBufferSourceNode and call noteOn().
Though re-creating the source node may feel inefficient, source nodes are heavily optimized for this pattern. Plus, if you keep a handle to the AudioBuffer, you don't need to make another request to the asset to play the same sound again. If you find yourself needing to repeat this pattern, encapsulate playback with a simple helper function like playSound(buffer).
Q: When playing back a sound, why do you need to make a new source node every time?
A: The idea of this architecture is to decouple audio asset from playback state. Taking a record player analogy, buffers are analogous to records and sources to play-heads. Because many applications involve multiple versions of the same buffer playing simultaneously, this pattern is essential.
source:
http://updates.html5rocks.com/2012/01/Web-Audio-FAQ
You need to pause the audio just before its end and change the current playing time to zero, then play it.
Javascript/Jquery to control HTML5 audio elements - check this link; it explains how to handle/control HTML5 audio elements. It may help you!
Chrome/Safari have fixed this issue in newer versions of the browser, and the above code now works as expected. I am not sure of the precise version in which it was fixed.
