WAAPISim Polyfill - how to load .mp3 with .wav fallback - javascript

I'm using the WAAPISim polyfill for cross-browser support in a Web Audio API visualization driven by an audio file. The polyfill attempts to use the following backends in this order: "WebAudioAPI => AudioDataAPI => Flash". I am loading the audio file like this in JS:
// load the specified sound
function loadSound(url) {
  var request = new XMLHttpRequest();
  request.open('GET', url, true);
  request.responseType = 'arraybuffer';
  // when loaded, decode the data
  request.onload = function() {
    // "context" is the AudioContext created elsewhere in the page;
    // "onError" is an error callback defined elsewhere
    context.decodeAudioData(request.response, function(buffer) {
      // when the audio is decoded, play the sound
      playSound(buffer);
    }, onError);
  };
  request.send();
}

loadSound("audio/bird.wav");
As noted in the polyfill's documentation, this polyfill only supports wav format for this method. "createBuffer from ArrayBuffer and decodeAudioData supports only wav format."
Right now, it is only loading a .wav, but I'd like to load a .mp3 (smaller file) instead for browsers that will support it. How can I detect whether the implementation will work with a .mp3 and load the right file accordingly?

If:
(new Audio()).canPlayType("audio/mp3")
returns "maybe" or "probably", then the browser supports mp3 in decodeAudioData.

I got this response from the polyfill developer:
if (typeof window.waapisimContexts != 'undefined') {
  loadSound("audio/bird.wav");
} else {
  loadSound("audio/bird.mp3");
}
The polyfill creates waapisimContexts only when the browser actually needs the polyfill, and in that case only .wav can be decoded. So this approach checks for waapisimContexts and loads the .wav if it is defined; otherwise it loads the .mp3.

Related

Is it possible to fetch parts of an audio file and load them into an AudioBufferSourceNode?

I'm trying to optimize the loading times of audio files in a project where we need to use an AudioBufferSourceNode, which requires the whole audio buffer to be loaded.
Could I load, say, the first 10 minutes of audio first, play that while the rest downloads in the background, and later create another source node loaded with the second part of the file?
My current implementation loads all of the audio first, which isn't great because it takes time. My files are 60-70 MB.
// "audioCtx" is an AudioContext and "source" a global, both created elsewhere
function getData() {
  source = audioCtx.createBufferSource();
  var request = new XMLHttpRequest();
  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';
  request.onload = function() {
    var audioData = request.response;
    audioCtx.decodeAudioData(audioData, function(buffer) {
      source.buffer = buffer;
      source.connect(audioCtx.destination);
      source.loop = true;
    },
    function(e) { console.log("Error with decoding audio data: " + e.message); });
  };
  request.send();
}
I think you can achieve what you want by using the WebCodecs API (which is currently only available in Chrome), but it requires some plumbing.
To get the file as a stream, you could use fetch() instead of XMLHttpRequest.
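For example, the response body can be read incrementally (a sketch; feedDemuxer is a hypothetical hook into whatever demuxer you end up using):

async function streamFile(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  for (;;) {
    // each chunk arrives as a Uint8Array as soon as it has downloaded
    const { done, value } = await reader.read();
    if (done) break;
    feedDemuxer(value); // hypothetical: hand the bytes to your demuxer
  }
}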
Then you would need to demux the file to extract the encoded audio and decode it with an AudioDecoder. With a bit of luck it will output AudioData objects, which can be used to get the raw sample data, which can in turn be used to create an AudioBuffer.
There are not many WebCodecs examples available yet. I think the example which shows how to decode an MP4 is the closest to your use case so far.
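A rough sketch of that decoding step follows. This is hedged: the demuxing itself is omitted (it depends on the container, e.g. a library such as mp4box.js), and the codec string, sample rate, and channel count are placeholder assumptions you would read from the file's metadata.

// Sketch: assumes the file has already been demuxed into EncodedAudioChunk
// objects and that the stream parameters below were read from its metadata.
const decoder = new AudioDecoder({
  output: (audioData) => {
    // copy each plane of the decoded AudioData into an AudioBuffer
    const buffer = new AudioBuffer({
      numberOfChannels: audioData.numberOfChannels,
      length: audioData.numberOfFrames,
      sampleRate: audioData.sampleRate,
    });
    for (let ch = 0; ch < audioData.numberOfChannels; ch++) {
      audioData.copyTo(buffer.getChannelData(ch), {
        planeIndex: ch,
        format: 'f32-planar',
      });
    }
    audioData.close();
    // "buffer" can now be handed to an AudioBufferSourceNode
  },
  error: (e) => console.error(e),
});

decoder.configure({
  codec: 'mp4a.40.2',   // assumption: AAC-LC; read the real value from the demuxer
  sampleRate: 44100,    // assumption
  numberOfChannels: 2,  // assumption
});

// for each chunk the demuxer produces:
//   decoder.decode(encodedAudioChunk);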

Opening a base URL with XMLHttpRequest?

I'm using XMLHttpRequest to load an audio file, and the Web Audio API to get the frequency data of that audio for a visualizer. I am new to the concept, but what I've learned is that the second parameter "url" in XMLHttpRequest's open method must be in the same domain as the current document, i.e. a relative URL (e.g. audio-folder/music.mp3).
I want to open an audio file from a third-party website, outside of the database (https://c1.rbxcdn.com/36362fe5a1eab0c46a9b23cf4b54889e), but this, of course, returns an error.
I assume the way to do this is to save the audio from the external URL into the database so the XMLHttpRequest can be sent, and then remove it once the audio's frequency data has been calculated. But how exactly can I do this? I'm not sure where to start, or how efficient it is, so if you have advice then I'd love to hear it.
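One way to get the file behind your own origin, without the save-and-delete step, is to relay it through your own server instead. A minimal Node.js sketch (the /proxy-audio route and the port are made up for illustration):

// Minimal Node.js relay (sketch). The browser requests /proxy-audio from
// your own origin; the server fetches the third-party file and streams it
// back, so the XHR never reads a cross-origin URL directly.
var http = require('http');
var https = require('https');

var REMOTE = 'https://c1.rbxcdn.com/36362fe5a1eab0c46a9b23cf4b54889e';

http.createServer(function(req, res) {
  if (req.url === '/proxy-audio') {
    https.get(REMOTE, function(remote) {
      res.writeHead(remote.statusCode, {
        'Content-Type': remote.headers['content-type'] || 'audio/mpeg'
      });
      remote.pipe(res); // stream the remote body straight through
    }).on('error', function() {
      res.writeHead(502);
      res.end();
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);

The page would then call request.open('GET', '/proxy-audio', true) instead of the remote URL.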
Here is the code I'm working with.
function playSample() {
  var request = new XMLHttpRequest();
  request.open('GET', 'example.mp3', true);
  request.responseType = 'arraybuffer';
  // when loaded, decode the data
  request.onload = function() {
    // "setupAudioNodes", "context", "sourceNode" and "updateVisualization"
    // are defined elsewhere in the page
    setupAudioNodes();
    context.decodeAudioData(request.response, function(buffer) {
      // when the audio is decoded, play the sound
      sourceNode.buffer = buffer;
      sourceNode.start(0);
      rafID = window.requestAnimationFrame(updateVisualization);
    }, function(e) {
      console.log(e);
    });
  };
  request.send();
}

Safari 9 XMLHttpRequest Blob file download

Hello JavaScript gurus,
I need file download functionality using XMLHttpRequest (with responseType = "blob") that works in Safari 9+.
At the moment I'm using FileSaver.js like this:
var xhr = new XMLHttpRequest();
xhr.open('GET', url, true);
xhr.responseType = 'blob';
xhr.onreadystatechange = function() {
  if (xhr.readyState == 4) {
    // using FileSaver.js to save the blob
    saveAs(xhr.response, filename);
    // notify that the download finished; "defer" is a deferred object
    // created elsewhere, resolved here to settle the promise
    defer.resolve(true);
  }
};
xhr.send();
which works fine in all major browsers but not in the current version (9.x) of Safari.
There I get "Failed to load resource: Frame load interrupted". Usually the download is a zip file, but I also tried setting "application/octet-stream".
I have one requirement: I need to know on the client side when the download has finished, so using an iframe is not an option (I guess).
I'm thankful for any hint on how to download a file in Safari using XHR (no Flash).
Thanks,
Chris
Simple answer:
There is no solution!
See also: https://forums.developer.apple.com/message/119222
Thanks Safari ... my new almost IE6

Play audio stream with JavaScript API

I have a web server that streams wav audio, and I would like to play it in a web browser using the JavaScript Web Audio API.
Here is my code:
function start() {
  var request = new XMLHttpRequest();
  request.open("GET", url, true); // "url" points at the wav stream
  request.responseType = "arraybuffer"; // read as binary data
  request.onload = function() {
    var data = request.response;
    playSound(data);
  };
  request.send();
}
The problem here is that onload won't be called until the data has completely loaded, which is not very convenient for streaming.
So I looked for another event and found onprogress, but the problem is that request.response returns null if the data is not completely loaded.
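A minimal illustration of that limitation (a sketch; "url" is the stream address from above):

var request = new XMLHttpRequest();
request.open("GET", url, true);
request.responseType = "arraybuffer";
request.onprogress = function() {
  // with responseType "arraybuffer" the partial body is never exposed
  console.log(request.response); // null until the request completes
};
request.onload = function() {
  console.log(request.response); // the complete ArrayBuffer
};
request.send();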
What's the correct way to play an audio stream with JavaScript?
Thanks for your help.
Lukas

Fetching BLOB in Chrome Android

I'm struggling to fetch an HTML5 video using xhr2 and the blob responseType with Chrome on Android 4.2. The code works perfectly in Chrome and Firefox on desktop and in Firefox on Android 4.2 (with desktop Firefox, I use a webm video instead of the mp4).
// Taking care of the prefix
window.URL = window.URL || window.webkitURL;

// This function downloads the video
var loadVideo = function() {
  var xhr = new XMLHttpRequest();
  xhr.addEventListener('load', addVideoFile, false);
  xhr.open('GET', "videos/myvideo.mp4", true);
  xhr.responseType = 'blob';
  xhr.send();
};

// This function sets the video source
var addVideoFile = function() {
  if (4 == this.readyState && 200 == this.status) {
    var video = document.getElementById('vid'),
        blob = this.response;
    video.src = window.URL.createObjectURL(blob);
    console.log('video ready');
  }
};

loadVideo();
Can anyone explain why this does not work with Chrome on Android? If I plug in my phone and use remote debugging, the console displays 'video ready', suggesting that the video was downloaded, but it's impossible to play it: the video is just a black screen.
Also, this code works if I use it to fetch images instead of videos. Is there a limitation I'm not aware of that prevents downloading a Blob above a certain size? (My video is 1.5 MB.)
Thank you very much for your help!
This is most certainly a bug. If you get something that works on desktop Chrome but not on Android, then 99.5% of the time it is an issue we need to fix.
I have replicated your issue (http://jsbin.com/uyehun/1) and filed the bug too: https://code.google.com/p/chromium/issues/detail?id=253465
Per http://caniuse.com/bloburls, for Android 4.0-4.3 you need to use window.webkitURL.createObjectURL() instead of window.URL.createObjectURL().
This will let you generate a blob URL, though I haven't actually been able to get a video element to play such a URL.
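In code, that can be written as a feature test rather than hard-coding either name (a sketch; "video" and "blob" are the variables from the question's code):

// fall back to the prefixed API on older Android WebKit builds
var urlAPI = window.URL || window.webkitURL;
video.src = urlAPI.createObjectURL(blob);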
