Use renderedBuffer as HTML5 Audio tag - javascript

So I've used the Web Audio API to create music from code. I used an OfflineAudioContext to render the music, and its oncomplete handler looks like this:
function(e) {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var song = audioCtx.createBufferSource();
  song.buffer = e.renderedBuffer;
  song.connect(audioCtx.destination);
  song.start();
}
This plays the sound, and it works. But I would rather store it in an <audio> element, because that makes it easier to play, loop, pause, and stop, all of which I need in order to reuse the song.
Is this possible? I've been googling for days, but I can't find out how!
My idea was to use var song = new Audio() and then somehow copy e.renderedBuffer into it.

OK, so I found this code floating around: http://codedbot.com/questions-/911767/web-audio-api-output . I've created a copy here too: http://pastebin.com/rE9a1PaX .
I've managed to use this code to create and store an audio element on the fly, using all the functions provided in that link.
offaudioctx.oncomplete = function(e) {
  var buffer = e.renderedBuffer;
  var UintWave = createWaveFileData(buffer);
  var base64 = btoa(uint8ToString(UintWave));
  songsarr.push(document.createElement('audio'));
  songsarr[songsarr.length - 1].src = "data:audio/wav;base64," + base64;
  console.log("completed!");
};
It's not pretty, but it works. I'm leaving everything here in case someone finds an easier way.
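A slightly tidier variant, if you want to avoid the base64 round-trip: wrap the WAV bytes in a Blob and point the element at an object URL. This is just a sketch, assuming the same createWaveFileData helper from the pastebin above.
offaudioctx.oncomplete = function(e) {
  // createWaveFileData (from the linked pastebin) serializes the
  // rendered AudioBuffer into a Uint8Array of WAV bytes.
  var wavBytes = createWaveFileData(e.renderedBuffer);
  var blob = new Blob([wavBytes], { type: 'audio/wav' });
  var song = document.createElement('audio');
  song.src = URL.createObjectURL(blob); // no base64 encoding needed
  song.loop = true; // <audio> gives you loop/pause/stop for free
  songsarr.push(song);
};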

Related

Web Audio API: createMediaStreamDestination().stream - no sound

I'm stuck on a problem in which, whenever I pass the stream from createMediaStreamDestination to an audio element's srcObject, no audio is played. My implementation is based on the answer posted here: Combine setSinkId with stereoPanner?
Initially, I have an audio element whose sound I isolate so that it plays only from the left speaker:
const audio = document.createElement('audio');
audio.src = audioUrl;
let audioContext = new AudioContext();
let source = audioContext.createMediaElementSource(audio);
let panner = audioContext.createStereoPanner();
let destination = audioContext.destination;
panner.pan.value = -1;
source.connect(panner).connect(destination);
The above plays sound fine when I add audio.play(), but I want to be able to choose specifically which speakers the audio plays out of while keeping the panner changes. Since AudioContext doesn't yet offer a way to set the sinkId, I created a new audio element and a MediaStreamAudioDestinationNode, and passed its stream into the element's srcObject:
const audio = document.createElement('audio');
audio.src = audioUrl;
let audioContext = new AudioContext();
let source = audioContext.createMediaElementSource(audio);
let panner = audioContext.createStereoPanner();
let destination = audioContext.createMediaStreamDestination();
panner.pan.value = -1;
source.connect(panner).connect(destination);
const outputAudio = new Audio();
outputAudio.srcObject = destination.stream;
outputAudio.setSinkId(audioSpeakerId);
outputAudio.play();
With the new code, however, when I start up my application, outputAudio doesn't play any sound at all. Is there anything wrong with my code that is causing the outputAudio element not to play sound? I'm fairly new to the Web Audio API, and I tried implementing the code from the mentioned Stack Overflow thread, but it doesn't seem to be working for me. Any help would be appreciated!
In the description of your first code block you mention that you also call audio.play() to start the audio. That's necessary for the second code block to work as well. You need to start both audio elements.
Generally calling play() on an audio element and creating a new AudioContext should ideally happen in response to a user action to make sure the browser's autoplay policy doesn't block the audio.
If all goes well the state of your AudioContext should be "running".
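Here is a minimal sketch of the fixed version, with everything started from a click handler so the autoplay policy doesn't get in the way. The #start button id, audioUrl, and audioSpeakerId are assumptions based on your description.
document.querySelector('#start').addEventListener('click', async () => {
  const audio = document.createElement('audio');
  audio.src = audioUrl;
  const audioContext = new AudioContext();
  const source = audioContext.createMediaElementSource(audio);
  const panner = audioContext.createStereoPanner();
  const destination = audioContext.createMediaStreamDestination();
  panner.pan.value = -1;
  source.connect(panner).connect(destination);
  const outputAudio = new Audio();
  outputAudio.srcObject = destination.stream;
  await outputAudio.setSinkId(audioSpeakerId); // setSinkId returns a promise
  audio.play();       // start the element feeding the graph
  outputAudio.play(); // start the element that reaches the speakers
});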

JS Audio Not Working

One of the sound files won't play. The following two pieces of code are identical except for the file name.
This doesn't work:
var rewardSound = new Audio("audio/WrongAnswerSound.wav");
function rightAnswer(){
rewardSound.play();
}
However this works fine:
var rewardSound = new Audio("audio/CorrectAnswerSound.wav");
function rightAnswer(){
rewardSound.play();
}
The screenshot is from the File Manager in cPanel. I can play both sounds from the File Manager itself, but I can't play WrongAnswerSound.wav from the JS code. What am I doing wrong?
You kind of have the right idea.
Set a variable for the correct sound by creating a new Audio object:
var correctSound = new Audio("audio/CorrectAnswerSound.wav");
Set a variable for the wrong sound by creating another new Audio object:
var wrongSound = new Audio("audio/WrongAnswerSound.wav");
Both of these new objects already have a play method that they inherit from Audio. So all you have to do to get these sounds to play is this:
correctSound.play();
wrongSound.play();
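Putting it together, a sketch of the whole thing (the wrongAnswer handler name is assumed):
var correctSound = new Audio("audio/CorrectAnswerSound.wav");
var wrongSound = new Audio("audio/WrongAnswerSound.wav");

function rightAnswer() {
  correctSound.currentTime = 0; // rewind in case it's still playing
  correctSound.play();
}

function wrongAnswer() {
  wrongSound.currentTime = 0;
  wrongSound.play();
}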

Clone Audio object

I am aware of how to clone an object, but I'm wondering, how do I clone an audio object? Should I clone it differently than I would clone an object?
To "illustrate" what I mean:
var audio = new Audio("file.mp3");
var audio2 = $.extend({}, audio); // Clones `audio`
Is this the correct way to do this?
Reason why I'm asking this is that I want to be able to play the same sound multiple times simultaneously.
I had exactly the same predicament as originally raised. The following worked perfectly well for me:
var audio2 = audio.cloneNode();
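For example, to fire the same sound repeatedly so the playbacks overlap (a short sketch):
var audio = new Audio("file.mp3");

function playOverlapping() {
  // cloneNode() copies the element and its src; the browser serves the
  // clone from cache, and each clone has its own playback position.
  audio.cloneNode().play();
}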
This question is ancient in JavaScript years. I think your code is (or was) downloading the audio again each time, and that was the cause of your delay. If you grab the audio file once and store it in a blob, you can then use that blob as the source for new audio events.
let fileBlob;
fetch("file.mp3")
  .then(function(response) { return response.blob(); })
  .then(function(blob) {
    fileBlob = URL.createObjectURL(blob);
    new Audio(fileBlob); // forces a request for the blob
  });
...
new Audio(fileBlob).play() // fetches the audio file from the blob.
You can also do a lot of stuff with the web audio api, but it has a slightly steeper learning curve.
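If you do go the Web Audio route, the usual pattern is to decode the file once into an AudioBuffer and spin up a cheap one-shot source node per playback. A sketch:
const ctx = new AudioContext();
let buffer;

fetch("file.mp3")
  .then(res => res.arrayBuffer())
  .then(data => ctx.decodeAudioData(data))
  .then(decoded => { buffer = decoded; });

function playSound() {
  // AudioBufferSourceNodes are single-use and cheap to create,
  // so simultaneous playback is just one new source per call.
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start();
}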
Two more options, if you already have the audio data as a Blob (audio_1 below):
let audio_2 = audio_1.slice();
or
let audio_2 = new Blob([audio_1]);

chrome audio analyzer breaking on audio switch

I'm creating an audio visualizer with WebGL and have been integrating SoundCloud tracks into it. I now want to be able to switch tracks, but I can either get my visualizer to work and the audio to break, or I can get the audio to work and the visualizer to break.
The two ways that I've been able to make it work are:
Audio working:
  delete audio element
  append new audio element to body
  trigger play
Visualizer working:
  stop audio
  change source
  trigger play
When I have the visualizer working, the audio is totally messed up. The buffers just sound wrong, and the audio has artifacts in it (noise, beeps and bloops).
When I have the audio working, when I call analyser.getByteFrequencyData, I get an array of 0's. I presume this is because the analyser is not hooked up correctly.
The code for the audio working looks like
$('#music').trigger("pause");
currentTrackNum = currentTrackNum + 1;
var tracks = $("#tracks").data("tracks")
var currentTrack = tracks[parseInt(currentTrackNum)%tracks.length];
// Begin audio switching
analyser.disconnect();
$('#music').remove();
$('body').append('<audio id="music" preload="auto" src="'+ currentTrack["download"].toString() + '?client_id=4c6187aeda01c8ad86e556555621074f"></audio>');
startWebAudio(),
(I don't think I need the pause call. Do I?)
When I want the visualizer to work, I use this code:
currentTrackNum = currentTrackNum + 1;
var tracks = $("#tracks").data("tracks");
var currentTrack = tracks[parseInt(currentTrackNum) % tracks.length];
// Begin audio switching
$("#music").attr("src", currentTrack["download"].toString() + "?client_id=4c6187aeda01c8ad86e556555621074f");
$("#songTitle").text(currentTrack["title"]);
$('#music').trigger("play");
The startWebAudio function looks like this.
function startWebAudio() {
  // Get our <audio> element
  var audio = document.getElementById('music');
  // Create a new audio context (that allows us to do all the Web Audio stuff)
  var audioContext = new webkitAudioContext();
  // Create a new analyser
  analyser = audioContext.createAnalyser();
  // Create a new audio source from the <audio> element
  var source = audioContext.createMediaElementSource(audio);
  // Connect up the output from the audio source to the input of the analyser
  source.connect(analyser);
  // Connect the analyser's output to the audioContext destination, i.e. the speakers.
  // (The analyser takes the output of the <audio> element and swallows it; if we want
  // to hear the <audio> element we need to re-route the analyser's output to the speakers.)
  analyser.connect(audioContext.destination);
  // Get the <audio> element started
  audio.play();
  var freqByteData = new Uint8Array(analyser.frequencyBinCount);
}
My suspicion is that the analyser isn't hooked up correctly, but I can't figure out what to look at. I have looked at the frequencyByteData output, and it seems to indicate that something isn't hooked up right. The analyser variable is global. If you would like more reference to the code, here's where it is on github
You can only create a single AudioContext per window. You should also be disconnecting the MediaElementSource when you're finished using it.
Here's an example that I used to answer a similar question: http://jsbin.com/acolet/1/
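In practice that means creating the context, analyser, and MediaElementSource once, and only swapping the element's src when switching tracks. A sketch based on the element names in your code (initWebAudio and switchTrack are assumed names):
var audioContext, analyser, source;

function initWebAudio() {
  // Run once: createMediaElementSource can only be called once per
  // element, and contexts are a limited resource.
  var audio = document.getElementById('music');
  audioContext = new (window.AudioContext || window.webkitAudioContext)();
  analyser = audioContext.createAnalyser();
  source = audioContext.createMediaElementSource(audio);
  source.connect(analyser);
  analyser.connect(audioContext.destination);
}

function switchTrack(url) {
  var audio = document.getElementById('music');
  audio.pause();
  audio.src = url; // keep the same element so the existing graph stays wired up
  audio.play();
}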

Javascript Audio Player - Adding .ogg to .mp3 array

I have my JavaScript audio player working with .mp3s, but I'm not sure how to add a second audio format (.ogg) so the files will also play in Firefox. Can anyone help with this? Here is the array code:
var urls = new Array();
urls[0] = 'audio/song1.mp3';
urls[1] = 'audio/song2.mp3';
urls[2] = 'audio/song3.mp3';
urls[3] = 'audio/song4.mp3';
var next = 0;
The easiest way to play sounds is with SoundManager 2, which uses Flash and HTML5 when available.
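If you'd rather stay with plain HTML5 audio, you can feature-detect the supported format once with canPlayType and build the list with the right extension. A sketch, assuming each song exists as both .mp3 and .ogg:
var probe = document.createElement('audio');
// canPlayType returns "", "maybe", or "probably"; non-empty means playable.
var ext = probe.canPlayType('audio/mpeg') ? '.mp3' : '.ogg';

var urls = [
  'audio/song1' + ext,
  'audio/song2' + ext,
  'audio/song3' + ext,
  'audio/song4' + ext
];
var next = 0;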
