How to visualize recorded audio from Blob with AudioContext?

I have successfully created an audio wave visualizer based on the MDN example here. I now want to add visualization for recorded audio as well. I record the audio using MediaRecorder and save the result as a Blob. However, I cannot find a way to connect my AudioContext to the Blob.
This is the relevant code part so far:
var audioContext = new (window.AudioContext || window.webkitAudioContext)();
var analyser = audioContext.createAnalyser();
var dataArray = new Uint8Array(analyser.frequencyBinCount);

var stream;
if (mediaStream instanceof Blob) {
  // Recorded audio - does not work
  stream = URL.createObjectURL(mediaStream);
} else {
  // Stream from the microphone - works
  stream = mediaStream;
}

var source = audioContext.createMediaStreamSource(stream);
source.connect(analyser);
mediaStream comes from either:
navigator.mediaDevices.getUserMedia({
  audio: this.audioConstraints,
  video: this.videoConstraints,
})
.then(stream => {
  mediaStream = stream;
});
or as a result of the recorded data:
mediaRecorder.addEventListener('dataavailable', event => {
  mediaChunks.push(event.data);
});
...
mediaStream = new Blob(mediaChunks, { 'type': 'video/webm' });
How do I connect the AudioContext to the recorded audio? Is it possible with a Blob? Do I need something else? What am I missing?
I've created a fiddle. The relevant part starts at line 118.
Thanks for help and suggestions.
EDIT:
Thanks to Johannes Klauß, I've found a solution.
See the updated fiddle.

You can use the Response API to create an ArrayBuffer and decode that with the audio context to create an AudioBuffer which you can connect to the analyser:
mediaRecorder.addEventListener('dataavailable', event => {
  mediaChunks.push(event.data);
});
...
const arrayBuffer = await new Response(new Blob(mediaChunks, { 'type': 'video/webm' })).arrayBuffer();
const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
const source = audioContext.createBufferSource();
source.buffer = audioBuffer;
source.connect(analyser);
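From here the decoded recording can be started and drawn with the same analyser loop as the live stream. A minimal sketch in the style of the MDN example the question is based on, assuming a <canvas id="visualizer"> element (the id is an assumption) and the analyser and dataArray from the question:
source.start(); // begin playback of the decoded recording
// analyser.connect(audioContext.destination); // uncomment to also hear it
const canvas = document.getElementById('visualizer'); // hypothetical canvas element
const canvasCtx = canvas.getContext('2d');
function draw() {
  requestAnimationFrame(draw);
  analyser.getByteTimeDomainData(dataArray); // waveform samples, 0-255
  canvasCtx.fillStyle = 'rgb(200, 200, 200)';
  canvasCtx.fillRect(0, 0, canvas.width, canvas.height);
  canvasCtx.lineWidth = 2;
  canvasCtx.strokeStyle = 'rgb(0, 0, 0)';
  canvasCtx.beginPath();
  const sliceWidth = canvas.width / dataArray.length;
  let x = 0;
  for (let i = 0; i < dataArray.length; i++) {
    const v = dataArray[i] / 128.0; // 128 is the zero line for byte data
    const y = (v * canvas.height) / 2;
    if (i === 0) {
      canvasCtx.moveTo(x, y);
    } else {
      canvasCtx.lineTo(x, y);
    }
    x += sliceWidth;
  }
  canvasCtx.lineTo(canvas.width, canvas.height / 2);
  canvasCtx.stroke();
}
draw();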

Related

(Javascript) Microphone and audio from mediastream are out of sync

I wrote a recorder that records the microphone from getUserMedia and audio that is played locally using Howler.js.
I created a MediaStream destination and connected each source (mic, audio) to that destination.
The audio seems fine, but the microphone is delayed by about 2 seconds.
I can't figure out the problem. Could you help me?
var recorder;
const stop = document.getElementsByClassName("stop");
const record = document.getElementsByClassName("record");

let mediaDest = Howler.ctx.createMediaStreamDestination();
Howler.masterGain.connect(mediaDest);

function onRecordingReady(e) {
  // 'e' is a blob event
  //var audio = document.getElementById("audio");
  audioBlob = e.data; // e.data holds the blob.
  //audio.src = URL.createObjectURL(e.data);
}

let audioBlob;
let audioURL = "";

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  let userMic = Howler.ctx.createMediaStreamSource(stream);
  userMic.connect(mediaDest);
  Howler.masterGain.connect(mediaDest);
  recorder = new MediaRecorder(mediaDest.stream);
  recorder.addEventListener("dataavailable", onRecordingReady);
  recorder.addEventListener("stop", function () {
    W3Module.convertWebmToMP3(audioBlob).then((mp3blob) => {
      const downloadLink = document.createElement("a");
      downloadLink.href = URL.createObjectURL(mp3blob);
      downloadLink.setAttribute("download", "audio");
      //downloadLink.click();
      var audio = document.getElementById("audio");
      audio.src = URL.createObjectURL(mp3blob);
      console.log(mp3blob);
    });
  });
});

record[0].addEventListener("click", function () {
  recorder.start();
});
stop[0].addEventListener("click", function () {
  recorder.stop();
});
I figured out a solution.
I didn't know I could connect a MediaStreamAudioSourceNode to a GainNode.
If someone is suffering from this issue, just connect one node to another node rather than connecting each node to the destination.
I connected the source node to the GainNode, and connected the GainNode to the destination.
=========================
It was not the solution...
The GainNode plays back in real time whenever input is present... so even if I can remove the latency, annoying playback occurs.

Streaming into <audio> element

I would like to play audio from a web socket that sends packets of sound data of unknown total length. The playback should start as soon as the first packet arrives, and it should not be interrupted by new packets.
What I have done so far:
ws.onmessage = e => {
  const soundDataBase64 = JSON.parse(e.data);
  const bytes = window.atob(soundDataBase64);
  const arrayBuffer = new window.ArrayBuffer(bytes.length);
  const bufferView = new window.Uint8Array(arrayBuffer);
  for (let i = 0; i < bytes.length; i++) {
    bufferView[i] = bytes.charCodeAt(i);
  }
  const blob = new Blob([arrayBuffer], { "type": "audio/mp3" });
  const objectURL = window.URL.createObjectURL(blob);
  const audio = document.createElement("audio");
  audio.src = objectURL;
  audio.controls = "controls";
  document.body.appendChild(audio);
};
However, to my knowledge, it is not possible to extend the size of an ArrayBuffer or a Uint8Array. I would have to create a new blob and object URL and assign it to the audio element. But I guess this would interrupt the audio playback.
On the MDN page of <audio>, there is a hint to MediaStream, which looks promising. However, I am not quite sure how to write data onto a media stream and how to connect the media stream to an audio element.
Is it currently possible with JS to write something like pipe where I can input data on one end, which is then streamed to a consumer? How would seamless streaming be achieved in JS (preferably without a lot of micro management code)?
As @Kaiido pointed out in the comments, I can use the MediaSource object. After connecting a MediaSource object to an <audio> element in the DOM, I can add a SourceBuffer to an opened MediaSource object and then append ArrayBuffers to the SourceBuffer.
Example:
const ws = new window.WebSocket(url);
ws.onmessage = _ => {
  console.log("Media source not ready yet... discard this package");
};

const mediaSource = new window.MediaSource();
const audio = document.createElement("audio");
audio.src = window.URL.createObjectURL(mediaSource);
audio.controls = true;
document.body.appendChild(audio);

mediaSource.onsourceopen = _ => {
  const sourceBuffer = mediaSource.addSourceBuffer("audio/mpeg"); // mpeg appears to not work in Firefox, unfortunately :(
  ws.onmessage = e => {
    const soundDataBase64 = JSON.parse(e.data);
    const bytes = window.atob(soundDataBase64);
    const arrayBuffer = new window.ArrayBuffer(bytes.length);
    const bufferView = new window.Uint8Array(arrayBuffer);
    for (let i = 0; i < bytes.length; i++) {
      bufferView[i] = bytes.charCodeAt(i);
    }
    sourceBuffer.appendBuffer(arrayBuffer);
  };
};
I tested this successfully in Google Chrome 94. Unfortunately, in Firefox 92, the MIME type audio/mpeg does not seem to work. There, I get the error Uncaught DOMException: MediaSource.addSourceBuffer: Type not supported in MediaSource and the warning Cannot play media. No decoders for requested formats: audio/mpeg.
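One way to avoid that hard failure is to probe codec support with MediaSource.isTypeSupported() before calling addSourceBuffer(). A minimal sketch (the candidate MIME types are illustrative, and the server must of course deliver data in whichever format is chosen):
// Inside the sourceopen handler: pick the first type the browser can demux.
const candidates = ['audio/mpeg', 'audio/webm; codecs="opus"'];
const supportedType = candidates.find(t => window.MediaSource.isTypeSupported(t));
if (supportedType) {
  const sourceBuffer = mediaSource.addSourceBuffer(supportedType);
} else {
  console.warn("None of the candidate formats is supported in this browser");
}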

Audio recording in JavaScript on Chrome, always sends video/ogg to the server

I have been trying to record audio in OGG format on Chrome and send it back to the server, but it always gets there in video/ogg format. Here is what I have:
Capturing audio:
let chunks = [];
let recording = null;
let mediaRecorder = new MediaRecorder(stream);
mediaRecorder.start();
mediaRecorder.onstop = function() {
  recording = new Blob(chunks, { 'type': 'audio/ogg; codecs=opus' });
};
mediaRecorder.ondataavailable = function(e) {
  chunks.push(e.data);
};
Sending it to the server:
let data = new FormData();
data.append('audio', recording);
jQuery.ajax(...);
The blob gets to the backend, but always in video/ogg!
I ended up using kbumsik/opus-media-recorder, which solved the issue for me. It is a drop-in replacement for MediaRecorder.
You need to remove the VideoTrack from your MediaStream:
const input = document.querySelector("video");
const stop_btn = document.querySelector("button");

input.onplaying = (evt) => {
  input.onplaying = null;
  console.clear();
  const stream = input.captureStream ? input.captureStream() : input.mozCaptureStream();
  // get all video tracks (usually a single one)
  stream.getVideoTracks().forEach((track) => {
    track.stop(); // stop that track, so the browser doesn't feed it for nothing
    stream.removeTrack(track); // remove it from the MediaStream
  });
  const data = [];
  const recorder = new MediaRecorder(stream, { mimeType: "audio/webm" });
  recorder.ondataavailable = (evt) => data.push(evt.data);
  recorder.onstop = (evt) => exportFile(new Blob(data));
  stop_btn.onclick = (evt) => recorder.stop();
  stop_btn.disabled = false;
  recorder.start();
};
console.log("play the video to start recording");

function exportFile(blob) {
  stop_btn.remove();
  input.src = URL.createObjectURL(blob);
  console.log("video element now playing the recorded file");
}
video { max-height: 150px; }
<video src="https://upload.wikimedia.org/wikipedia/commons/2/22/Volcano_Lava_Sample.webm" controls crossorigin></video>
<button disabled>stop recording</button>
And since StackOverflow's null-origined iframes don't allow for safe download links, here is a fiddle with a download link.
You need to set the mimeType of the MediaRecorder. Otherwise the browser will pick whatever format it likes best to encode the media.
let mediaRecorder = new MediaRecorder(stream, { mimeType: 'my/mimetype' });
To be sure that the browser can actually encode the format you want, you could use isTypeSupported().
console.log(MediaRecorder.isTypeSupported('my/mimetype'));
Chrome for example doesn't support "audio/ogg; codecs=opus" but supports "audio/webm; codecs=opus". Firefox supports both. Safari none of them.
Once you've configured the MediaRecorder you can use its mimeType when creating the blob.
recording = new Blob(chunks, { 'type' : mediaRecorder.mimeType });
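Putting those pieces together, a minimal sketch of probing for a supported type before recording (the candidate list is illustrative; stream, chunks, and recording are the variables from the question):
// Pick the first audio type this browser can encode.
const candidates = ['audio/ogg; codecs=opus', 'audio/webm; codecs=opus'];
const mimeType = candidates.find(t => MediaRecorder.isTypeSupported(t));
// Fall back to the browser default if nothing in the list is supported.
let mediaRecorder = mimeType
  ? new MediaRecorder(stream, { mimeType })
  : new MediaRecorder(stream);
mediaRecorder.ondataavailable = e => chunks.push(e.data);
mediaRecorder.onstop = function () {
  // The blob carries whatever type the recorder actually used.
  recording = new Blob(chunks, { 'type': mediaRecorder.mimeType });
};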

Sound won't play correctly with Web Audio

I am trying to play a wav file using AudioContext - it plays correctly when loaded with an <audio> tag (as shown in the jsfiddle), but plays incorrectly when using AudioContext.
var startButton = document.getElementById('start-stream');
var wav = new wavefile.WaveFile();

startButton.onclick = function() {
  audioCtx = new AudioContext();
  wav.fromBase64(mydata);
  buffer = audioCtx.createBuffer(1, audioCtx.sampleRate * 3, audioCtx.sampleRate);
  // add audio data to buffer
  buffer.getChannelData(0).set(wav.getSamples());
  source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start();
};
Fiddle is here: https://jsfiddle.net/Persiancoffee/6v8dLt3f/7/
The decodeAudioData() function of the Web Audio API can decode WAV files which is why you don't need any external libraries for this use case. It will produce an AudioBuffer for you.
startButton.onclick = async function () {
  audioCtx = new AudioContext();
  const arrayBuffer = Uint8Array.from(
    atob(mydata),
    (character) => character.charCodeAt(0)
  ).buffer;
  buffer = await audioCtx.decodeAudioData(arrayBuffer);
  source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start();
};
Here is a link to an updated version of your fiddle: https://jsfiddle.net/pzx0vg89/.

What's the best way to get an audio buffer into a blob that can be played by an audio element?

I have an AudioBuffer stored as a variable, and I would like to have it be played by an Audio element. Here is my current non-functioning code:
const blob = new Blob(audioBuffer.getChannelData(1), { type: "audio/wav" });
const url = window.URL.createObjectURL(blob);
audioElement.src = url;
When I try to play audioElement, I get the following error:
Uncaught (in promise) DOMException: The element has no supported sources.
Does anyone have any ideas on how to solve this? Thanks in advance!
An AudioBuffer holds raw PCM data, which is not encoded as WAV yet. If you need WAV, you should get a library to do the encoding for you, such as https://www.npmjs.com/package/audiobuffer-to-wav
After including the above library (you can just copy the audioBufferToWav function and the functions it calls below it out of index.js), pass it the whole AudioBuffer, not a single channel's data:
const blob = new Blob([audioBufferToWav(audioBuffer)], { type: "audio/wav" });
const url = window.URL.createObjectURL(blob);
audioElement.src = url;
Below, the Web Audio API is used to play back the PCM AudioBuffer directly:
var myArrayBuffer = audioBuffer;
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var source = audioCtx.createBufferSource();
source.buffer = myArrayBuffer;
source.connect(audioCtx.destination);
source.start();
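If the goal is still to drive an <audio> element without encoding to WAV, another hedged option is to route the buffer through a MediaStreamAudioDestinationNode and assign its stream to the element's srcObject. A minimal sketch, assuming the audioBuffer and audioElement from the question:
// Play the PCM buffer through Web Audio, but surface it as a MediaStream.
const ctx = new (window.AudioContext || window.webkitAudioContext)();
const bufferSource = ctx.createBufferSource();
bufferSource.buffer = audioBuffer;
const streamDest = ctx.createMediaStreamDestination();
bufferSource.connect(streamDest);
audioElement.srcObject = streamDest.stream; // no blob or object URL needed
bufferSource.start();
audioElement.play();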
