I have a web project (vanilla HTML/CSS/JS only) with three audio sources. The idea is for all three to play simultaneously, but I noticed on mobile that the files were playing out of sync (i.e. one source would start, then a few ms later the second, then the third). I believe this happens because each file starts playing as soon as it has loaded, so I would like to wait until all three files have loaded and then call the play() method on all of them at the same time.
What would be the best way to achieve this using vanilla JS?
Example: https://jacksorjacksor.xyz/soundblocks/
Repo: https://github.com/jacksorjacksor/jacksorjacksor/tree/master/soundblocks
TIA!
Rich
MediaElements are meant for normal playback of media and aren't optimized enough to get low latency. The best option is to use the Web Audio API and AudioBuffers.
First fetch each file's data into memory, then decode the audio data from it, and once all the audio data has been decoded, you'll be able to schedule all the buffers to play at the same precise moment:
(async () => {
  const urls = [ "layer1_big.mp3", "layer2_big.mp3", "layer3_big.mp3" ]
    .map( (url) => "https://cdn.jsdelivr.net/gh/jacksorjacksor/jacksorjacksor/soundblocks/audio/" + url );

  // first, fetch each file's data
  const data_buffers = await Promise.all(
    urls.map( (url) => fetch( url ).then( (res) => res.arrayBuffer() ) )
  );

  // get our AudioContext
  const context = new (window.AudioContext || window.webkitAudioContext)();

  // decode the data
  const audio_buffers = await Promise.all(
    data_buffers.map( (buf) => context.decodeAudioData( buf ) )
  );

  // to enable the AudioContext we need to handle a user gesture
  const btn = document.querySelector( "button" );
  btn.onclick = (evt) => {
    const current_time = context.currentTime;
    audio_buffers.forEach( (buf) => {
      // a buffer source is a really small object
      // don't be afraid of creating and throwing it
      const source = context.createBufferSource();
      // we only connect the decoded data, it's not copied
      source.buffer = buf;
      // in order to make some noise
      source.connect( context.destination );
      // make it loop?
      //source.loop = true;
      // start them all 0.5s after we began, so we're sure they're in sync
      source.start( current_time + 0.5 );
    } );
  };
  btn.disabled = false;
})();
<button disabled>play</button>
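One caveat worth adding to the snippet above: depending on the browser's autoplay policy (iOS Safari in particular), an AudioContext created before any user gesture may sit in the "suspended" state, and the scheduled sources will stay silent. A small guard like this makes the click handler safe (a sketch; `ensureRunning` is my own helper name, `context` is the AudioContext from the snippet):

```javascript
// make sure the AudioContext is actually running before scheduling anything;
// some browsers create contexts "suspended" until resume() happens in a gesture
async function ensureRunning(context) {
  if (context.state === "suspended") {
    await context.resume();
  }
  return context.state;
}

// inside btn.onclick, before creating the buffer sources:
// await ensureRunning(context);
```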
I want to send an audio file to a server (in my case Discord) easily, as if it were coming from the microphone.
I found this code at Send sound through microphone in javascript and modified it to try to fit my use case, but I still cannot get it to work.
navigator.mediaDevices.getUserMedia = () => {
  const audioContext = new AudioContext();
  return fetch('http://127.0.0.1:8000/enemey.ogg', {mode: 'no-cors'})
    .then((response) => response.arrayBuffer())
    .then((arrayBuffer) => audioContext.decodeAudioData(arrayBuffer))
    .then((audioBuffer) => {
      const audioBufferSourceNode = audioContext.createBufferSource();
      const mediaStreamAudioDestinationNode = audioContext.createMediaStreamDestination();
      audioBufferSourceNode.buffer = audioBuffer;
      // Maybe it makes sense to loop the buffer.
      audioBufferSourceNode.loop = true;
      audioBufferSourceNode.start();
      audioBufferSourceNode.connect(mediaStreamAudioDestinationNode);
      return mediaStreamAudioDestinationNode.stream;
    });
};
Any ideas? I cannot find a fix for this, and the error Discord reports is
[AudioActionCreators] unknown getUserMedia error: EncodingError
(All of this is done from the console, not an external program.)
I'm using ReactJS to develop a web page (.html + .js) that will be bundled in a USB drive and shipped to customers. This USB drive contains some audio (.wav) files that are played through an HTML5 audio element in the web page. Customers will open the HTML file through their browser and listen to the songs available inside the USB drive.
I used the recent Web Audio API (specifically the analyser node) to analyze the frequency data of the current playing audio and then draw a sort of visual audio spectrum on an HTML5 canvas element.
Sadly, I was using a NodeJS local webserver during development. Now that I've prepared everything for production, I've discovered that due to CORS-related restrictions my JS code can't access the audio file through the Web Audio API.
(This is because the URL protocol would be "file://", and there is no CORS policy defined for this protocol – this is the behaviour in Chrome and Firefox; in Safari it just works.)
The visual audio spectrum is an essential part of the design of this web page, and I'd hate to throw it away just because of the CORS policy.
My idea is to embed inside the JS code a JSON representation of the frequency data for the audio file, and then to use the JSON object in sync with the playing audio file to draw a fake (not in real-time) spectrum.
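Roughly, the replay side I have in mind would index the pre-baked frames by the audio element's currentTime instead of calling the analyser live (a sketch; the 30 fps sampling rate and the `frameForTime` helper are placeholders of mine):

```javascript
// pre-baked frequency frames sampled at a fixed rate (30 fps is an arbitrary
// assumption), replayed by indexing with the audio element's currentTime
const FPS = 30;

function frameForTime(frames, currentTime, fps = FPS) {
  // clamp to the last frame so seeking past the end doesn't go out of bounds
  const i = Math.min(frames.length - 1, Math.floor(currentTime * fps));
  return frames[i];
}

// in the drawSpectrum loop, instead of analyser.getByteFrequencyData(...):
// const frequencyData = frameForTime(jsonFrames, this.props.audio.currentTime);
```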
I tried – modifying the original code I was using to draw the spectrum – to use the JS requestAnimationFrame loop to get the frequency data for each frame and save it to a JSON file, but the JSON data ends up incomplete and a lot of frames are missing.
this.audioContext = new AudioContext();
// this.props.audio is a reference to the HTML5 audio element
let src = this.audioContext.createMediaElementSource(this.props.audio);
this.analyser = this.audioContext.createAnalyser();
src.connect(this.analyser);
this.analyser.connect(this.audioContext.destination);
this.analyser.smoothingTimeConstant = 0.95;
this.analyser.fftSize = 64;
this.bufferLength = this.analyser.frequencyBinCount;
this.frequencyData = new Uint8Array(this.bufferLength);

[...]

const drawSpectrum = () => {
  if (this.analyser) {
    this.analyser.getByteFrequencyData(this.frequencyData);
    /*
     * storing this.frequencyData in a JSON file here;
     * this works, but I sometimes get 26 frames per second,
     * sometimes 2 frames per second, never 60.
     */
  }
  requestAnimationFrame(drawSpectrum);
};
drawSpectrum();
Do you have a better idea to fake the visual audio spectrum? How would you go to "circumvent" the CORS-related restrictions in this case?
What could be a solid method to export audio frequency data to JSON (and then access it)?
This is one of the only cases where a data: URL comes in handy.
You can bundle your media file directly in your JS or HTML file as a base64 string and load it from there:
// a simple camera shutter sound
const audio_data = 'data:audio/mpeg;base64,SUQzAwAAAAAfdlRJVDIAAAABAAAAVFBFMQAAABsAAABTb3VuZEpheS5jb20gU291bmQgRWZmZWN0c1RBTEIAAAABAAAAVFlFUgAAAAEAAABUQ09OAAAAAQAAAFRSQ0sAAAABAAAAQ09NTQAAAB8AAABlbmcAb25saW5lLWF1ZGlvLWNvbnZlcnRlci5jb20AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAP/7UMAAAAAAAAAAAAAAAAAAAAAAAEluZm8AAAAPAAAAGQAAFTgAExMTHR0dHScnJycxMTExOzs7O0REREROTk5OWFhYWGJiYmJsbGxsdnZ2dn9/f3+JiYmJk5OTk52dnZ2np6ensbGxsbu7u7vExMTEzs7OztjY2Nji4uLi7Ozs7Pb29vb/////AAAAAExhdmM1OC41NAAAAAAAAAAAAAAAACQCTAAAAAAAABU4n9z9QQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAD/+1DEAAAGCBL/IYxCAU+HoZSWDBEABebsn6sAAxbiwgggHy5Cypw3CzikmzW+7x3t1O5z26fpqR5/X0uoJkCX/ptAn+UBAACwhBmaqlL0FMK512J0Su9FoXRfEDkmpGmT8OVLCJJ3JPz6qlPfuR3+Zn//yTjv9x8tMhkdzhHf9wf4j5M+v1+v/NtO2ktwP7Y/L9n/9fX7c6OqAAAa/lAbDBZhj
pdDVHEtOwypFSU87nTbLo6BNsyK1cUlG2Fg+Tkc2iY+sWmDptdFm2YhS4K1//tSxBsCDpzvFSekycmek+NYl5gBk1mYQpDzl9e/ZgrLGf27vk2+2UlOL7eMtmf37Z07566kMDx76S/wBCk3/A++ra9h0kks0mZ+xCnyMADI92/62qUAEBKJucwJLAaiQDMZc6Eg6iz477ruLJnEDO2QYRkU/LXkGnEFqi3MNtjn7rVUP21mWzYXpgGUDZiuRg/Zx0LEvyR+U53lTjn2d1dR7pBx9aqH5bagP/yJaZXT/RUuE2ddT7/eARoICFtQAAgGgCgIARJq08gQ7XJ7b9v/+1LECoAMAGsfNMMAAZIxqyMUoAF1OgdOIyk0ddyJoWUikvOuc8O0HZFKDJ4TvGLGjROKAS0BuaFSZ4DMUfccQXScTUMnBgiWTXcoNqmnMnBULvT22sLr2reLpyGl9ZMooooEpgAHrfdxEBdbE4D4DmLZAJQH+zHiJAVEr5xjj8MAzEB9XowjiKLA3BS/v7ERMOC2IIVf559jLiQhERk4mE3/fbfkRU1SBDiv/232V3mqjHK41Lj43////889i500hTcRWt75qnUiIEAOWVCCMP/7UsQGAAs0eV38wwABkjFtvJGKKvzojAkDYShCHBWnFxiaoek2gYcdinx7pjv7vcJvJdaVmf7afJrz/uTNNkEIBAg/Mnwq0XOuYjswwTDxd6VzzkvTQ1+sN8Tl3zFft8ocSzy7IrGYohFSSeIFvFI+oDrRuWZER2gR6m+FiT/rfhAERIYnMvuhO55rhUajF1qqzi6a/U/Y6zm8ok4uCIAMEIjHVWRhJzO7HdN9dbHNOlHnod31m7VRNOcdkf9ERk3+QGaShMlVm9y2qEQ0ABJY//tSxASAixS5b+eE1Qlaku45rAzgOfgsEFXFgOdeLACkgLkdwS8SI4DfVdsySxol94j0/jc464NyKIQj+r+XI2S+uiIcnXQCOykA9uMEsHYaHPV/0bv1jaYs68MiRhkFqv/iz91XZDu4iK1wAtFZ5dg0c8R6jjCgMKMQksoHwqwtIhq0ysPLN29V4QcZ06BsI3nn5UtHiPw+xM/Lii+tTBuPrWLBEGQbAFDZpNnxzmfr+QHWo9hMz6f8uhW/ImEEAgAEiFgkTAbDWTD4OgbKT2z/+1LECoANFJlnpLDBSXkXq36SYAC1ocQYUkwU9SfDpVheXMv4EklH8ud8HgyCUX3+Nl5tVNdYw+DSkkVbro8EKy8SL02ldwXVu/9I/jRb+/X/xlXrLNV//67T50cj6y3sif5/y6ymVJREqKOhmIAAAKhaxgAQCEIGATA0ZITIrAREZNyNMDnA1kUH8He0P2XaNF1RBO+uUVCh6P/mt/2ZbTcv15w7X/7V/Gsl6rlHlnXSSx7NbpGx/trfFh4bahikpQZ+ipPW2uctdc0di1VZkP/7UsQFAAuwu4W4kwARdRtttx6AAMZDIhmiJE2xSCpDbxgSdstgreZu8EHPhGCGGgOKwSvIA9c9zKMCtSDyi8YT3s9fKKbN9R8fUnz57/UZbWK6B1B4P9SVGD6L2/vRpMrRT/ocvLocu22WyttOJMpgEBIJuwDX+vsaTk/wyxTyRyHYZ5FBaVMyKiGa7LXkkC5m0cKdiwtLY2p+ahfiA5pP4/+vEiQVUqzN/xzz6iorCpCk/+wuGho1jRaqkYiQMA8QBnEogv7yun2/mUpG4CiC//tSxAWAC2jFZ7iUAAF3oW1XFoAAUgAM6NYUZASqI2m0oE+yNHlUWIIaFTbFg8FDhGBZEbUH/CDGHqVay/XqI6Dl+7ji37vpz3f+eu08yos/k8cv+cg1DQZUy74HioHvZ//8ZOLn++YAAtH/YAScOpSJiMeKmSIkB0IzKLA2JqTamnDwJ4gPGFZiZDsOJGn1JDmlK/YoONFhM1wxUf/WLNDB8NThr5X//eHkYXaT/w1bf//0SUe0FULOnf8PrNCUBJbLVYq5VVZCI
AAFJ20YgD3/+1LEBoAMMHVf/MMAAW8drL6YUAAGR2WREJfnpTNzJQYsnR0jd7VT3uPnzJKOaqkDk4o5wINpBV5p+hhjRRxIXLhYyLNQXQtIYwHbGE1FC8fahhcjsvgYcCY+9iqEhdiKljKFdC3xAGKarJd1VFNEAp10VhURB7Fatwej4rKTZDiWMmbZxd2FA6PLIHCjezoszKszbavRA4LO608ykURM5eiG0CYsPe1nL0/f2mYXKWIljwKHpUuq4eDwSSHTv/Sj6fybxVX//v+fzeOCMsCQQP/7UsQFgAuVg3u4koABd5etu5hgAANAnzhJFbDBQNtKIplGCYwsAwOAIDNa5A+PPO6SDRdSqJVVIudR0eQeN6LfufVgEAYTs98TIi3pWdE9/P/memrG/+c//VH67F//+//6FQe/I3bVSrKyGaAKkVQGjSIokAXHUWgjccVAjGYkiSuMDYAV+SxgVvpKmqyt36ZkXH80jlz+dbNM+Jh87HLcvPP3Nf/7P7zr6i54M9wiKPOpQ+21T0Ffyrv5IDI/3eLB1ralmZM4MzEAAABAFhQK//tSxAYACqEPYeSssUFJD+18wJrIUYbFcSw8ubqa6KCGTJlnKubutjOLuZbVJHHTu3Np8+/tE3BhCo46IWdDsymfdsruoq5HKX+0/1bR5UBgVDJK5LmpGIVUXNvjK5v3VkZUNpEEtyWg9YHsRla84bYYiUPBOho4Su2due5VzWqtRTSn00IbiCdNRpU8wHw3FozAK8qISxUIhO7/xUwMJLtVb6jzlORrmt2h931lbcWVzbdGRDIokAkrMJ4m6EsrpUJ5tjQU9Kwp2ZDldgIQUw//+1LEEAAKPPNn54zvAUcSrHzzCXjCNoDmJKPt0OUdmejrp5gFHb6KyTHmfa1mSQOn/sn/7NTHnI/+0LCUVIgAa59Dvm6/1zuOzMaIJCALDv0Ji0Np+ObAola8ZSOBaWVaJFHyxn/petTnrtnTx8o16NOEkAtClVnFN9bVoLZXWdEs976je69Z3//LVdtLkKbK0nkhvRAQ4KqMmJZoVCAALcntkmPZh7R06XTHSZGTihG+KInZxTWjX9qrAxJUqu37MarotzE2TSUqNK35jtlosv/7UsQcAAng9VvkjFLBSZcp/MKOmP+3p/2VjBnCf/7f5JdTz3dFU3lg6nOrSaiJdzQiAFJaJ32V9grDgaRQUj0LFx4SizezTa1F+t3ylVtxFELLsiqsxnLVpVKVwFZPJjux0qWcjWscD1BwDLLQaY+1cFSgINA2hyF3VnEJ/TVXmWh1VkSQklw9ezunJesI2pDdxONRPURvMwNqHyCKJon2MGd6y4pfeR1U35NAhlf3M/zEUrkq72Rpy36ou66bm6NZxIBxwWye/xj0Cr+ajBSI//tSxCkACgD1TeYMsYFAHqh8kYowdmQyAEBJKjLuKAKjhMLhpMQhsnUQF24YoVwZ7t3lUpAWDUt2XsNWceCd8Lcv+HIQNbdE0bFOr1skqb7UXX/YiwZUrkfnZs657VS5lbVqW3+shIIBTkH1Hgu7AWw5eILY4jomrWyOCza6CGBqRAy4MzEGe2ldigpYZ2g2CAwopXXu0u07fo+xos7Hf1YhiplM13vIkMC5sCqL03/ev06gjGi2ADmp34tqkE8PgpMeMtIRfMTWp+2c7T2TLXf/+1LENwAKEPE/pgxRAUMe5izBi4jd+Nc0+5MzR/mmn8NmMrbWWd1DfJ5aaxzMx72FSxRHxHDPaaGj6Zpt3or50b0iXCi3WVqVEm2jrSACcoHeX1y6qIOTR5SWBzNU6nj7mThCuuagPYetZqHGYsKtYiqttzCMdA0RUOl/pkIpbH9GU+YWtq6oitRJWSc/RzsUCgw64jYX+9+V9Ef5AAGQTJkE6flYnLKNYfCShkETVEZ2sXKKBi3Rh8cx20VtlWqvB0yhanTnCcULDwLzRDTBUf/7UsREAAo4/zemDLSJThblZMMh4NaTK2sNx0zds
dbg6GDYOCwWQ0EmItLPZvTqmf/61QQi1GQHzJ+yLuaH45PeHSw5PnofJTFaIdGXYoDdgK+yFbUHbnLVRW5aE8DxsgF5Q4e/K+1X1JRlLj1hkYNN7zSNo0ctZQlWm/Ndf////6wFalJgRguDZKRzIWw4kqvMhQExouywQNTSZVEWsyk7HqRdplHRStqjSKJFKEjVpgcmqPOY8MCJRgPrQOLlEmNdBiZcbC4jYzSv+3Z///+l//tSxE8ACdiVKWYZD0E8D+SgkyKQAVqgAZABSx4iaVFSEcXTkNyCwRRvomCyTI0JuM8vM/CCy9bY8EpqJgqJbB9Bi+D0awBB4gXoKg8FGg0KsGD5il3HRtWtn/XtX/oFV6vp7Q7JokZkusvWqBBdt0uLcoLBMEC0ARDNwdNIirUBOFektvZ4a8AywFBEoAxQRHzc4WGgcQOAo1JSUWPGka+dT//tqW1m2pUBCVVhkBOTRJD2jakvQsuwsDz6GxQAyqEHqw2FbM6ua7n7gBk1Cjn/+1LEXgAJSH0lIyTBQSgPJSSRlbA56gZ2eErlB02OQ1TjI8RAqEwSizBjBRK3Lbt+t3Ry1z8s+dTuv1AADVi8AIZ2fC1y+MJ96YzSpyh4EhCR7AWrH4mjZFtu95bSEcqVZ3FDgW1sPCnRmqzsWuwsa9DBxS5d5retejXV1d1iK9rkW9fVySoAAJOWRCr6cPT5zb1h8Kd2bqwBivjn7G1JSavR5KBDORZaSqp3hgxJCymhN8rAggsYSyJ4il1u/ku8O20uqVKf/VVyzv4ox/3BgP/7UsRxgAm0eSMmIGiBKhWkJMMI+AhpACYGA2ioQIGEiV6GwjlosMjhgTKBi8zMtXQE8zCXSVOPNqJFOLAc2UNngchBQ6qi+lN3KAjNb6y39oq5tDGU1f56pDUX1SraqgAASqpULMhkhqZjWa4x27swqGQqMqi4sJKdL90XbIj5aCRdbmpUi6fg1g/uWxy2vd0yBTay2mjLWLHsf/2M1/zrn2KRtupQMGiE4bQnJPTKbqDtehQyx8SpNJ+Sa8TUhzRf7Trv29pdsuWeUih0BrCY//tSxIMCCTCTHyYYZ8ErD6OkkY0ocFmOrSjct2hAy5W92ef2f7f9ikZn9nTVAAAZRpEgF4J2ZEgqZTfJO3Fal0aKIYWKQMRd8iiiQKgECY65p0hLEECbhI4UNQsKkXGD2Bcj0Vs3sq8t+7+3tq793WxnswAxBIcismMkzT2ZAtRLUh9hqDVfPOOz6LQEtLhVLXmwKJQmpKBjwKWrMFTDdqi+kYxE0xuNIkyRVDbmbKK8t62/hL7WHaz0NAQ9alkaTbP0Ic0iFULmU4jO2hW6+0T/+1LEloMImJUfJJhjgQ2Ro4iRmThMoocwHDDDIdPGy4jTKWb/R/Sz+t05bXNd+/71qH+xl3/Ycp6jhQICk8Hl5iS7VoSRIK6YgjIDBByJy4gewGCMxZspTn/mjcnVcx5yHUL5UZctofG+J6NG3/v2DtdIbsN5q6M7X+K+qCKd7tKeXVtZuQ/Rk/7oepkc8E/Vui+1apXyYx8bLPugNStLXmGPW0gdqNZ7Pk1L+qsZyOQ3RmdFfUiVhRGZQodh2wsp3Uv05OJln+eq7HmpHznp0v/7UsSwAgi4dR9EiGoBIIwixMKM0O0p92N5f2tbvaqmZLPPKt1y4IO1Tp9oVD8CDwHfUoNZUVaQRZcqjPXtn7/dd2fG7t4TScAjutljTFhJ5zTMiLhmilieML3o6cU8QjOfNL1D2yu1yIH2X4WPm9VjxZQ3PxSDdaNwEMKmureTdVRjuJvJB1QGBDaPwOnp6tp7R9Py895JLMgHAYkj9ajluacuSKOTjQC+dIVSdRMX6THFPZZDKt5Zr+eaGpbamsOa94BNqFIZ/nMppgoaEipa//tSxMcABvxVI0MERYFyHSGA9IxZDqVSYK0Gcl61FdCf4UWf077Rj5eumYWy72FylBdRYR3B1Ik6UTiTUclOcsAnh
M4ckWjRSRqKw/ljKq81Kk214aqTGvtSY1WiYc1jF+uzqFJv1+7VYx36uxr9XP2Wl/w1/lWch/DXU1L2NVpfqTN/tVL/hqTCgrJ/skjmAqKSKoiqTEFNRTMuMTAwqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqr/+1LE2oALZYMQQzBgSUqh4lhgjn2qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqv/7UsThg4zQ6PYGGHDJlK9bxPMMuaqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq';
const button = document.getElementById( 'btn' );
const audio_ctx = new AudioContext();

// if you wish to use a MediaElementSource node:
function initMediaElementNode() {
  const audio_el = new Audio();
  audio_el.src = audio_data;
  document.body.append( audio_el );
  audio_el.controls = true;
  const node = audio_ctx.createMediaElementSource( audio_el );
  node.connect( audio_ctx.destination );
  // to prove the data passes through the AudioContext
  const analyser = audio_ctx.createAnalyser();
  analyser.fftSize = 32;
  node.connect( analyser );
  const arr = new Uint8Array( 32 );
  audio_el.onplay = (evt) => {
    setTimeout( () => {
      analyser.getByteFrequencyData( arr );
      console.log( 'analyser data', [...arr] );
    }, 150 );
  };
}

// if you wish to use an AudioBuffer:
async function initAudioBuffer() {
  const data_buf = dataURLToArrayBuffer( audio_data );
  const audio_buf = await audio_ctx.decodeAudioData( data_buf );
  button.onclick = (evt) => {
    const source = audio_ctx.createBufferSource();
    source.buffer = audio_buf;
    source.connect( audio_ctx.destination );
    source.start( 0 );
  };
  button.textContent = "play audio buffer";
}

button.onclick = (evt) => {
  initMediaElementNode();
  initAudioBuffer();
};

function dataURLToArrayBuffer( data_url ) {
  const byte_string = atob( data_url.split( ',' )[ 1 ] );
  return Uint8Array.from(
    { length: byte_string.length },
    (_, i) => byte_string.charCodeAt( i )
  ).buffer;
}
button { vertical-align: top; }
<button id="btn">click to start</button>
I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately it isn't possible to simply pause the stream and resume with play(), since that jumps forward to the live moment.
So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later.
This works great on the local device (stream). But when I try to use MediaRecorder on the receiver's MediaStream (from pc.onaddstream), it looks like it gets some data and is able to append the buffer to the SourceBuffer. However, it does not replay; sometimes I get just one frame.
const [pc1, pc2] = localPeerConnectionLoop()

const canvasStream = canvas.captureStream(200)
videoA.srcObject = canvasStream
videoA.play()

// Note: using two MediaRecorders at the same time seems problematic
// But this one works
// stream2mediaSorce(canvasStream, videoB)
// setTimeout(videoB.play.bind(videoB), 5000)

pc1.addTransceiver(canvasStream.getTracks()[0], {
  streams: [ canvasStream ]
})

pc2.onaddstream = (evt) => {
  videoC.srcObject = evt.stream
  videoC.play()
  // Note: using two MediaRecorders at the same time seems problematic
  // THIS DOES NOT WORK
  stream2mediaSorce(evt.stream, videoD)
  setTimeout(() => videoD.play(), 2000)
}

/**
 * Turn a MediaStream into a SourceBuffer
 *
 * @param {MediaStream} stream Live stream to record
 * @param {HTMLVideoElement} videoElm Video element to play the recorded video in
 * @return {undefined}
 */
function stream2mediaSorce (stream, videoElm) {
  const RECORDER_MIME_TYPE = 'video/webm;codecs=vp9'
  const recorder = new MediaRecorder(stream, { mimeType: RECORDER_MIME_TYPE })

  const mediaSource = new MediaSource()
  videoElm.src = URL.createObjectURL(mediaSource)
  mediaSource.onsourceopen = (e) => {
    const sourceBuffer = mediaSource.addSourceBuffer(RECORDER_MIME_TYPE)

    const fr = new FileReader()
    fr.onerror = console.log
    fr.onload = ({ target }) => {
      console.log(target.result)
      sourceBuffer.appendBuffer(target.result)
    }
    recorder.ondataavailable = ({ data }) => {
      console.log(data)
      fr.readAsArrayBuffer(data)
    }
    setInterval(recorder.requestData.bind(recorder), 1000)
  }

  console.log('Recorder created')
  recorder.start()
}
Do you know why it won't play the video?
I have created a fiddle with all the necessary code to try it out; the JavaScript tab is the same code as above (the HTML is mostly irrelevant and does not need to be changed).
Some people try to reduce the latency, but I actually want to increase it to ~10 seconds, to rewatch something you did wrong in a golf swing or similar, and if possible avoid MediaRecorder altogether.
EDIT:
I found something called "playout-delay" in some RTC extension
that allows the sender to control the minimum and maximum latency from capture to render time
https://webrtc.org/experiments/rtp-hdrext/playout-delay/
How can I use it?
Will it be of any help to me?
Update: there is a new feature that will enable this, called playoutDelayHint.
We want to provide a means for JavaScript applications to set their preferences on how fast they want to render audio or video data. As fast as possible might be beneficial for applications which concentrate on real-time experience. For others, additional data buffering may provide a smoother experience in case of network issues.
Refs:
https://discourse.wicg.io/t/hint-attribute-in-webrtc-to-influence-underlying-audio-video-buffering/4038
https://bugs.chromium.org/p/webrtc/issues/detail?id=10287
Demo: https://jsfiddle.net/rvekxns5/
Though I was only able to set a max of 10s in my browser, it's up to the UA vendor to do the best it can with the resources available.
import('https://jimmy.warting.se/packages/dummycontent/canvas-clock.js')
.then(({AnalogClock}) => {
  const {canvas} = new AnalogClock(100)
  document.querySelector('canvas').replaceWith(canvas)

  const [pc1, pc2] = localPeerConnectionLoop()
  const canvasStream = canvas.captureStream(200)

  videoA.srcObject = canvasStream
  videoA.play()

  pc1.addTransceiver(canvasStream.getTracks()[0], {
    streams: [ canvasStream ]
  })

  pc2.onaddstream = (evt) => {
    videoC.srcObject = evt.stream
    videoC.play()
  }

  $dur.onchange = () => {
    pc2.getReceivers()[0].playoutDelayHint = $dur.valueAsNumber
  }
})
<!-- all the irrelevant part, that you don't need to know anything about -->
<h3 style="border-bottom: 1px solid">Original canvas</h3>
<canvas id="canvas" width="100" height="100"></canvas>
<script>
function localPeerConnectionLoop(cfg = {sdpSemantics: 'unified-plan'}) {
  const setD = (d, a, b) => Promise.all([a.setLocalDescription(d), b.setRemoteDescription(d)]);
  return [0, 1].map(() => new RTCPeerConnection(cfg)).map((pc, i, pcs) => Object.assign(pc, {
    onicecandidate: e => e.candidate && pcs[i ^ 1].addIceCandidate(e.candidate),
    onnegotiationneeded: async e => {
      try {
        await setD(await pc.createOffer(), pc, pcs[i ^ 1]);
        await setD(await pcs[i ^ 1].createAnswer(), pcs[i ^ 1], pc);
      } catch (e) {
        console.log(e);
      }
    }
  }));
}
</script>
<h3 style="border-bottom: 1px solid">Local peer (PC1)</h3>
<video id="videoA" muted width="100" height="100"></video>
<h3 style="border-bottom: 1px solid">Remote peer (PC2)</h3>
<video id="videoC" muted width="100" height="100"></video>
<label> Change playoutDelayHint
<input type="number" value="1" id="$dur">
</label>
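The demo only sets the hint on the first receiver; if the connection carries several tracks, the same hint can be applied to every receiver in one pass (a sketch; `setPlayoutDelay` is my own helper name, and since playoutDelayHint is a non-standard hint, a UA may clamp or ignore it):

```javascript
// apply a playout delay (in seconds) to every receiver of a peer connection;
// playoutDelayHint is only a hint: the browser is free to clamp or ignore it
function setPlayoutDelay(pc, seconds) {
  for (const receiver of pc.getReceivers()) {
    if ('playoutDelayHint' in receiver) {
      receiver.playoutDelayHint = seconds;
    }
  }
}

// e.g. setPlayoutDelay(pc2, 10); // ask for ~10 s of extra buffering
```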
Looking for experience working with media devices:
I'm working on recording to cache and playing back from a microphone source, in Firefox & Chrome, using HTML5.
This is what I have so far:
var constraints = {audio: true, video: false};
var promise = navigator.mediaDevices.getUserMedia(constraints);
I've been checking the official MDN documentation on getUserMedia,
but found nothing about storing the captured audio to cache.
No such question has been asked previously on Stack Overflow, so I'm wondering if it's possible.
Thank you.
You can simply use the MediaRecorder API for such a task.
In order to record only the audio from your video+audio gUM stream, you will need to create a new MediaStream from the gUM stream's audio track:
// using async for brevity
async function doit() {
  // first request both mic and camera
  const gUMStream = await navigator.mediaDevices.getUserMedia({video: true, audio: true});

  // create a new MediaStream with only the audioTrack
  const audioStream = new MediaStream(gUMStream.getAudioTracks());

  // to save recorded data
  const chunks = [];
  const recorder = new MediaRecorder(audioStream);
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.start();

  // when user decides to stop
  stop_btn.onclick = e => {
    recorder.stop();
    // kill all tracks to free the devices
    gUMStream.getTracks().forEach(t => t.stop());
    audioStream.getTracks().forEach(t => t.stop());
  };

  // export all the saved data as one Blob
  recorder.onstop = e => exportMedia(new Blob(chunks));

  // play current gUM stream
  vid.srcObject = gUMStream;
  stop_btn.disabled = false;
}

function exportMedia(blob) {
  // here blob is your recorded audio file, you can do whatever you want with it
  const aud = new Audio(URL.createObjectURL(blob));
  aud.controls = true;
  document.body.appendChild(aud);
  document.body.removeChild(vid);
}

doit()
  .then(e => console.log("recording"))
  .catch(e => {
    console.error(e);
    console.log('you may want to try from jsfiddle: https://jsfiddle.net/5s2zabb2/');
  });
<video id="vid" controls autoplay></video>
<button id="stop_btn" disabled>stop</button>
And as a fiddle since stacksnippets don't work very well with gUM...
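If you care which container/codec the recording ends up in, you can probe MediaRecorder.isTypeSupported before constructing the recorder (a sketch; `pickMimeType` is my own helper and the candidate list is an assumption, adjust it to the formats you actually target):

```javascript
// return the first MIME type the browser can record to, or '' so the
// MediaRecorder constructor falls back to the browser's default;
// the probe function is injectable so the logic can be tested off-browser
function pickMimeType(candidates, isSupported = t => MediaRecorder.isTypeSupported(t)) {
  return candidates.find(isSupported) || '';
}

// const mime = pickMimeType(['audio/webm;codecs=opus', 'audio/ogg;codecs=opus', 'audio/mp4']);
// const recorder = new MediaRecorder(audioStream, mime ? { mimeType: mime } : {});
```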
I am doing a POC, and my requirement is to implement a feature like "OK Google" or "Hey Siri" in the browser.
I am using the Chrome browser's Web Speech API. The thing I noticed is that I can't keep the recognition running continuously: it terminates automatically after a certain period of time, which I understand is for security reasons. As a hack, when the SpeechRecognition terminates, I start it again from its end event. But this is not the best way to implement such a solution: if I use two instances of the same application in different browser tabs, or another application in my browser that also uses speech recognition, neither behaves as expected. I am looking for the best approach to solve this problem.
Thanks in advance.
Since your problem is that you can't run the SpeechRecognition continuously for long periods of time, one way would be to start the SpeechRecognition only when you get some input in the mic.
This way the SR only starts when there is some input, listening for your magic_word.
If the magic_word is found, then you will be able to use the SR normally for your other tasks.
This input can be detected with the Web Audio API, which is not tied to the time restriction SR suffers from. You can feed it with a LocalMediaStream from MediaDevices.getUserMedia.
For more info on the script below, you can see this answer.
Here is how you could attach it to a SpeechRecognition:
const magic_word = ##YOUR_MAGIC_WORD##;

// initialize our SpeechRecognition object
let recognition = new webkitSpeechRecognition();
recognition.lang = 'en-US';
recognition.interimResults = false;
recognition.maxAlternatives = 1;
recognition.continuous = true;

// detect the magic word
recognition.onresult = e => {
  // extract all the transcripts
  var transcripts = [].concat.apply([], [...e.results]
    .map(res => [...res]
      .map(alt => alt.transcript)
    )
  );
  if (transcripts.some(t => t.indexOf(magic_word) > -1)) {
    // do something awesome, like starting your own command listeners
  }
  else {
    // didn't understand...
  }
}

// called when we detect silence
function stopSpeech() {
  recognition.stop();
}
// called when we detect sound
function startSpeech() {
  try { // calling it twice will throw...
    recognition.start();
  }
  catch (e) {}
}

// request a LocalMediaStream
navigator.mediaDevices.getUserMedia({audio: true})
  // add our listeners
  .then(stream => detectSilence(stream, stopSpeech, startSpeech))
  .catch(e => console.log(e.message));

function detectSilence(
  stream,
  onSoundEnd = _ => {},
  onSoundStart = _ => {},
  silence_delay = 500,
  min_decibels = -80
) {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  const streamNode = ctx.createMediaStreamSource(stream);
  streamNode.connect(analyser);
  analyser.minDecibels = min_decibels;

  const data = new Uint8Array(analyser.frequencyBinCount); // will hold our data
  let silence_start = performance.now();
  let triggered = false; // trigger only once per silence event

  function loop(time) {
    requestAnimationFrame(loop); // we'll loop every 60th of a second to check
    analyser.getByteFrequencyData(data); // get current data
    if (data.some(v => v)) { // if there is data above the given db limit
      if (triggered) {
        triggered = false;
        onSoundStart();
      }
      silence_start = time; // set it to now
    }
    if (!triggered && time - silence_start > silence_delay) {
      onSoundEnd();
      triggered = true;
    }
  }
  loop();
}
As a plunker, since neither StackSnippets nor jsfiddle's iframes will allow gUM in two versions...