For a side project I am using the following JS plugin to draw a spectrogram of an audio file in the browser:
https://www.npmjs.com/package/spectrogram
var Spectrogram = require('spectrogram');

var spectro = Spectrogram(document.getElementById('canvas'), {
  audio: {
    enable: false
  }
});
var audioContext = new AudioContext();
var request = new XMLHttpRequest();
request.open('GET', 'audio.mp3', true);
request.responseType = 'arraybuffer';
request.onload = function() {
  audioContext.decodeAudioData(request.response, function(buffer) {
    spectro.addSource(buffer, audioContext);
    spectro.start();
  });
};
request.send();
(A demo is available here: https://lab.miguelmota.com/spectrogram/example/)
However, the current code and examples draw the spectrogram "line by line" as the sound plays through the buffer.
I wonder if there's a way to read the whole file first, then draw the entire spectrogram at once. I tried converting the ArrayBuffer to a Blob, but the rest of the functions expect the sound to be buffered...
So my question is: is there a way to achieve what I'm looking for (draw the full spectrogram at once instead of buffering the audio and drawing it "on the go")?
Or is there an easier way to achieve the same result?
Thank you for your time and help.
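Since the decoded AudioBuffer already contains every sample, one option is to bypass the plugin's real-time path entirely: slice the channel data into frames yourself, compute a magnitude spectrum per frame, and paint all columns in one pass. Below is a minimal sketch of that idea using a naive DFT for clarity; `frameMagnitudes` and `fullSpectrogram` are hypothetical helpers, not part of the spectrogram package, and a real implementation would use an FFT and a window function.

```javascript
// Hypothetical helper: compute one spectrogram column from a
// Float32Array frame using a naive DFT. No Web Audio calls needed,
// so the same logic works on a fully decoded AudioBuffer's
// channel data (buffer.getChannelData(0)).
function frameMagnitudes(frame) {
  const N = frame.length;
  const mags = new Float32Array(N / 2);
  for (let k = 0; k < N / 2; k++) {
    let re = 0, im = 0;
    for (let n = 0; n < N; n++) {
      const phi = (-2 * Math.PI * k * n) / N;
      re += frame[n] * Math.cos(phi);
      im += frame[n] * Math.sin(phi);
    }
    mags[k] = Math.hypot(re, im);
  }
  return mags;
}

// Slice the decoded samples into hops and compute every column up
// front, then paint them all at once instead of while the audio plays.
function fullSpectrogram(samples, frameSize, hop) {
  const columns = [];
  for (let pos = 0; pos + frameSize <= samples.length; pos += hop) {
    columns.push(frameMagnitudes(samples.subarray(pos, pos + frameSize)));
  }
  return columns;
}
```

Each entry of `columns` is then one vertical stripe of pixels on the canvas, so the whole image can be drawn in a single loop after decoding.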
Related
I'm trying to optimize the loading times of audio files in a project where we need to use AudioBufferSourceNode, which requires the audio buffer to be fully loaded.
Would it be possible to load, say, the first 10 minutes of audio first, play it while the rest downloads in the background, and later create another source node loaded with the second part of the file?
My current implementation loads all of the audio first, which isn't great because it takes time; my files are 60–70 MB.
function getData() {
  source = audioCtx.createBufferSource();
  var request = new XMLHttpRequest();
  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';
  request.onload = function() {
    var audioData = request.response;
    audioCtx.decodeAudioData(audioData, function(buffer) {
      source.buffer = buffer;
      source.connect(audioCtx.destination);
      source.loop = true;
    },
    function(e) { console.log("Error with decoding audio data" + e.err); });
  };
  request.send();
}
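Loading in parts is possible in principle: fetch the first chunk (e.g. with an HTTP Range request), decode it, start playing, and decode later chunks into their own AudioBufferSourceNodes scheduled to start exactly where the previous one ends. One fiddly part is the timing arithmetic, sketched below; `chunkStartTimes` is a hypothetical helper, and whether arbitrary byte ranges decode cleanly depends on the codec, since decodeAudioData expects each chunk to be a valid file on its own.

```javascript
// Hypothetical helper: given the durations (in seconds) of already
// decoded chunks, compute the AudioContext time at which each
// AudioBufferSourceNode should be start()ed so playback is gapless.
function chunkStartTimes(durations, firstStart) {
  const starts = [];
  let t = firstStart;
  for (const d of durations) {
    starts.push(t);
    t += d;
  }
  return starts;
}

// Browser-side usage would look roughly like this (not runnable here):
// const t0 = audioCtx.currentTime + 0.1;
// const starts = chunkStartTimes(buffers.map(b => b.duration), t0);
// buffers.forEach((buffer, i) => {
//   const src = audioCtx.createBufferSource();
//   src.buffer = buffer;
//   src.connect(audioCtx.destination);
//   src.start(starts[i]);
// });
```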
I think you can achieve what you want by using the WebCodecs API (which is currently only available in Chrome), but it requires some plumbing.
To get the file as a stream you could use fetch() instead of XMLHttpRequest.
Then you would need to demux the encoded file to get the raw audio data to decode it with an AudioDecoder. With a bit of luck it will output AudioData objects. These objects can be used to get the raw sample data which can then be used to create an AudioBuffer.
There are not many WebCodecs examples available yet. I think the example which shows how to decode an MP4 is the most similar to your use case available so far.
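The fetch() part of that pipeline can be sketched without WebCodecs: read the response body as a stream and hand each byte chunk onward as it arrives. `readChunks` is an assumed helper name, and the demuxer it feeds is the piece you would still need to supply.

```javascript
// Read a ReadableStream (e.g. response.body from fetch) chunk by
// chunk instead of waiting for one big ArrayBuffer. Each chunk is a
// Uint8Array; the callback is where a demuxer would consume it.
async function readChunks(stream, onChunk) {
  const reader = stream.getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.byteLength;
    onChunk(value); // e.g. hand the bytes to the demuxer here
  }
  return total;
}

// Browser usage (URL is an assumption):
// const response = await fetch('audio.mp4');
// await readChunks(response.body, chunk => demuxer.push(chunk));
```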
I am making a video editing tool where the user loads a local video into the application and edits it. For this I have to extract the audio from the local file.
Currently I load the video file through an XMLHttpRequest, which gives an ArrayBuffer as output. From this ArrayBuffer, using decodeAudioData on the AudioContext object, I get an AudioBuffer, which is used to paint the canvas.
let audioContext = new (window.AudioContext || window.webkitAudioContext)();
var req = new XMLHttpRequest();
req.open('GET', this.props.videoFileURL, true);
req.responseType = 'arraybuffer';
req.onload = e => {
  audioContext.decodeAudioData(
    req.response,
    buffer => {
      this.currentBuffer = buffer;
      this.props.setAudioBuffer(buffer);
      requestAnimationFrame(this.updateCanvas);
    },
    this.onDecodeError
  );
  console.log(req.response);
};
req.send();
This works for most mp4 files, but I get a decode error when I test with MPEG-1/2 encoded video files.
Edit 1:
I understand this is a demuxing issue, but I am not able to find a demuxer for MPEG-1.
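One thing that can be done in the meantime is to detect the container up front and fail with a clear message instead of waiting for decodeAudioData's generic error. An MPEG program stream (the container used by MPEG-1/2 video files) begins with the pack start code 0x00 0x00 0x01 0xBA; `isMpegProgramStream` below is a hypothetical helper built on that fact.

```javascript
// Sniff the first four bytes of the loaded ArrayBuffer: MPEG program
// streams start with the pack start code 00 00 01 BA, a container
// decodeAudioData has no demuxer for.
function isMpegProgramStream(arrayBuffer) {
  if (arrayBuffer.byteLength < 4) return false;
  const b = new Uint8Array(arrayBuffer, 0, 4);
  return b[0] === 0x00 && b[1] === 0x00 && b[2] === 0x01 && b[3] === 0xba;
}

// In the onload handler one could then do, for example:
// if (isMpegProgramStream(req.response)) {
//   this.onDecodeError(new Error('MPEG-1/2 container not supported'));
// }
```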
I'm using XMLHttpRequest to load an audio file, and the Web Audio API to get its frequency data for a visualizer. I am new to the concept, but what I've learned is that the URL passed to XMLHttpRequest's open method must be in the same domain as the current document, i.e. a relative URL (e.g. audio-folder/music.mp3).
I want to open an audio file from a third-party website outside of the database (https://c1.rbxcdn.com/36362fe5a1eab0c46a9b23cf4b54889e), but of course this returns an error.
I assume the way to do it is to save the audio from the source URL into the database so the XMLHttpRequest can be sent, then remove it once the audio's frequency data has been calculated. But how exactly can I do this? I'm not sure where to start, or how efficient this is, so if you have advice I'd love to hear it.
Here is the code I'm working with.
function playSample() {
  var request = new XMLHttpRequest();
  request.open('GET', 'example.mp3', true);
  request.responseType = 'arraybuffer';

  // When loaded, decode the data
  request.onload = function() {
    setupAudioNodes();
    context.decodeAudioData(request.response, function(buffer) {
      // When the audio is decoded, play the sound
      sourceNode.buffer = buffer;
      sourceNode.start(0);
      rafID = window.requestAnimationFrame(updateVisualization);
    }, function(e) {
      console.log(e);
    });
  };
  request.send();
}
I have a web server that streams WAV audio, and I would like to play it in a web browser using the JavaScript Web Audio API.
Here is my code:
function start() {
  var request = new XMLHttpRequest();
  request.open("GET", url, true);
  request.responseType = "arraybuffer"; // Read the response as binary data
  request.onload = function() {
    var data = request.response;
    playSound(data);
  };
  request.send();
}
The problem here is that onload won't be called until the data has completely loaded, which is not very convenient for streaming.
So I looked for another event and found onprogress, but the problem is that request.response returns null until the data has completely loaded.
What's the correct way to play an audio stream with JavaScript?
Thanks for your help.
Lukas
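For plain playback of a stream, it is often easier to let an `<audio>` element do the buffering and, if Web Audio processing is needed, tap into it with context.createMediaElementSource (a standard Web Audio API method). If you do stay with raw byte chunks (e.g. from a fetch() body stream), you also need to stitch them together before handing them to a decoder; `concatChunks` below is an assumed helper name for that step.

```javascript
// Stitch an array of Uint8Array chunks (as delivered by a stream)
// into one contiguous buffer suitable for a decoder.
function concatChunks(chunks) {
  const total = chunks.reduce((n, c) => n + c.byteLength, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset);
    offset += c.byteLength;
  }
  return out;
}

// The element-based alternative, in the browser (not runnable here):
// const audioEl = new Audio(url); // browser buffers the stream itself
// const ctx = new AudioContext();
// ctx.createMediaElementSource(audioEl).connect(ctx.destination);
// audioEl.play();
```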
In Google Chrome:
One .wav file is played, looping. Another .wav file is played from time to time as a sound effect.
When the sound effect plays, the volume of the looping sound automatically decreases. The volume gradually increases again over about 15 seconds.
(I guess it's automatic ducking: http://en.wikipedia.org/wiki/Ducking)
I don't want the volume of the loop to decrease when the sound effect plays. How can I prevent this behaviour?
Example: http://www.matthewgatland.com/games/takedown/play/web/audiofail.html
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();

var play = function (buffer, loop) {
  var source = context.createBufferSource();
  source.buffer = buffer;
  if (loop) source.loop = true;
  source.connect(context.destination);
  source.start(0);
};

var load = function (url, callback) {
  var request = new XMLHttpRequest();
  request.open('GET', url, true);
  request.responseType = 'arraybuffer';
  request.onload = function() {
    context.decodeAudioData(request.response, function(buffer) {
      callback(buffer);
    }, null);
  };
  request.send();
};

var musicSound;
var thudSound;

load("res/snd/music0.wav", function (buffer) {
  musicSound = buffer;
});
load("res/snd/thud0.wav", function (buffer) {
  thudSound = buffer;
});
Once the sounds have loaded, call:
play(musicSound, true); //start the music looping
//each time you call this, the music becomes quiet for a few seconds
play(thudSound, false);
You might have to do some sound design before you put this into your website. I don't know what you are using for an editor, but you might want to edit the sounds so that their overall levels are closer together. That way there won't be as dramatic a difference in level to trigger the automatic gain reduction: the combination of both sounds is too loud, so the louder of the two brings down the level of the softer one. If you bring them closer together in level, the overall difference shouldn't be as drastic when (or if) the gain reduction kicks in.
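If editing the files is not convenient, the same balancing can be done in code by routing each sound through its own GainNode (a standard Web Audio API node; the exact volume values are a matter of taste). Below is a sketch extending the play function from the question, plus a small helper for thinking in decibels; `playWithVolume` and `dbToGain` are names chosen for illustration.

```javascript
// Variant of play() that takes a linear volume, so the thud can be
// turned down relative to the music instead of relying on the
// browser's automatic behaviour. context is the AudioContext from
// the question's code.
var playWithVolume = function (context, buffer, loop, volume) {
  var source = context.createBufferSource();
  var gain = context.createGain();
  gain.gain.value = volume; // e.g. 0.4 to pull the effect down
  source.buffer = buffer;
  if (loop) source.loop = true;
  source.connect(gain);
  gain.connect(context.destination);
  source.start(0);
};

// Convert a decibel trim to the linear gain value GainNode expects,
// which makes it easier to reason about level differences.
function dbToGain(db) {
  return Math.pow(10, db / 20);
}

// Usage in the page, e.g. trim the effect by 8 dB:
// playWithVolume(context, thudSound, false, dbToGain(-8));
// playWithVolume(context, musicSound, true, 1);
```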