HTML 5 Audio lib - How to get the average value of the music table - javascript

I found a tutorial there that works perfectly fine for playing music with a web library.
Everything works, but I would also like to get the "average value" of the music I play. Let me explain. Currently the music is loaded,
function loadSound(url) {
var request = new XMLHttpRequest();
request.open('GET', url, true);
request.responseType = 'arraybuffer';
// When loaded decode the data
request.onload = function() {
// decode the data
context.decodeAudioData(request.response, function(buffer) {
// when the audio is decoded play the sound
playSound(buffer);
}, onError);
}
request.send();
}
and the buffer is played. I can easily find the average value of a buffer this way:
var getAverage = function(dataArray){
var total = 0, // initialize to 0
i = 0, length = dataArray.length;
while(i < length) total += dataArray[i++]; // add all
return length ? total / length : 0; // divide (when length !== 0)
}
used here:
javascriptNode.onaudioprocess = function() {
// get the average for the first channel
var array = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(array);
// clear the current state
ctx.clearRect(0, 0, 1000, 325);
// set the fill style
ctx.fillStyle=gradient;
drawSpectrum(array);
var averageVal = getAverage(array);
But what about the "global" average, and not only the one computed from the current buffer?
In a word: how can I recover the global Uint8Array object from the loaded mp3, and not only the parsed buffer?
Many thanks for your help!

To recap: You have a program which streams an MP3 file. Periodically you are called back with consecutive chunks of the stream and you are able to compute an average for each individual chunk. You would like to compute the average for the entire stream.
The only way to compute the average is to maintain a running sum over the entire stream and a count of the number of elements to divide it by. I don't know javascript but you would need to store these two values in variables outside of the scope of the function.
var getSum = function(dataArray){
var sum = 0, // initialize to 0
i = 0, length = dataArray.length;
while(i < length) sum += dataArray[i++]; // add all
return sum;
}
var sum = 0;
var N = 0;
javascriptNode.onaudioprocess = function() {
...
sum += getSum(array);
N += array.length;
}
Later when the stream finishes you need to compute average=sum/N.
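For example (a minimal sketch, assuming sourceNode is the AudioBufferSourceNode created in playSound, which the question doesn't show):
sourceNode.addEventListener('ended', function() {
    var average = N ? sum / N : 0; // guard against an empty stream
    console.log('average over the whole stream:', average);
});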
An issue you might encounter is that sum can overflow depending on the capacity of the sum variable, the size of the samples, and the length of the stream. So first do what is necessary to make sum a double (in JavaScript, numbers are already double-precision floats, though they lose integer precision above 2^53). If that is not enough you might have to settle for an approximation like taking an average of averages. It will be pretty close so long as each chunk is close to the same size.
var getSum = function(dataArray){
var sum = 0, // initialize to 0
i = 0, length = dataArray.length;
while(i < length) sum += dataArray[i++]; // add all
return sum;
}
var sumOfAverages = 0;
var N = 0;
javascriptNode.onaudioprocess = function() {
...
var sum = getSum(array);
sumOfAverages += sum / array.length; // accumulate this chunk's average
N += 1;
}
Later, when the stream finishes, the approximate overall average is sumOfAverages / N.

Related

web audio analyser's getFloatTimeDomainData buffer offset wrt buffers at other times and wrt buffer of 'complete file'

(question rewritten integrating bits of information from answers, plus making it more concise.)
I use analyser=audioContext.createAnalyser() in order to process audio data, and I'm trying to understand the details better.
I choose an fftSize, say 2048, then I create an array buffer of 2048 floats with Float32Array, and then, in an animation loop
(called 60 times per second on most machines, via window.requestAnimationFrame), I do
analyser.getFloatTimeDomainData(buffer);
which will fill my buffer with 2048 floating point sample data points.
When the handler is called the next time, 1/60 second has passed. To calculate how much that is in units of samples,
we have to divide it by the duration of 1 sample, and get (1/60)/(1/44100) = 735.
So the next handler call takes place (on average) 735 samples later.
So there is overlap between subsequent buffers.
We know from the spec (search for 'render quantum') that everything happens in chunk sizes which are multiples of 128.
So (in terms of audio processing), one would expect that the next handler call will usually be either 5*128 = 640 samples later,
or else 6*128 = 768 samples later - those being the multiples of 128 closest to 735 samples = (1/60) second.
Calling this amount "Δ-samples", how do I find out what it is (during each handler call), 640 or 768 or something else?
Reliably, like this:
Consider the 'old buffer' (from previous handler call). If you delete "Δ-samples" many samples at the beginning, copy the remainder, and then append "Δ-samples" many new samples, that should be the current buffer. And indeed, I tried that,
and that is the case. It turns out "Δ-samples" often is 384, 512, 896. It is trivial but time consuming to determine
"Δ-samples" in a loop.
I would like to compute "Δ-samples" without performing that loop.
One would think the following would work:
(audioContext.currentTime - (the value of audioContext.currentTime during the last handler run)) / (duration of 1 sample)
I tried that (see the code below, where I also "stitch together" the various buffers, trying to reconstruct the original buffer),
and - surprise - it works about 99.9% of the time in Chrome, and about 95% of the time in Firefox.
I also tried audioContext.getOutputTimestamp().contextTime, which does not work in Chrome, and works 9?% of the time in Firefox.
Is there any way to find "Δ-samples" (without looking at the buffers), which works reliably?
Second question: the "reconstructed" buffer (all the buffers from the callbacks stitched together) and the original sound buffer
are not exactly the same; there is some small but noticeable difference (more than the usual rounding error), and it is bigger in Firefox.
Where does that come from? As I understand the spec, those should be the same.
var soundFile = 'https://mathheadinclouds.github.io/audio/sounds/la.mp3';
var audioContext = null;
var isPlaying = false;
var sourceNode = null;
var analyser = null;
var theBuffer = null;
var reconstructedBuffer = null;
var soundRequest = null;
var loopCounter = -1;
var FFT_SIZE = 2048;
var rafID = null;
var buffers = [];
var timesSamples = [];
var timeSampleDiffs = [];
var leadingWaste = 0;
window.addEventListener('load', function() {
soundRequest = new XMLHttpRequest();
soundRequest.open("GET", soundFile, true);
soundRequest.responseType = "arraybuffer";
//soundRequest.onload = function(evt) {}
soundRequest.send();
var btn = document.createElement('button');
btn.textContent = 'go';
btn.addEventListener('click', function(evt) {
goButtonClick(this, evt)
});
document.body.appendChild(btn);
});
function goButtonClick(elt, evt) {
initAudioContext(togglePlayback);
elt.parentElement.removeChild(elt);
}
function initAudioContext(callback) {
audioContext = new AudioContext();
audioContext.decodeAudioData(soundRequest.response, function(buffer) {
theBuffer = buffer;
callback();
});
}
function createAnalyser() {
analyser = audioContext.createAnalyser();
analyser.fftSize = FFT_SIZE;
}
function startWithSourceNode() {
sourceNode.connect(analyser);
analyser.connect(audioContext.destination);
sourceNode.start(0);
isPlaying = true;
sourceNode.addEventListener('ended', function(evt) {
sourceNode = null;
analyser = null;
isPlaying = false;
loopCounter = -1;
window.cancelAnimationFrame(rafID);
console.log('buffer length', theBuffer.length);
console.log('reconstructedBuffer length', reconstructedBuffer.length);
console.log('audio callback called counter', buffers.length);
console.log('root mean square error', Math.sqrt(checkResult() / theBuffer.length));
console.log('lengths of time between requestAnimationFrame callbacks, measured in audio samples:');
console.log(timeSampleDiffs);
console.log(
timeSampleDiffs.filter(function(val) {
return val === 384
}).length,
timeSampleDiffs.filter(function(val) {
return val === 512
}).length,
timeSampleDiffs.filter(function(val) {
return val === 640
}).length,
timeSampleDiffs.filter(function(val) {
return val === 768
}).length,
timeSampleDiffs.filter(function(val) {
return val === 896
}).length,
'*',
timeSampleDiffs.filter(function(val) {
return val > 896
}).length,
timeSampleDiffs.filter(function(val) {
return val < 384
}).length
);
console.log(
timeSampleDiffs.filter(function(val) {
return val === 384
}).length +
timeSampleDiffs.filter(function(val) {
return val === 512
}).length +
timeSampleDiffs.filter(function(val) {
return val === 640
}).length +
timeSampleDiffs.filter(function(val) {
return val === 768
}).length +
timeSampleDiffs.filter(function(val) {
return val === 896
}).length
)
});
myAudioCallback();
}
function togglePlayback() {
sourceNode = audioContext.createBufferSource();
sourceNode.buffer = theBuffer;
createAnalyser();
startWithSourceNode();
}
function myAudioCallback(time) {
++loopCounter;
if (!buffers[loopCounter]) {
buffers[loopCounter] = new Float32Array(FFT_SIZE);
}
var buf = buffers[loopCounter];
analyser.getFloatTimeDomainData(buf);
var now = audioContext.currentTime;
var nowSamp = Math.round(audioContext.sampleRate * now);
timesSamples[loopCounter] = nowSamp;
var j, sampDiff;
if (loopCounter === 0) {
console.log('start sample: ', nowSamp);
reconstructedBuffer = new Float32Array(theBuffer.length + FFT_SIZE + nowSamp);
leadingWaste = nowSamp;
for (j = 0; j < FFT_SIZE; j++) {
reconstructedBuffer[nowSamp + j] = buf[j];
}
} else {
sampDiff = nowSamp - timesSamples[loopCounter - 1];
timeSampleDiffs.push(sampDiff);
var expectedEqual = FFT_SIZE - sampDiff;
for (j = 0; j < expectedEqual; j++) {
if (reconstructedBuffer[nowSamp + j] !== buf[j]) {
console.error('unexpected error', loopCounter, j);
// debugger;
}
}
for (j = expectedEqual; j < FFT_SIZE; j++) {
reconstructedBuffer[nowSamp + j] = buf[j];
}
//console.log(loopCounter, nowSamp, sampDiff);
}
rafID = window.requestAnimationFrame(myAudioCallback);
}
function checkResult() {
var ch0 = theBuffer.getChannelData(0);
var ch1 = theBuffer.getChannelData(1);
var sum = 0;
var idxDelta = leadingWaste + FFT_SIZE;
for (var i = 0; i < theBuffer.length; i++) {
var samp0 = ch0[i];
var samp1 = ch1[i];
var samp = (samp0 + samp1) / 2;
var check = reconstructedBuffer[i + idxDelta];
var diff = samp - check;
var sqDiff = diff * diff;
sum += sqDiff;
}
return sum;
}
In the above snippet, I do the following. I load, with XMLHttpRequest, a 1-second mp3 audio file from my github.io page (I sing 'la' for 1 second). After it has loaded, a button saying 'go' is shown, and after pressing that, the audio is played back by putting it into a bufferSource node and calling .start on that. The bufferSource is then fed to our analyser, et cetera.
related question
I also have the snippet code on my github.io page - makes reading the console easier.
I think the AnalyserNode is not what you want in this situation. You want to grab the data and keep it synchronized with raf. Use a ScriptProcessorNode or AudioWorkletNode to grab the data. Then you'll get all the data as it comes. No problems with overlap, or missing data or anything.
Note also that the clocks for raf and audio may be different and hence things may drift over time. You'll have to compensate for that yourself if you need to.
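A rough sketch of that idea with a ScriptProcessorNode (assuming the audioContext and sourceNode from the question; ScriptProcessorNode is deprecated but simple to demonstrate with):
var captureNode = audioContext.createScriptProcessor(2048, 1, 1);
var chunks = []; // consecutive, non-overlapping blocks of samples
captureNode.onaudioprocess = function(e) {
    var input = e.inputBuffer.getChannelData(0);
    chunks.push(new Float32Array(input)); // copy, because the buffer is reused between callbacks
    e.outputBuffer.getChannelData(0).set(input); // pass the audio through unchanged
};
sourceNode.connect(captureNode);
captureNode.connect(audioContext.destination);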
Unfortunately there is no way to find out the exact point in time at which the data returned by an AnalyserNode was captured. But you might be on the right track with your current approach.
All the values returned by the AnalyserNode are based on the "current-time-domain-data". This is basically the internal buffer of the AnalyserNode at a certain point in time. Since the Web Audio API has a fixed render quantum of 128 samples I would expect this buffer to evolve in steps of 128 samples as well. But currentTime usually evolves in steps of 128 samples already.
Furthermore the AnalyserNode has a smoothingTimeConstant property. It is responsible for "blurring" the returned values. The default value is 0.8. For your use case you probably want to set this to 0.
EDIT: As Raymond Toy pointed out in the comments, smoothingTimeConstant only has an effect on the frequency data. Since the question is about getFloatTimeDomainData() it will have no effect on the returned values.
I hope this helps but I think it would be easier to get all the samples of your audio signal by using an AudioWorklet. It would definitely be more reliable.
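A minimal sketch of that AudioWorklet approach, assuming the audioContext and sourceNode from the question (the module file name and processor name are made up): the processor posts each 128-sample render quantum to the main thread, so you receive every sample exactly once, in order, with no overlap.
// capture-processor.js (hypothetical module)
class CaptureProcessor extends AudioWorkletProcessor {
    process(inputs, outputs) {
        if (inputs[0].length > 0) {
            this.port.postMessage(inputs[0][0].slice()); // copy of the first channel's block
        }
        return true; // keep the processor alive
    }
}
registerProcessor('capture-processor', CaptureProcessor);

// on the main thread:
audioContext.audioWorklet.addModule('capture-processor.js').then(function() {
    const workletNode = new AudioWorkletNode(audioContext, 'capture-processor');
    sourceNode.connect(workletNode);
    workletNode.port.onmessage = function(e) {
        // e.data is a Float32Array of 128 samples, delivered in order with no gaps
    };
});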
I'm not really following your math, so I can't tell exactly what you had wrong, but you seem to look at this in a too complicated manner.
The fftSize doesn't really matter here; what you want to calculate is how many samples have passed since the last frame.
To calculate this, you just need to
Measure the time elapsed from last frame.
Divide this time by the time of a single frame.
The time of a single frame is simply 1 / context.sampleRate.
So really all you need is (currentTime - previousTime) / (1 / sampleRate) and you'll find the index in the last frame where the data starts being repeated in the new one.
And only then, if you want the index in the new frame you'd subtract this index from the fftSize.
Now for why you sometimes have gaps, it's because AudioContext.prototype.currentTime returns the timestamp of the beginning of the next block to be passed to the graph.
The one we want here is AudioContext.prototype.getOutputTimestamp().contextTime, which represents the timestamp of now, on the same base as currentTime (i.e. the creation of the context).
(function loop(){requestAnimationFrame(loop);})();
(async()=>{
const ctx = new AudioContext();
const buf = await fetch("https://upload.wikimedia.org/wikipedia/en/d/d3/Beach_Boys_-_Good_Vibrations.ogg").then(r=>r.arrayBuffer());
const aud_buf = await ctx.decodeAudioData(buf);
const source = ctx.createBufferSource();
source.buffer = aud_buf;
source.loop = true;
const analyser = ctx.createAnalyser();
const fftSize = analyser.fftSize = 2048;
source.loop = true;
source.connect( analyser );
source.start(0);
// for debugging we use two different buffers
const arr1 = new Float32Array( fftSize );
const arr2 = new Float32Array( fftSize );
const single_sample_dur = (1 / ctx.sampleRate);
console.log( 'single sample duration (ms)', single_sample_dur * 1000);
onclick = e => {
if( ctx.state === "suspended" ) {
ctx.resume();
return console.log( 'starting context, please try again' );
}
console.log( '-------------' );
requestAnimationFrame( () => {
// first frame
const time1 = ctx.getOutputTimestamp().contextTime;
analyser.getFloatTimeDomainData( arr1 );
requestAnimationFrame( () => {
// second frame
const time2 = ctx.getOutputTimestamp().contextTime;
analyser.getFloatTimeDomainData( arr2 );
const elapsed_time = time2 - time1;
console.log( 'elapsed time between two frame (ms)', elapsed_time * 1000 );
const calculated_index = fftSize - Math.round( elapsed_time / single_sample_dur );
console.log( 'calculated index of new data', calculated_index );
// for debugging we can just search for the first index where the data repeats
const real_time = fftSize - arr1.indexOf( arr2[ 0 ] );
console.log( 'real index', real_time > fftSize ? 0 : real_time );
if( calculated_index !== ( real_time > fftSize ? 0 : real_time ) ) {
console.error( 'different' );
}
});
});
};
document.body.classList.add('ready');
})().catch( console.error );
body:not(.ready) pre { display: none; }
<pre>click to record two new frames</pre>

How do I check if 2 numbers are the same from Math.random [duplicate]

Can't seem to find an answer to this, say I have this:
setInterval(function() {
m = Math.floor(Math.random()*7);
$('.foo:nth-of-type('+m+')').fadeIn(300);
}, 300);
How do I make it so that the random number doesn't repeat itself? For example, if the random number is 2, I don't want 2 to come out again.
There are a number of ways you could achieve this.
Solution A:
If the range of numbers isn't large (let's say less than 10), you could just keep track of the numbers you've already generated. Then if you generate a duplicate, discard it and generate another number.
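A minimal sketch of Solution A (names are illustrative):
var used = [];
function nextUnique(max) {
    var n;
    do {
        n = Math.floor(Math.random() * max); // 0 .. max-1
    } while (used.indexOf(n) !== -1);        // discard duplicates and try again
    used.push(n);
    return n;
}
// note: once all max values have been drawn this loops forever,
// so call it at most max times (or reset the used array)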
Solution B:
Pre-generate the random numbers, store them into an array and then go through the array. You could accomplish this by taking the numbers 1,2,...,n and then shuffle them.
shuffle = function(o) {
    for (var i = o.length - 1; i > 0; i--) {
        var j = Math.floor(Math.random() * (i + 1));
        var x = o[i];
        o[i] = o[j];
        o[j] = x;
    }
    return o;
};
var randorder = shuffle([0,1,2,3,4,5,6]);
var index = 0;
setInterval(function() {
$('.foo:nth-of-type('+(randorder[index++])+')').fadeIn(300);
}, 300);
Solution C:
Keep track of the numbers available in an array. Randomly pick a number. Remove number from said array.
var randnums = [0,1,2,3,4,5,6];
setInterval(function() {
var m = Math.floor(Math.random()*randnums.length);
$('.foo:nth-of-type('+(randnums[m])+')').fadeIn(300);
randnums.splice(m,1); // splice modifies the array in place, so don't reassign its return value
}, 300);
You seem to want a non-repeating random number from 0 to 6, so similar to tskuzzy's answer:
var getRand = (function() {
var nums = [0,1,2,3,4,5,6];
var current = [];
function rand(n) {
return (Math.random() * n)|0;
}
return function() {
if (!current.length) current = nums.slice();
return current.splice(rand(current.length), 1)[0]; // splice returns an array, so take its single element
}
}());
It will return the numbers 0 to 6 in random order. When each has been drawn once, it will start again.
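For example, dropped into the question's interval it might look like this:
setInterval(function() {
    var m = getRand(); // 0-6, no repeats until all seven have been used
    $('.foo:nth-of-type(' + m + ')').fadeIn(300);
}, 300);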
Could you try this?
setInterval(function() {
m = Math.floor(Math.random()*7);
$('.foo:nth-of-type(' + m + ')').fadeIn(300);
}, 300);
I like Neal's answer, although this is begging for some recursion. Here it is in Java; you'll still get the general idea. Note that you'll hit an infinite loop if you pull out more numbers than MAX. I could have fixed that but left it as is for clarity.
Edit: I saw Neal added a while loop, so that works great.
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class RandCheck {
private List<Integer> numbers;
private Random rand;
private int MAX = 100;
public RandCheck(){
numbers = new ArrayList<Integer>();
rand = new Random();
}
public int getRandomNum(){
return getRandomNumRecursive(getRand());
}
private int getRandomNumRecursive(int num){
if(numbers.contains(num)){
return getRandomNumRecursive(getRand());
} else {
return num;
}
}
private int getRand(){
return rand.nextInt(MAX);
}
public static void main(String[] args){
RandCheck randCheck = new RandCheck();
for(int i = 0; i < 100; i++){
System.out.println(randCheck.getRandomNum());
}
}
}
Generally my approach is to make an array containing all of the possible values and to:
1. Pick a random index less than the size of the array
2. Remove the chosen element from the array
3. Repeat steps 1-2 until the array is empty
The resulting set of numbers will contain all of your indices without repetition; a minimal sketch of this follows.
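A minimal sketch of that pick-and-remove approach (names are illustrative):
var pool = [0, 1, 2, 3, 4, 5, 6];
function drawRandom() {
    if (pool.length === 0) return null; // pool exhausted
    var idx = Math.floor(Math.random() * pool.length);
    return pool.splice(idx, 1)[0]; // remove and return the chosen element
}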
Even better, maybe something like this:
var numArray = [0,1,2,3,4,5,6];
numArray.shuffle(); // assumes a shuffle() helper on Array.prototype (see the sketch below)
Then just go through the items, because shuffle will have randomized them, and pop them off one at a time.
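Arrays have no built-in shuffle(), so a hypothetical Fisher-Yates helper along these lines would have to be defined first:
// hypothetical helper; Array.prototype has no built-in shuffle()
Array.prototype.shuffle = function() {
    for (var i = this.length - 1; i > 0; i--) {
        var j = Math.floor(Math.random() * (i + 1));
        var tmp = this[i];
        this[i] = this[j];
        this[j] = tmp;
    }
    return this;
};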
Here's a simple fix, if a little rudimentary:
if(nextNum == lastNum){
if (nextNum == 0){nextNum = 7;}
else {nextNum = nextNum-1;}
}
If the next number is the same as the last simply minus 1 unless the number is 0 (zero) and set it to any other number within your set (I chose 7, the highest index).
I used this method within the cycle function because the only stipulation on selecting a number was that it mustn't be the same as the last one.
Not the most elegant or technically gifted solution, but it works :)
Use sets. They were introduced to the specification in ES6. A Set is a data structure that represents a collection of unique values, so it cannot include any duplicate values. I needed 6 random, non-repeatable numbers ranging from 1-49. I started by creating a longer set with around 30 entries (if values repeat, the set ends up with fewer elements), converted the set to an array, and then sliced its first 6 elements. Easy peasy. A Set has no length property (it has size instead), which is why it's easier to convert it to an array if you need a specific length.
let randomSet = new Set();
for (let index = 0; index < 30; index++) {
randomSet.add(Math.floor(Math.random() * 49) + 1)
};
let randomSetToArray = Array.from(randomSet).slice(0,6);
console.log(randomSet);
console.log(randomSetToArray);
An easy way to generate a list of different numbers, no matter the size or number:
function randomNumber(max) {
return Math.floor(Math.random() * max + 1);
}
const list = []
while(list.length < 10 ){
let nbr = randomNumber(500)
if(!list.find(el => el === nbr)) list.push(nbr)
}
console.log("list",list)
I would like to add--
var RecordKeeper = {};
var SRandom = function () {
var currTimeStamp = new Date().getTime();
if (RecordKeeper.hasOwnProperty(currTimeStamp)) {
RecordKeeper[currTimeStamp] = RecordKeeper[currTimeStamp] + 1;
return currTimeStamp.toString() + RecordKeeper[currTimeStamp];
}
else {
RecordKeeper[currTimeStamp] = 1;
return currTimeStamp.toString() + RecordKeeper[currTimeStamp];
}
}
This uses timestamp (every millisecond) to always generate a unique number.
You can do this: keep a public array of keys that you have used and check against them with this function:
function in_array(needle, haystack)
{
for(var key in haystack)
{
if(needle === haystack[key])
{
return true;
}
}
return false;
}
(function from: javascript function inArray)
So what you can do is:
var done = [];
setInterval(function() {
var m = null;
while(m == null || in_array(m, done)){
m = Math.floor(Math.random()*7);
}
done.push(m);
$('.foo:nth-of-type('+m+')').fadeIn(300);
}, 300);
This code will get stuck after getting all seven numbers, so you need to make sure it exits after it finds them all.

How to quickly find all colors in an image

I need to analyze an image to find all colors in a PNG or GIF image. I currently load the image to a canvas, then get the image data, then loop through every pixel and check it against each color in the palette. It takes forever, the browser thinks the script has stopped and it sometimes just crashes. Hoping there is a better way.
//load file
var fileReader = new FileReader();
fileReader.onload = function(e) {
var img = new Image();
img.onload = function() {
//create a new pixel with the image's dimensions
newPixel(this.width, this.height, []);
//draw the image onto the canvas
context.drawImage(img, 0, 0);
var colorPalette = [];
var imagePixelData = context.getImageData(0,0,this.width, this.height).data;
console.log(imagePixelData)
for (var i = 0; i < imagePixelData.length; i += 4) {
var color = rgbToHex(imagePixelData[i],imagePixelData[i + 1],imagePixelData[i + 2]);
if (colorPalette.indexOf(color) == -1) {
colorPalette.push(color);
//don't allow more than 256 colors to be added
if (colorPalette.length >= settings.maxColorsOnImportedImage) {
alert('The image loaded seems to have more than '+settings.maxColorsOnImportedImage+' colors.')
break;
}
}
}
createColorPalette(colorPalette, false);
//track google event
ga('send', 'event', 'Pixel Editor Load', colorPalette.length, this.width+'/'+this.height); /*global ga*/
};
img.src = e.target.result;
};
fileReader.readAsDataURL(this.files[0]);
This will speed things up a little.
The function indexOf will search the whole array if it cannot find an entry, which can be very slow. You can improve the speed of the search by adding entries at the top of the array and moving down, then searching only from the topmost used entry. This ensures that the search only covers added entries, not the entire array.
Also use 32-bit words rather than 8-bit bytes, use typed arrays as they are significantly quicker than pushing onto an array, and don't convert to hex inside the loop; it is redundant and can be done afterwards on the much smaller data set.
See comments for logic.
//draw the image onto the canvas
ctx.drawImage(image, 0, 0);
// get size
const size = image.width * image.height;
// create palette buffer
const colors32 = new Uint32Array(256);
// current pos of palette entry
var palettePos = colors32.length - 1; // start from the top as this speeds up the indexOf search
// pixel data as 32 bit words so you can handle a pixel at a time rather than bytes
const imgData = new Uint32Array(ctx.getImageData(0, 0, image.width, image.height).data.buffer);
// hold the color. If the image is low colour (fewer than 256 colors) it is highly probable that many pixels
// will be the same as the previous one. You can avoid the index search if this is the case.
var color = colors32[palettePos--] = imgData[0]; // assign the first pixel to ease logic in the loop
for (var i = 1; i < size && palettePos >= 0; i += 1) { // loop till all pixels read or palette full
if (color !== imgData[i]) { // different from the previous pixel
if (colors32.indexOf(imgData[i], palettePos) === -1) { // not yet in the palette
color = colors32[palettePos--] = imgData[i]; // add it
}
}
}
// all the performance issues are over so now convert to the palette format you wanted.
const colorPalette = [];
colors32.reverse();
const paletteSize = (255 - palettePos) * 4;
const colors8 = new Uint8Array(colors32.buffer);
for(i = 0; i < paletteSize; i += 4){
colorPalette.push(rgbToHex(colors8[i],colors8[i + 1],colors8[i + 2]));
}

Javascript: For loop extremely slow, any way to speed it up?

I have a for-loop from 0 to 8,019,000,000 that is extremely slow.
var totalCalcs = 0;
for (var i = 0; i < 8019000000; i++)
totalCalcs++;
window.alert(totalCalcs);
In Chrome this takes between 30-60 seconds.
I also already tried variations like:
var totalCalcs = 0;
for (var i = 8019000000; i--; )
totalCalcs++;
window.alert(totalCalcs);
Didn't do too much difference unfortunately.
Is there anything I can do to speed this up?
Your example is rather trivial, and any answer may not be suitable for whatever code you're actually putting inside of a loop with that many iterations.
If your work can be done in parallel, then we can divide the work between several web workers.
You can read a nice introduction to web workers, and learn how to use them, here:
http://www.html5rocks.com/en/tutorials/workers/basics/
Figuring out how to divide the work is a challenge that depends entirely on what that work is. Because your example is so small, it's easy to divide the work among inline web workers; here is a function to create a worker that will invoke a function asynchronously:
var makeWorker = function (fn, args, callback) {
var fnString = 'self.addEventListener("message", function (e) {self.postMessage((' + fn.toString() + ').apply(this, e.data))});',
blob = new Blob([fnString], { type: 'text/javascript' }),
url = URL.createObjectURL(blob),
worker = new Worker(url);
worker.postMessage(args);
worker.addEventListener('message', function (e) {
URL.revokeObjectURL(url);
callback(e.data);
});
return worker;
};
The work that we want done is adding numbers, so here is a function to do that:
var calculateSubTotal = function (count) {
var sum = 0;
for (var i = 0; i < count; ++i) {
sum++;
}
return sum;
};
And when a worker finishes, we want to add his sum to the total AND tell us the result when all workers are finished, so here is our callback:
var total = 0, count = 0, numWorkers = 1,
workerFinished = function (subTotal) {
total += subTotal;
count++;
if (count == numWorkers) {
console.log(total);
}
};
And finally we can create a worker:
makeWorker(calculateSubTotal, [10], workerFinished); // logs `10` to console
When put together, these pieces can calculate your large sum quickly (depending on how many CPUs your computer has, of course).
I have a complete example on jsfiddle.
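To actually split the 8,019,000,000 iterations across several workers, a sketch along these lines should work (it reuses makeWorker, calculateSubTotal and workerFinished from above; the chunking is illustrative):
var iterations = 8019000000;
numWorkers = 4; // e.g. navigator.hardwareConcurrency
var chunkSize = Math.ceil(iterations / numWorkers);
for (var w = 0; w < numWorkers; w++) {
    // the last chunk may be smaller than the others
    var thisChunk = Math.min(chunkSize, iterations - w * chunkSize);
    makeWorker(calculateSubTotal, [thisChunk], workerFinished);
}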
Treating your question as a more generic question about speeding up loops with many iterations: you could try Duff's device.
In a test using nodejs the following code decreased the loop time from 108 seconds for your second loop (i--) to 27 seconds
var testVal = 0, iterations = 8019000000;
var n = iterations % 8;
while (n--) {
testVal++;
}
n = parseInt(iterations / 8);
while (n--) {
testVal++;
testVal++;
testVal++;
testVal++;
testVal++;
testVal++;
testVal++;
testVal++;
}

merging / layering multiple ArrayBuffers into one AudioBuffer using Web Audio API

I need to layer looping .wav tracks that ultimately I will need to be able to turn on and off and keep in sync.
First I load the tracks and stop BufferLoader from turning the loaded ArrayBuffer into an AudioBuffer (hence the false):
function loadTracks(data) {
for (var i = 0; i < data.length; i++) {
trackUrls.push(data[i]['url']);
};
bufferLoader = new BufferLoader(context, trackUrls, finishedLoading);
bufferLoader.load(false);
return loaderDefered.promise;
}
When you click a button on screen it calls startStop().
function startStop(index, name, isPlaying) {
if(!activeBuffer) {
activeBuffer = bufferList[index];
}else{
activeBuffer = appendBuffer(activeBuffer, bufferList[index]);
}
context.decodeAudioData(activeBuffer, function(buffer){
audioBuffer = buffer;
play();
})
function play() {
var scheduledTime = 0.015;
try {
audioSource.stop(scheduledTime);
} catch (e) {}
audioSource = context.createBufferSource();
audioSource.buffer = audioBuffer;
audioSource.loop = true;
audioSource.connect(context.destination);
var currentTime = context.currentTime + 0.010 || 0;
audioSource.start(scheduledTime - 0.005, currentTime, audioBuffer.duration - currentTime);
audioSource.playbackRate.value = 1;
}
Most of the code I found on this guy's GitHub.
In the demo you can hear he is layering AudioBuffers.
I have tried the same on my hosting.
Disregarding the AngularJS stuff, the Web Audio stuff is happening in service.js at:
/js/angular/service.js
If you open the console and click the buttons, you can see that activeBuffer.byteLength (type ArrayBuffer) is incrementing; however, even after being decoded by the context.decodeAudioData method, it still only plays the first sound you clicked instead of a merged AudioBuffer.
I'm not sure I totally understand your scenario - don't you want these to be playing simultaneously? (i.e. bass gets layered on top of the drums).
Your current code is trying to concatenate an additional audio file whenever you hit the button for that file. You can't just concatenate audio files (in their ENCODED form) and then run it through decode - the decodeAudioData method is decoding the first complete sound in the arraybuffer, then stopping (because it's done decoding the sound).
What you should do is change the logic to concatenate the buffer data from the resulting AudioBuffers (see below). Even this logic isn't QUITE what you should do - this is still caching the encoded audio files, and decoding every time you hit the button. Instead, you should cache the decoded audio buffers, and just concatenate it.
function startStop(index, name, isPlaying) {
// Note we're decoding just the new sound
context.decodeAudioData( bufferList[index], function(buffer){
// We have a decoded buffer - now we need to concatenate it
audioBuffer = buffer;
if(!audioBuffer) {
audioBuffer = buffer;
}else{
audioBuffer = concatenateAudioBuffers(audioBuffer, buffer);
}
play();
})
}
function concatenateAudioBuffers(buffer1, buffer2) {
if (!buffer1 || !buffer2) {
console.log("no buffers!");
return null;
}
if (buffer1.numberOfChannels != buffer2.numberOfChannels) {
console.log("number of channels is not the same!");
return null;
}
if (buffer1.sampleRate != buffer2.sampleRate) {
console.log("sample rates don't match!");
return null;
}
var tmp = context.createBuffer(buffer1.numberOfChannels, buffer1.length + buffer2.length, buffer1.sampleRate);
for (var i=0; i<tmp.numberOfChannels; i++) {
var data = tmp.getChannelData(i);
data.set(buffer1.getChannelData(i));
data.set(buffer2.getChannelData(i),buffer1.length);
}
return tmp;
};
SOLVED:
To get multiple loops of the same duration playing at the same time and keep in sync even when you start and stop them randomly.
First, create all your buffer sources where bufferList is an array of AudioBuffers and the first sound is a sound you are going to read from and overwrite with your other sounds.
function createAllBufferSources() {
for (var i = 0; i < bufferList.length; i++) {
var source = context.createBufferSource();
source.buffer = bufferList[i];
source.loop = true;
bufferSources.push(source);
};
console.log(bufferSources)
}
Then:
function start() {
var rewrite = bufferSources[0];
rewrite.connect(context.destination);
var processNode = context.createScriptProcessor(2048, 2, 2);
rewrite.connect(processNode)
processNode.onaudioprocess = function(e) {
//getting the left and right of the sound we want to overwrite
var left = rewrite.buffer.getChannelData(0);
var right = rewrite.buffer.getChannelData(1);
var overL = [],
overR = [],
i, a, b, l;
l = bufferList.length;
//storing all the loops channel data
for (i = 0; i < l; i++) {
overL[i] = bufferList[i].getChannelData(0);
overR[i] = bufferList[i].getChannelData(1);
}
//looping through the channel data of the sound we are going to overwrite
a = 0, b = overL.length, l = left.length;
for (i = 0; i < l; i++) {
//making sure its a blank before we start to write
left[i] -= left[i];
right[i] -= right[i];
//looping through all the sounds we want to add and assigning the bytes to the old sound, both at the same position
for (a = 0; a < b; a++) {
left[i] += overL[a][i];
right[i] += overR[a][i];
}
left[i] /= b;
right[i] /= b;
}
};
processNode.connect(context.destination);
rewrite.start(0)
}
If you remove an AudioBuffer from bufferList and add it again at any point, it will always be in sync.
EDIT:
Keep in mind that:
- the processor node gets garbage collected weirdly.
- this is very taxing; you might want to think about using Web Workers somehow.
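Regarding the garbage collection point, a common workaround (a sketch, not part of the original answer) is to keep the node reachable from a long-lived variable so it cannot be collected while it is still in use:
var keepAlive = []; // long-lived, e.g. module/global scope
// then, inside start(), right after createScriptProcessor:
keepAlive.push(processNode);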
