WebAudio: Oscillating AudioBufferSource playbackRate

I'm trying to apply a pitch vibrato to an AudioBufferSource using an oscillator.
var source = context.createBufferSource();
source.connect(context.destination);
source.buffer = buffer;
source.loop = true;
source.start(0);
// None of the below seems to have any effect in Chrome
var osc = context.createOscillator();
osc.type = "sine";
osc.frequency.value = 0.5;
osc.start(0);
var osc_gain = context.createGain();
osc_gain.connect(source.playbackRate);
osc.connect(osc_gain);
// osc_gain.gain.value = 0.1 doesn't work
osc_gain.gain.setValueAtTime(0.1, 0);
Here's a fiddle: http://jsfiddle.net/HRkcE/12/
The oscillator doesn't seem to have any effect in Chrome, but it works in Firefox (once I figured out that setting osc_gain.gain.value directly doesn't work).
Am I doing something wrong that makes it not work in Chrome?

No, you're not doing anything wrong. Blink has a bug where we do not support this, which was reported to me by someone else just last week, and I filed: https://code.google.com/p/chromium/issues/detail?id=311284. We will get that fixed.
In the meantime, it's actually relatively easy to do a vibrato effect on ANY audio connection (not just BufferSourceNodes) by using an LFO to drive oscillations on the delayTime of a DelayNode. Check out the "Vibrato" effect I added to the end of http://webaudiodemos.appspot.com/input/index.html, and the node chain I set up to do it: https://github.com/cwilso/Audio-Input-Effects/blob/master/js/effects.js#L478 is the vibrato subgraph creation routine.
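For reference, here is a minimal sketch of that delay-based approach, assuming context is an existing AudioContext and source is any source node; the node names and values are illustrative and not taken from the linked demo:
// Sketch of the LFO-driven delay vibrato described above (illustrative values).
var delay = context.createDelay();
delay.delayTime.value = 0.03;            // ~30 ms base delay
var lfo = context.createOscillator();
lfo.type = "sine";
lfo.frequency.value = 5;                 // vibrato rate in Hz
var depth = context.createGain();
depth.gain.value = 0.002;                // modulate the delay by +/- 2 ms
lfo.connect(depth);
depth.connect(delay.delayTime);          // the LFO wobbles the delay time
source.connect(delay);
delay.connect(context.destination);
lfo.start(0);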

Related

WebAudio LinearRampToValueAtTime isn't ramping

I've been working in WebAudio trying to ramp an oscillator node on and off. I've been trying to verify the ramping of the signal with an oscilloscope, and I've noticed that the linear ramp (from the linearRampToValueAtTime() function) is not working as anticipated.
First, I'm unable to change the duration of the ramping, and when I do manage to get a ramp (using the setValueAtTime() function), it is not linear.
So I guess my question is this - has anyone been able to validate using external equipment that the ramps are working in Chrome?
I tried verifying the example listed on the MDN documentation site:
https://mdn.github.io/webaudio-examples/audio-param/
but those demos don't ramp either. Is this an issue with my audio driver, my web browser, or something else? I'm assuming there's something wrong in the implementation, because it's not working anywhere, and others don't seem to have this issue, so any help I can get would be appreciated!
Snippets of the code are below.
Creating the nodes:
for (i = 0; i < 2; i++) {
    oscCarrier[i] = audioCtx.createOscillator();
    panOsc[i] = audioCtx.createStereoPanner();
    gainOsc[i] = audioCtx.createGain();
    oscCarrier[i].frequency.value = 1000;
    gainOsc[i].gain.setValueAtTime(0, audioCtx.currentTime);
    oscCarrier[i].connect(panOsc[i]);
    panOsc[i].connect(gainOsc[i]);
    gainOsc[i].connect(audioCtx.destination);
    oscCarrier[i].start();
}
For ramping off:
let rampTime = 0.025; // 25 ms ramp off
gainOsc[channel].gain.linearRampToValueAtTime(0, audioCtx.currentTime + rampTime);
For ramping on:
let rampTime = 0.025; // 25 ms ramp on
gainOsc[channel].gain.linearRampToValueAtTime(1, audioCtx.currentTime + rampTime);
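For reference, here is a minimal sketch (assuming the same gainOsc and audioCtx setup as above) that anchors each ramp with setValueAtTime; linearRampToValueAtTime interpolates from the previously scheduled event, so giving the ramp an explicit start point makes the duration easier to verify:
// Hedged sketch: anchor the current value before scheduling the linear ramp.
function rampGain(channel, target, rampTime) {
    var param = gainOsc[channel].gain;
    var now = audioCtx.currentTime;
    param.cancelScheduledValues(now);
    param.setValueAtTime(param.value, now);            // explicit ramp start point
    param.linearRampToValueAtTime(target, now + rampTime);
}
// Usage, e.g.: rampGain(0, 1, 0.025) to ramp on, rampGain(0, 0, 0.025) to ramp off.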

stream.addTrack working in Firefox but not Chrome

I'm having an issue with a project I'm working on for my brother's company and wanted to see if anybody has any ideas on how to fix it.
The company puts on virtual conferences for corporate events, nonprofits, etc., and it's common with these events for people to need to pre-record their presentations with a PowerPoint slideshow. For non-tech-savvy clients (which is most of their clients), the only way to make this happen is to hire a contractor to work one-on-one with the client to get their video recorded. My goal is to make a site where clients can easily record the video and slideshow themselves, which would reduce their costs significantly.
It's still very much a work in progress, but what I have so far is live here: https://ezav-redesign-hf4d3.ondigitalocean.app/
The issue is, while the recording function works really well on Firefox, on Chrome it always ends up with no audio. You can see the full source code at the live site, but here is what I think should be the relevant part:
const video = document.createElement("video");
video.muted = true;
video.srcObject = stream;
video.play();
function update() {
    ctx.drawImage(video, 300, 0, 720, 720, 1547, 297, 357, 355);
    ctx.drawImage(
        presentationCanvas,
        0,
        0,
        presentationCanvas.width,
        presentationCanvas.height,
        20,
        125,
        presentationCanvas.width * 0.754,
        presentationCanvas.height * 0.745
    );
    requestAnimationFrame(update); // wait for the browser to be ready to present another animation frame
}
video.addEventListener("loadeddata", function () {
    update(); // start rendering
});
/* RECORDING */
const recStream = canvas.captureStream(30);
var audioCtx = new AudioContext();
// create a stream from our AudioContext
var dest = audioCtx.createMediaStreamDestination();
audioStream = dest.stream;
// connect our video element's output to the stream
var sourceNode = audioCtx.createMediaElementSource(video);
sourceNode.connect(dest);
recStream.addTrack(audioStream.getAudioTracks()[0]);
I've spent most of the weekend googling and tinkering trying to fix this and have tested it on 3 different machines, but I can't seem to get anywhere. I'm not sure how to pinpoint exactly where it's going wrong, as the console isn't displaying any errors. Any ideas would be greatly appreciated. Thanks!
This is a known bug/limitation in Chrome, where they don't pass the audio stream of muted MediaElements to the graph at all anymore.
Luckily, in your case it seems you don't need to go through the MediaElement at all, since you have access to the raw MediaStream. So all you have to do is get rid of the AudioContext part entirely and just do:
stream.getAudioTracks().forEach( (track) => recStream.addTrack( track ) );
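In context, a minimal sketch of the recording path with that change applied, assuming stream is the getUserMedia MediaStream and canvas is the composited canvas from the question (the MediaRecorder part is illustrative, not taken from the original code):
// Sketch: take video from the canvas and audio straight from the raw stream.
const recStream = canvas.captureStream(30);
stream.getAudioTracks().forEach((track) => recStream.addTrack(track));
const recorder = new MediaRecorder(recStream);
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
    const blob = new Blob(chunks, { type: recorder.mimeType });
    // do something with the recorded blob, e.g. offer it as a download
};
recorder.start();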

JS Audio API - Oscillator inside function doesn't play sound

Using Opera 44.0, I was fiddling around with the Audio API and tried a simple example:
var ac = new AudioContext();
var osc = ac.createOscillator();
osc.connect(ac.destination);
osc.start();
osc.stop(2);
It works as expected, the sound is played for 2 seconds and then it stops.
Then I tried to step up a little, playing a sound when a button is clicked:
function play() {
    var osc = ac.createOscillator();
    osc.connect(ac.destination);
    osc.start();
    osc.stop(2);
}
var ac = new AudioContext();
var playBtn = document.querySelector("#play");
playBtn.addEventListener("click", play);
It doesn't work. The function is called when I click the button (I checked using console.log() inside the function), but no sound is played. I tried refreshing the page and restarting the browser... nothing.
After some research I found out that the oscillator is thrown away when stop() is called, so I have to create a new oscillator every time. Pretty much all the examples I found revolve around that concept, which is why I am creating it inside the function. But with no errors whatsoever, I can't figure out why it's not working.
So, where is the problem here?
Digging through the documentation, I managed to solve the problem.
Originally, I assumed that the argument passed to osc.stop(2) was the playing time in seconds, something like "play for 2 seconds, then stop".
That's not correct though: the argument is "...the audio context time when the oscillator should stop".
By logging ac.currentTime inside the play() function, I saw that the value when I clicked the button was ~5. So what happened is that by passing 2 to osc.stop(), I was telling the oscillator to stop when the context time is 2, which had already passed!
The solution is simple:
function play() {
    var osc = ac.createOscillator();
    osc.connect(ac.destination);
    osc.start();
    // take into account the current time of the context
    osc.stop(ac.currentTime + 2);
}
Words of wisdom now echo into my brain... R....T.....F...M...

Web Audio API demo doesn't work on iOS

I'm currently adapting this Web Audio API demo for a project I'm working on, but there is no sound when I test on an iPhone. It works fine on the iPad.
I've searched for solutions and found this thread on StackOverflow with the following snippet of one of the answers:
Safari on iOS 6 effectively starts with the Web Audio API muted. It will not unmute until you attempt to play a sound in a user input event (create a buffer source, connect it to destination, and call noteOn()). After this, it unmutes and audio plays unrestricted and as it ought to. This is an undocumented aspect of how the Web Audio API works on iOS 6 (Apple's doc is here, hopefully they update it with a mention of this soon!)
The user input event should be the onclick event on the play button, but changing to use noteOn() instead of start() still doesn't fix it.
Update: I've also tried binding the play button with the touchend event but to no avail.
Here is the function that uses noteOn():
function playNote(buffer, pan, x, y, z, sendGain, mainGain, playbackRate, noteTime) {
    // Create the note
    var voice = context.createBufferSource();
    voice.buffer = buffer;
    voice.playbackRate.value = playbackRate;
    // Optionally, connect to a panner
    var finalNode;
    if (pan) {
        var panner = context.createPanner();
        panner.panningModel = "HRTF";
        panner.setPosition(x, y, z);
        voice.connect(panner);
        finalNode = panner;
    } else {
        finalNode = voice;
    }
    // Connect to dry mix
    var dryGainNode = context.createGain();
    dryGainNode.gain.value = mainGain * effectDryMix;
    finalNode.connect(dryGainNode);
    dryGainNode.connect(masterGainNode);
    // Connect to wet mix
    var wetGainNode = context.createGain();
    wetGainNode.gain.value = sendGain;
    finalNode.connect(wetGainNode);
    wetGainNode.connect(convolver);
    if (iOS) {
        voice.noteOn(noteTime);
    } else {
        voice.start(noteTime);
    }
}
Any suggestions would be greatly appreciated. Thanks.
I feel really stupid. Apparently, if you have your iPhone on vibrate (silent) mode, the sound doesn't play.
The start() method should work fine without the if/else statements on iOS, as long as you call the function from a user interaction event. Also, flip the order you pass y and z to the panner, because z is second for some strange reason.
Here's a working example; change stuff in it to fit what you need. Most of it isn't needed, and I've got others somewhere that use the DOM to add event listeners.
<script>
    var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
    var oscillator = audioCtx.createOscillator();
    var gainNode = audioCtx.createGain();
    oscillator.connect(gainNode);
    gainNode.connect(audioCtx.destination);
    oscillator.type = 'sine';
    oscillator.frequency.value = 440;
    gainNode.gain.value = 1;
</script>
<button onclick="oscillator.start();">play</button>
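On newer browsers the context may also start out suspended until a user gesture, so it can help to resume it inside the same click handler. A minimal sketch of that variation (this resume-then-start pattern is an addition, not part of the original answer):
<!-- Hedged sketch: resume the (possibly suspended) context inside the user gesture, then start. -->
<button onclick="audioCtx.resume().then(function () { oscillator.start(); });">play</button>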
My own experience has been that sometimes the Web Audio API works on iPhones, sometimes it doesn't. Here is a page that worked 5 minutes ago on my iPhone 6s; 1 minute ago it didn't work; now it does again!
http://www.stephenandrewtaylor.net/dna-sonification/
Here is another one that works intermittently; it worked 2 minutes ago and now it doesn't (the animations work, there is just no audio).
http://www.stephenandrewtaylor.net/lie-sonification/
It might have to do with how many tabs are open in Safari; you could try closing some of your open tabs (right now I have 5 tabs, including the lie-sonification page, which worked 2 minutes ago but now doesn't). I am also a novice programmer and I'm sure there are much better ways I could be writing the code.

Webaudio sound stops on Chrome for Android after about 2 minutes

I'm running into an issue with WebAudio on Chrome for Android.
I'm experiencing this on a Samsung Galaxy S3 (GT-I9300) with:
Chrome version 44.0.2403.133
Android 4.3.0
Here is the code I'm using to try and isolate the issue:
var audioContext;
if (window.AudioContext) {
    audioContext = new AudioContext();
}
var startTime = Date.now();
var lastTrigger;
var gain = audioContext.createGain();
gain.gain.value = 1;
gain.connect(audioContext.destination);
var buttonTrigger = document.getElementById('trigger');
buttonTrigger.addEventListener('click', function(event) {
    var oscillator = audioContext.createOscillator();
    oscillator.type = "square";
    oscillator.frequency.value = 100 + (Math.cos(audioContext.currentTime) * 100);
    oscillator.connect(gain);
    oscillator.start(0);
    oscillator.stop(audioContext.currentTime + 0.1);
    lastTrigger = Date.now();
});
var timer = document.getElementById('timer');
setInterval(function() {
    if (lastTrigger) { timer.textContent = Date.now() - lastTrigger; }
}, 1000);
And here it is on jsfiddle
This simply creates an oscillator node and plays it when the button is clicked. On my phone, if I don't click the button for about a minute and a half to two minutes, I no longer get any sound.
There are no errors thrown.
Does anyone have any experience of this issue and a possible workaround?
This issue originally appeared in a much larger app using Phaser to play sounds from a m4a file, so this is not solely to do with the oscillator.
UPDATE
According to the Chromium bug ticket this issue has now been fixed.
After experiencing the same problem on Android, I found a better solution than playing a "dummy sound" every 30 seconds.
Just remember the time when you last played over your context:
var lastPlayed = new Date().getTime();
var audioSource = context.createBufferSource();
audioSource.connect(context.destination);
audioSource.buffer = sb;
audioSource.start(0);
The next time you play a sample/sound, just check how much time has passed and reset the AudioContext if necessary:
if (new Date().getTime() - lastPlayed > 30000) { // time passed since last playing is greater than 30 secs
    context.close();
    context = new AudioContext();
}
For Android this works like a charm.
I think what you're seeing is an auto-shutdown of Web Audio when there's no sound for a while. What happens if you click the button a second time, a second or so after the first? (Web Audio can take some time (order of tens of milliseconds, at least) to restart.)
The suspend()/resume() methods, and looking at context.state, would be helpful here.
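A minimal sketch of that suggestion, assuming the audioContext and button handler from the question (this is just the state-check pattern, not a confirmed fix):
// Check the context state inside the click handler and resume it if needed.
buttonTrigger.addEventListener('click', function () {
    if (audioContext.state === 'suspended') {
        audioContext.resume(); // returns a Promise; the oscillator can be scheduled right after
    }
    // ... then create and start the oscillator as in the original handler
});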
@RaymondToy's comment answers this question. There is a bug in Chrome on Android (at least for the Samsung Galaxy S3/4): Web Audio stops playing sounds after a period of inactivity, where inactivity essentially means silence.
The only workaround I can find is to play some kind of sound at intervals. I experimented with playing a sound every 30 seconds, and that stopped the problem.
I also tried playing some kind of silent noise (a silent audio buffer, a silent part of an m4a audio file, or a muted sound), but none of those solved the problem.
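For illustration, here is a minimal sketch of that keep-alive workaround, assuming the audioContext from the question; the 30-second interval and the low (but non-zero) gain are illustrative values, since, as noted above, a fully silent sound did not help:
// Keep-alive sketch: play a short, very quiet blip at intervals so the output never goes fully idle.
setInterval(function () {
    var osc = audioContext.createOscillator();
    var quiet = audioContext.createGain();
    quiet.gain.value = 0.01;                    // very quiet, but not silent
    osc.connect(quiet);
    quiet.connect(audioContext.destination);
    osc.start(0);
    osc.stop(audioContext.currentTime + 0.05);  // ~50 ms blip
}, 30000);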
