My tablet is running Chrome 52.0.2743.98 but will not output sound when I go to this Web Audio example page.
When I inspect the audio context in the console, I can see that the currentTime is always 0.
Pasting the following code from MDN also produces no sound:
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var oscillator = audioCtx.createOscillator();
oscillator.type = 'square';
oscillator.frequency.value = 3000; // value in hertz
oscillator.connect(audioCtx.destination);
oscillator.start();
These two examples work well on my laptop with Chrome 52.0.2743.116.
How can I get Chrome to output sound from the Web Audio API?
For Chrome on Android, my recollection is that audio will only start if it is triggered by a user interaction (e.g. a touch or click event). See also https://bugs.chromium.org/p/chromium/issues/detail?id=178297
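A minimal sketch of that approach, assuming a hypothetical button with id "play" on the page (the button is an assumption, not from the original):
// Create the context and start the oscillator only inside the click
// handler, so Chrome on Android treats the audio as user-initiated.
document.querySelector('#play').addEventListener('click', function () {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var oscillator = audioCtx.createOscillator();
  oscillator.type = 'square';
  oscillator.frequency.value = 3000; // value in hertz
  oscillator.connect(audioCtx.destination);
  oscillator.start();
});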
Related
I'm working on a music web app that has a piano keyboard. When a user presses a piano key, I'm using OscillatorNode to play a brief tone corresponding to the key:
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
function playNote(note) {
  const freq = notes[note];
  console.debug(note + " (" + freq + " Hz)");
  const oscillator = audioCtx.createOscillator(); // create Oscillator node
  oscillator.type = wavetypeEl.val(); // triangle wave by default
  oscillator.frequency.setValueAtTime(freq, audioCtx.currentTime); // freq = value in hertz
  oscillator.connect(audioCtx.destination);
  oscillator.start();
  oscillator.stop(audioCtx.currentTime + 0.5); // stop after half a second
}
$('#keyboard button').on('click', (e) => {
  playNote(e.target.dataset.note);
});
This works on all the desktop and Android browsers I've tried, but iOS stubbornly refuses to play any sound. I see that I need a user interaction to "unlock" an AudioContext on iOS, but I would have thought calling playNote() from my click function would have done the trick.
According to Apple, I should be able to use noteOn() on my oscillator object, instead of oscillator.start() the way I've got it in my example. But that doesn't seem to be a valid method.
I must be missing something simple here. Anybody know?
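A minimal sketch of explicitly resuming the context from inside the click handler, in case it starts out suspended on iOS (this assumes the webkit-prefixed context there supports resume(), which newer iOS versions do; whether it fixes this particular case is an assumption):
$('#keyboard button').on('click', (e) => {
  // On iOS the context may begin in the 'suspended' state;
  // resume() inside a user gesture is the standard way to unlock it.
  audioCtx.resume().then(() => playNote(e.target.dataset.note));
});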
If everything seems to be working fine, it could be that the device itself is muted. For some reason (or for no reason at all), Safari doesn't play any sound coming from the Web Audio API when the device is muted, but it plays everything else.
There are some hacky ways to circumvent this bug, which basically work by playing something with an audio element first, before using the Web Audio API.
unmute-ios-audio, for example, is a library which implements the hack mentioned above.
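A minimal sketch of that hack, assuming a hypothetical short silent file silence.mp3 is served alongside the page and reusing the #keyboard handler from the question:
let unlocked = false;
$('#keyboard button').on('click', (e) => {
  if (!unlocked) {
    // Playing an HTMLAudioElement first nudges iOS into routing
    // Web Audio output even when the hardware mute switch is on.
    new Audio('silence.mp3').play().catch(() => {});
    unlocked = true;
  }
  playNote(e.target.dataset.note);
});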
I have an iPhone XS Max and the demo works on it, producing sound, just as you have it right now... I've also read that for iOS the element needs both an onClick handler and a style of {cursor: pointer} set to work properly (as of a few years ago), but it seems like it's working regardless.
I'm developing an application that allows voice recording in the web browser. This is working fine on most browsers but I have some issues with iOS Safari.
Below you can find an extract of the code, it is not complete but it gives an idea of what's going on.
//Triggered when the user clicks on a button that starts the recording
function startRecording() {
  //Create a new audio context
  let audioContext = new (window.AudioContext || window.webkitAudioContext)();
  //Hack: make the polyfilled media recorder re-use the audioContext
  window.NewMediaRecorder.changeAudioContext(audioContext);
  navigator.mediaDevices.enumerateDevices().then(function (devices) {
    console.log('loaded audio devices');
    console.log(devices);
    devices = devices.filter((d) => d.kind === 'audioinput');
    console.log(devices);
    console.log('chosen device: ' + devices[0].deviceId);
    navigator.mediaDevices.getUserMedia({
      audio: {
        deviceId: {
          exact: devices[0].deviceId
        }
      }
    }).then(function (stream) {
      console.log(stream);
      let recorder = new NewMediaRecorder(stream);
      recorder.addEventListener('dataavailable', function (e) {
        document.getElementById('ctrlAudio').src = URL.createObjectURL(e.data);
      });
      recorder.start();
      console.log('stop listening after 15 seconds');
      setTimeout(function () {
        console.log('15 seconds passed');
        console.log('Force stop listening');
        recorder.stop();
        recorder.stream.getTracks()[0].stop();
      }, 15000);
    });
  });
}
For the record, I'm using audio-recorder-polyfill (https://ai.github.io/audio-recorder-polyfill/) to do the recording, as MediaRecorder is not yet available in Safari.
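For reference, the polyfill's documented setup is a one-line swap (a sketch; the ES-module import assumes a bundler, and how NewMediaRecorder wraps this is project-specific):
import AudioRecorder from 'audio-recorder-polyfill';

// Replace the native MediaRecorder (missing in Safari) with the polyfill.
window.MediaRecorder = AudioRecorder;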
The recorder works fine in all browsers I tried (including OS X Safari), yet on iOS Safari it only records noise. If I set the volume of my speakers at the maximum level I can hear myself speak, but as if from very far away. All the online dictaphones/recorders that I found have the same issue: they always record noise. (Tested with an iPhone 5S, 5SE and X, all up to date.)
I'm a bit desperate because I already did a lot of research, but I didn't find any solution to this issue.
As required, the AudioContext is created on a user event (in this case a touch on a button).
I even tried changing the gain, but that didn't help. Trying to access the audio without specifying a media device doesn't help either:
navigator.mediaDevices.getUserMedia({audio: true})
I created an audio object and want to play it when the user leaves the window. So, my code is here:
$('body').on('mouseleave', function(){
  var audio = new Audio('quite-impressed.mp3')
  audio.play()
})
It works well in Firefox. It also works in Chrome if I click in the page and then move the mouse outside of the body. But when I leave the mouse without clicking in the page, an error shows in the console and the audio does not play:
Uncaught (in promise) DOMException: play() failed because the user didn't interact with the document first. https://developers.google.com/web/updates/2017/09/autoplay-policy-changes
But in this example site it works fine without interacting with the page. How can I make that possible? Thanks in advance.
It seems they use an AudioContext to play that sound.
Chrome did back down a few months ago on blocking the AudioContext API, because a lot of legitimate uses were not prepared for such a restriction and thus got broken by it.
But M71, which will be released in December 2018, will re-enable that restriction; you can read about it here: https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#webaudio
// this will work in Chrome < 70 but not after
onmouseout = e => {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.connect(ctx.destination);
  osc.start(0);
  osc.stop(1);
}
Outsourced live example, since Stack Snippets are always granted the user gesture anyway: https://jsfiddle.net/zy3ev8ka/
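To comply with the policy once it is enforced, the usual pattern is to create the context up front and resume it from the first genuine user gesture; a minimal sketch (the choice of a click listener is an assumption):
const ctx = new AudioContext(); // starts 'suspended' under the policy
document.addEventListener('click', function unlock() {
  // resume() inside a real user gesture lifts the restriction
  ctx.resume();
  document.removeEventListener('click', unlock);
});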
Try this:
window.audio = new Audio('quite-impressed.mp3')
$('body').on('mouseleave', function(){
  audio.play()
})
This worked for me on Chrome 77:
In the address bar, go to: chrome://settings/content/sound
Turn off "Allow sites to play sound (recommended)"
Turn it on again
If your JS play function is not running and your code is correct, then this may help: you have to allow your browser to play that sound file.
Just go to Settings => Privacy and security => Site settings => Sound,
and then add your local URL in the "Allowed to play sound" section.
I want to develop a web app for mobile phones that records audio from the microphone and plays music at the same time.
With getUserMedia() I get the stream and create a MediaStreamSource in my AudioContext. At the same time I create a BufferSource which plays music. In desktop Chrome this setup works. But when I start the same web app in Chrome on my Nexus 5 and allow it to use the microphone, the music is muted.
Success Callback for getUserMedia:
function gotStream(stream) {
  // Route the microphone stream into the graph and meter its level.
  mediaStreamSource = audioContext.createMediaStreamSource(stream);
  meter = createAudioMeter(audioContext);
  mediaStreamSource.connect(meter);
  info = document.getElementById('info');
  outputData();
}
Play Music Function:
function playSound(buffer) {
  // Play a decoded AudioBuffer through a gain node to the speakers.
  source = audioContext.createBufferSource();
  source.buffer = buffer;
  gainNode = audioContext.createGain();
  source.connect(gainNode);
  gainNode.connect(audioContext.destination);
  source.start(0);
}
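For context, roughly how these two functions might be wired together (the music.mp3 URL and this exact glue code are assumptions, not from the original):
// Ask for the microphone and hand the stream to gotStream().
navigator.mediaDevices.getUserMedia({ audio: true }).then(gotStream);

// Fetch and decode the music, then hand the buffer to playSound().
fetch('music.mp3')
  .then((response) => response.arrayBuffer())
  .then((data) => audioContext.decodeAudioData(data, playSound));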
Should that be the expected behaviour or am I doing something wrong?
I'm talking about feedback: when you make a simple JavaScript application that opens a stream from the user and reads the frequency analysis (or whatever it is), it throws all received data back to the headphones in both Google Chrome and Opera. Firefox is silent most of the time and randomly creates a huge mess of unstable feedback; it also closes the stream after a few seconds. Generally the thing doesn't work in Firefox yet.
I created a fiddle. If your browser doesn't support it, you'll just get an error in the console, I assume.
The critical part of the code is the function that is called when user accepts the request for the microphone access:
//Not sure why I do this
var inputPoint = context.createGain();
// Create an AudioNode from the stream.
var source = context.createMediaStreamSource(stream);
source.connect(inputPoint);
//Analyser - this converts raw data into a spectral analysis
window.analyser = context.createAnalyser();
//More stuff I know nothing about
analyser.fftSize = 2048;
//Sounds much like connecting nodes in MatLab, doesn't it?
inputPoint.connect(analyser);
analyser.connect(context.destination);
///THIS should probably make the sound silent (gain: 0) but it doesn't
var zeroGain = context.createGain();
zeroGain.gain.value = 0.0;
//More connecting... have you already lost track of which node is which? Because I have.
inputPoint.connect(zeroGain);
zeroGain.connect(context.destination);
The zero-gain idea is not mine; I stole it from a simple sound recorder demo. But what works for them doesn't work for me.
That demo also has none of the problems in Firefox that I do.
In function mediaGranted(stream) {...}, comment out Fiddle line #46:
//analyser.connect(context.destination);
More info: https://mdn.mozillademos.org/files/5081/WebAudioBasics.png
Nice demo: http://mdn.github.io/voice-change-o-matic/
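In other words, keep the analyser for reading data but don't route it to the speakers. A minimal sketch of the corrected graph (variable names follow the fiddle):
var inputPoint = context.createGain();
var source = context.createMediaStreamSource(stream);
source.connect(inputPoint);

window.analyser = context.createAnalyser();
analyser.fftSize = 2048;
inputPoint.connect(analyser);
// Note: no analyser.connect(context.destination) here. The analyser
// still exposes data via getByteFrequencyData()/getByteTimeDomainData()
// without being routed to the speakers, so the feedback loop is gone.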