Why is audio visualization of a livestream not working on mobile / Safari? - javascript

I'm trying to make an audio livestream visualizer based on the three.js example:
https://threejs.org/examples/?q=visua#webaudio_visualizer
It does not work in desktop Safari or on iPhone (in either Safari or Chrome).
Using an mp3 file instead of a livestream works on all devices.
var listener = new THREE.AudioListener();
var audio = new THREE.Audio( listener );

// not working on iPhone (Chrome or Safari) or desktop Safari:
var mediaElement = new Audio( 'https://c2.radioboss.fm:18071/stream' );

// this works everywhere:
// var mediaElement = new Audio( 'https://raw.githubusercontent.com/zadvorsky/three.bas/master/examples/_audio/song.mp3' );

mediaElement.crossOrigin = "anonymous";
mediaElement.loop = true;
mediaElement.play();

audio.setMediaElementSource( mediaElement );
analyser = new THREE.AudioAnalyser( audio, fftSize );
https://codepen.io/pesinasiller/pen/Pvevry
(lines 23-24)
There is no error message, but the audio data from the analyser is always 0 on mobile.
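One thing worth checking (an assumption, not a confirmed fix): iOS and desktop Safari start the AudioContext suspended and only allow audio that is triggered by a user gesture, so a play() issued during page load can be silently blocked while the analyser keeps reading zeros. A minimal sketch that resumes the listener's context and starts the stream from a click (the startButton element is hypothetical):
var startButton = document.getElementById('startButton'); // hypothetical button
startButton.addEventListener('click', function () {
    // listener.context is the AudioContext that three.js created internally
    listener.context.resume().then(function () {
        // calling play() inside the gesture handler satisfies the autoplay policy
        mediaElement.play();
    });
});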

Related

MediaRecorder.start() failing without errors on iOS Chrome and Safari. Working on desktop Chrome / FF

I have a mobile website that takes a few pictures (environment cam) and records a short video (user cam). On desktop, everything works fine. On mobile, the camera feed is shown in both Chrome and Safari, and taking pictures also works, but when I try to start recording, the page does not execute any JavaScript code after mediaRecorder.start(1000). This means the instructions are not shown and the video never stops recording.
Code:
async function start() {
    var constraints = { video: { width: { ideal: 4096 }, height: { ideal: 2160 }, facingMode: 'user' } };
    cameraStream = await navigator.mediaDevices.getUserMedia(constraints);
    video.srcObject = cameraStream;
    video.play();
    mediaRecorder = new MediaRecorder(cameraStream, { mimeType: 'video/webm' });
    mediaRecorder.addEventListener('dataavailable', function(e) {
        chunks.push(e.data);
    });
}
function startRecording() {
    console.log("starting recording");
    takePicture();
    console.log("Selfie taken");
    outline.style.display = 'none';
    button.style.display = 'none';
    // "Follow the instructions on the screen."
    text.innerText = "Volg de instructies op het scherm.";
    // WORKS FINE TILL HERE
    mediaRecorder.start(1000);
    // BELOW THIS IS NEVER EXECUTED
    console.log("setting timeout");
    setTimeout(step, 2000);
}
As said, it works on desktop, but not on iOS chrome or Safari.
Chrome and Safari on iOS are unfortunately more or less the same. Apple only allows their own browser engine on iOS and Chrome plays by those rules.
The MediaRecorder in Safari doesn't support 'video/webm', which is why I guess an error is already thrown when you construct the MediaRecorder. Consequently, the mediaRecorder variable is undefined when you try to call start() later on.
It probably works if you let Safari (or Chrome on iOS) pick the mimeType itself by omitting the configuration.
mediaRecorder = new MediaRecorder(cameraStream);
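If you do want an explicit container where the browser supports one, you can probe with MediaRecorder.isTypeSupported() first and only fall back to the browser default. A sketch of that approach (the candidate list is just an example):
function createRecorder(stream) {
    // Safari typically rejects 'video/webm', so probe before constructing
    var candidates = ['video/webm;codecs=vp9', 'video/webm', 'video/mp4'];
    for (var i = 0; i < candidates.length; i++) {
        if (MediaRecorder.isTypeSupported(candidates[i])) {
            return new MediaRecorder(stream, { mimeType: candidates[i] });
        }
    }
    // nothing matched: let the browser pick its own default
    return new MediaRecorder(stream);
}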

Can't play HTML5 Video with Blob source on iOS devices

I have a React web app that gets a video URL from a server, then requests the video as a blob and tries to play it in an HTML video tag. I'm doing this because the client sometimes has internet issues and videos can stutter while playing; they'd rather wait longer for the video to load and then have it play smoothly than watch a choppy video. (I'm also saving the blob to IndexedDB as a cache, but that's not related to the issue here; I mention it as context, and it has been disabled while I try to figure out this iOS problem.)
I have a function to download the video, which then returns the Blob and a URL created from that blob object.
async function downloadVideo(videoUrl) {
    return new Promise(function(resolve, reject) {
        var req = new XMLHttpRequest();
        req.open('GET', videoUrl, true);
        req.responseType = 'blob';
        req.onload = function() {
            // onload is triggered even on a 404,
            // so we need to check the status code
            if (this.status === 200) {
                var videoBlob = new Blob([this.response], { type: 'video/mp4' });
                console.log('Video blob?', videoBlob);
                var vid = { objBlob: videoBlob, vidURL: URL.createObjectURL(videoBlob) };
                // Video is now downloaded and converted into an object URL
                resolve(vid);
            } else {
                reject('Video download failed with status ' + this.status);
            }
        };
        req.onerror = function() {
            reject('Unable to Download Video');
        };
        req.send();
    });
}
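(For reference, the same download can be written more compactly with fetch; this sketch is equivalent in behavior and not a fix for the iOS issue:)
async function downloadVideo(videoUrl) {
    const response = await fetch(videoUrl);
    if (!response.ok) {
        throw new Error('Video download failed with status ' + response.status);
    }
    const videoBlob = await response.blob();
    return { objBlob: videoBlob, vidURL: URL.createObjectURL(videoBlob) };
}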
And then I have the element that plays the blob video:
<video
    muted={true}
    autoPlay={true}
    loop={false}
    onError={err => {
        alert('Video load error. ' + err.target.error.iosMessage);
    }}
    src={downloadedVideo.url}
/>
That downloadedVideo.url is the blob object URL created in the downloadVideo function.
All of this works fine on desktop (Linux) and on Android, but the video doesn't play from the Blob on iOS devices. I've tried Safari, Chrome, and Firefox and the problem is the same.
On iOS I can get the video Blob and create a URL from it, but when I pass it as src it doesn't work. All I can get from the error (a MediaError object) is the code, 4; the message is undefined.
If instead of the blob I pass the original video URL as src, it works on all devices, but then I can't cache the video and this feature will have to be dropped.
I've tried several videos and made sure encoding was compatible with iOS.
I could not find anything stating that iOS is not compatible with Blob URLs for video, so this should work, but I can't figure out why it doesn't.
Save the captured video with the type "video/mp4". The important part is the type option passed to the Blob constructor:
const blob = new Blob(vid, { type: 'video/mp4' }); // important line; the Blob constructor is synchronous, so no await is needed
I have a similar problem, but only on iOS 15.x (it works fine up to iOS 14.5).
I think it's a bug in iOS 15.x; see also https://developer.apple.com/forums/thread/693447
There is a bug in WebKit on iOS 15 builds that affects byte-range requests for blob URLs.
See: https://bugs.webkit.org/show_bug.cgi?id=232076 and Safari on iOS 15 inline video playback issues
As noted in the WebKit issue, there is a workaround using a service worker.
https://bug-232076-attachments.webkit.org/attachment.cgi?id=442151
Mind though, service workers do not work in WKWebView.
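The linked workaround boils down to serving the blob from a service-worker-controlled URL that answers Range requests itself. A condensed sketch of the idea (untested; the /virtual/video.mp4 path and the SET_VIDEO message type are made up for illustration):
// sw.js
let videoBlob = null;

self.addEventListener('message', (event) => {
    // the page posts the downloaded Blob to the service worker
    if (event.data && event.data.type === 'SET_VIDEO') {
        videoBlob = event.data.blob;
    }
});

self.addEventListener('fetch', (event) => {
    const url = new URL(event.request.url);
    if (url.pathname !== '/virtual/video.mp4' || !videoBlob) return;

    const range = event.request.headers.get('range');
    if (!range) {
        event.respondWith(new Response(videoBlob, {
            headers: { 'Content-Type': 'video/mp4' }
        }));
        return;
    }

    // answer "Range: bytes=start-end" with a 206 partial response
    const match = /bytes=(\d+)-(\d*)/.exec(range);
    const start = Number(match[1]);
    const end = match[2] ? Number(match[2]) : videoBlob.size - 1;
    const chunk = videoBlob.slice(start, end + 1);
    event.respondWith(new Response(chunk, {
        status: 206,
        headers: {
            'Content-Type': 'video/mp4',
            'Content-Range': 'bytes ' + start + '-' + end + '/' + videoBlob.size,
            'Content-Length': String(chunk.size)
        }
    }));
});
The page would then register sw.js, post the downloaded blob with navigator.serviceWorker.controller.postMessage({ type: 'SET_VIDEO', blob: videoBlob }), and point the video src at /virtual/video.mp4 instead of a blob URL.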

Safari recorded video speed-up issue in Angular 8

I have implemented WebRTC in my Angular project to record video; after saving, the recording can be sent as an attachment. This works properly on Windows, but in Safari on macOS the video is sped up: a 30-second video becomes only 3 seconds. This occurs only in Safari.
Here is how I start the video:
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(webcamStream => {
        this.webcamStream = webcamStream;
    });
The MediaRecorder code:
this.recorder = new MediaRecorder(this.webcamStream, { mimeType: 'video/mp4' });
this.recorder.onstart = () =>
    this.zone.run(() => {
        this.behaviorService.isRecording(true);
    });
this.recorder.onstop = this.onRecorderStopped;
this.recorder.ondataavailable = (event) =>
    this.zone.run(() => {
        this.data = [...this.data, event.data];
    });
this.recorder.start();
When the video is stopped, it is saved with the mimeType video/webm;codecs=h264.
I have also tried video/mp4, but that doesn't work either.
Can I get a solution that works on both operating systems?
Safari is notoriously broken with respect to .getUserMedia() and the MediaRecorder class.
Can I get any solution which works in both OS?
Not yet. Pester Apple. In the meantime use Chrome on macOS: it works.
There may be some tricks to recommend to make this better; the MediaRecorder configuration is where the stream gets compressed, so that's the place to look.

iOS Safari: audio only recording noise

I'm developing an application that allows voice recording in the web browser. This is working fine on most browsers but I have some issues with iOS Safari.
Below is an extract of the code; it is not complete, but it gives an idea of what's going on.
// Triggered when the user clicks a button that starts the recording
function startRecording() {
    // Create a new audio context
    let audioContext = new (window.AudioContext || window.webkitAudioContext)();
    // Hack: make the polyfill media recorder re-use the audioContext
    window.NewMediaRecorder.changeAudioContext(audioContext);
    navigator.mediaDevices.enumerateDevices().then(function (devices) {
        console.log('loaded audio devices');
        console.log(devices);
        devices = devices.filter((d) => d.kind === 'audioinput');
        console.log(devices);
        console.log('chosen device: ' + devices[0].deviceId);
        navigator.mediaDevices.getUserMedia({
            audio: {
                deviceId: {
                    exact: devices[0].deviceId
                }
            }
        }).then(function (stream) {
            console.log(stream);
            let recorder = new NewMediaRecorder(stream);
            recorder.addEventListener('dataavailable', function (e) {
                document.getElementById('ctrlAudio').src = URL.createObjectURL(e.data);
            });
            recorder.start();
            console.log('stop listening after 15 seconds');
            setTimeout(function () {
                console.log('15 seconds passed');
                console.log("Force stop listening");
                recorder.stop();
                recorder.stream.getTracks()[0].stop();
            }, 15000);
        });
    });
}
For the record, I'm using the audio recorder polyfill (https://ai.github.io/audio-recorder-polyfill/) to achieve recording, as MediaRecorder is not yet available in Safari.
The recorder works fine in all browsers (including Safari on OS X), yet on iOS Safari it only records noise. If I set the volume of my speakers to the maximum level I can hear myself speak, but as if from very far away.
All the online dictaphones/recorders that I found have the same issue: they always record noise. (Tested with an iPhone 5S, 5SE and X, all up to date.)
I'm a bit desperate because I have already done a lot of research, but I didn't find any solution for this issue.
As required, the AudioContext is created on a user event (in this case a touch on a button).
I even tried to change the gain, but that didn't help.
Trying to access the audio without specifying a media device doesn't help either:
navigator.mediaDevices.getUserMedia({ audio: true })
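One avenue worth ruling out (an assumption on my part, not a verified fix): iOS has been reported to run the AudioContext at a sample rate different from the microphone's, which can turn recordings into distorted noise. Creating the context only after the microphone is live, and logging both rates, would confirm or exclude that:
navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    // create the context after the mic is live so its sampleRate can match the hardware
    let audioContext = new (window.AudioContext || window.webkitAudioContext)();
    let track = stream.getAudioTracks()[0];
    console.log('context sample rate:', audioContext.sampleRate);
    console.log('track settings:', track.getSettings());
    window.NewMediaRecorder.changeAudioContext(audioContext);
    let recorder = new NewMediaRecorder(stream);
    recorder.start();
});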

No sound from Web Audio in Chrome (currentTime always 0)

My tablet is running Chrome 52.0.2743.98 but will not output sound when I go to this Web Audio example page.
When I inspect the audio context in the console, I can see that the currentTime is always 0.
Pasting the following code from MDN also produces no sound:
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var oscillator = audioCtx.createOscillator();
oscillator.type = 'square';
oscillator.frequency.value = 3000; // value in hertz
oscillator.connect(audioCtx.destination);
oscillator.start();
These two examples work well on my laptop with Chrome 52.0.2743.116.
How can I get Chrome to output sound from the Web Audio API?
For Chrome on Android, my recollection is that audio will only start if it is attached to a user interaction (e.g. a touch or click event). See also https://bugs.chromium.org/p/chromium/issues/detail?id=178297
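A minimal sketch of tying the oscillator to a gesture (the play button element is hypothetical):
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
document.getElementById('play').addEventListener('click', function () {
    // resume() unblocks a context that the autoplay policy left suspended
    audioCtx.resume().then(function () {
        var oscillator = audioCtx.createOscillator();
        oscillator.type = 'square';
        oscillator.frequency.value = 3000; // value in hertz
        oscillator.connect(audioCtx.destination);
        oscillator.start();
    });
});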
