Audio recording is empty on Safari iOS - JavaScript

I've used RecordRTC to record audio and send it to a speech-to-text API.
Somehow, it all works perfectly fine except on Safari iOS.
On Safari iOS, the recording, which I'm retrieving as a base64 string,
is somehow returned empty from the recorder object.
Previous questions about this were answered with "use another library",
yet the RecordRTC docs specifically say it fully supports Safari iOS.
Could you please help me figure out the problem and find a workaround?
My code:
async initMic() {
  let stream = await navigator.mediaDevices.getUserMedia({ video: false, audio: true });
  mic = new RecordRTCPromisesHandler(stream, {
    type: 'audio',
    mimeType: 'audio/wav',
    recorderType: RecordRTC.StereoAudioRecorder,
    sampleRate: 48000,
    numberOfAudioChannels: 1,
  });
},
async sendRecording() {
  let vm = this;
  mic.stopRecording(function() {
    mic.getDataURL(function(dataURL) {
      vm.$store.dispatch('UpdateAudioBase64', dataURL.replace('data:audio/wav;base64,', ''));
      mic.reset();
      vm.$emit('send-recording');
    });
  });
},
** The replace call is meant to strip the base64 header
before sending the string to the speech-to-text API (the API requires headerless base64).
Thank you!

If I'm not mistaken, Apple messed things up again with their restrictive policy.
The problem is that you can't do a lot of things (like setting up a recorder)
without a USER gesture triggering them,
so you should wrap your recorder setup in a click event listener:
the user clicks a button, then your mic = new RecordRTCPromisesHandler(stream, {... etc.
fires and recording starts.
Check this example, where this trick works:
https://github.com/muaz-khan/RecordRTC/blob/master/simple-demos/audio-recording.html
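A minimal sketch of that idea, assuming a button with id "record-btn" (the button id and the startRecording call are my assumptions, not from the original post):

let mic;

document.getElementById('record-btn').addEventListener('click', async () => {
  // Safari iOS only lets capture start from a user gesture, so both
  // getUserMedia and the recorder setup live inside the click handler.
  const stream = await navigator.mediaDevices.getUserMedia({ video: false, audio: true });
  mic = new RecordRTCPromisesHandler(stream, {
    type: 'audio',
    mimeType: 'audio/wav',
    recorderType: RecordRTC.StereoAudioRecorder,
    sampleRate: 48000,
    numberOfAudioChannels: 1,
  });
  await mic.startRecording();
});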
By the way, does your code work in Safari on macOS?

Related

Can't play HTML5 Video with Blob source on iOS devices

I have a React web app that gets the video URL from a server, then requests the video as a blob and tries to play it in an HTML video tag. I'm doing this because the client sometimes has internet issues and videos can stutter while playing; they'd rather wait longer for the video to load and then play it smoothly than have a choppy video. (I'm also saving the blob to IndexedDB as a cache, but that's not related to the issue I'm having now; I'm just adding it as context, and it has been disabled while I try to figure out this iOS problem.)
I have a function to download the video, which then returns the Blob and a URL created from that blob object.
async function downloadVideo(videoUrl) {
  return new Promise(function (resolve, reject) {
    var req = new XMLHttpRequest();
    req.open('GET', videoUrl, true);
    req.responseType = 'blob';
    req.onload = function () {
      // onload fires even on 404, so we need to check the status code
      if (this.status === 200) {
        var videoBlob = new Blob([this.response], { type: 'video/mp4' });
        console.log('Video blob?', videoBlob);
        var vid = { objBlob: videoBlob, vidURL: URL.createObjectURL(videoBlob) };
        // Video is now downloaded and converted into an object URL
        resolve(vid);
      } else {
        reject('Video download failed with status ' + this.status);
      }
    };
    req.onerror = function () {
      reject('Unable to Download Video');
    };
    req.send();
  });
}
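For context, a hypothetical call site (the URL and element are placeholders, not from the original post) might look like:

// inside an async function:
const vid = await downloadVideo('https://example.com/clip.mp4');
videoElement.src = vid.vidURL; // vid.objBlob is what would go into the IndexedDB cache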
And then I have the element that plays the blob video:
<video
muted={true}
autoPlay={true}
loop={false}
onError={err => {
alert('Video load error. ' + err.target.error.iosMessage);
}}
src={downloadedVideo.url}
/>
That downloadedVideo.url is the blob object URL created in the downloadVideo function.
All of this works fine on desktop (Linux) and on Android, but the video doesn't play from the Blob on iOS devices. I've tried Safari, Chrome, and Firefox and the problem is the same.
On iOS I can get the video Blob and create a URL from it, but when I pass it as src it doesn't work; all I can get from the error (a MediaError object) is the code, 4, and the message is undefined.
If I instead pass the original video URL as src, it works on all devices, but then I can't cache the video and this feature will have to be dropped.
I've tried several videos and made sure encoding was compatible with iOS.
I could not find anything stating that iOS is not compatible with Blob URLs for video, so this should work, but I can't figure out why it doesn't.
Save the captured video with the type "video/mp4".
IMPORTANT >>> new Blob(vid, { type: 'video/mp4' })
const blob = new Blob(vid, { type: 'video/mp4' }); // Important line; the Blob constructor is synchronous
I have a similar problem, but only on iOS 15.x (it works fine up to iOS 14.5).
I think it's a bug in iOS 15.x - see also https://developer.apple.com/forums/thread/693447
There is a bug in WebKit on iOS 15 builds: responses to blob URL requests do not honor byte-range headers, which the media stack requires.
See: https://bugs.webkit.org/show_bug.cgi?id=232076 and "Safari on iOS 15 inline video playback issues".
As noted in the WebKit issue, there is a workaround using a service worker:
https://bug-232076-attachments.webkit.org/attachment.cgi?id=442151
Mind, though, that service workers do not work in WKWebView.
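A rough sketch of that service-worker idea (not the exact code from the WebKit attachment; the route name and the getCachedVideoBlob helper are hypothetical): intercept the video request and answer Range requests by slicing the Blob yourself.

// sw.js
self.addEventListener('fetch', (event) => {
  if (!event.request.url.endsWith('/cached-video')) return; // hypothetical route
  event.respondWith(serveVideo(event.request));
});

async function serveVideo(request) {
  const blob = await getCachedVideoBlob(); // hypothetical helper, e.g. reads the cached Blob from IndexedDB
  const range = request.headers.get('Range');
  const match = range && /bytes=(\d+)-(\d*)/.exec(range);
  if (!match) {
    return new Response(blob, { headers: { 'Content-Type': 'video/mp4' } });
  }
  // Answer "Range: bytes=start-end" with 206 Partial Content and the matching slice.
  const start = Number(match[1]);
  const end = match[2] ? Number(match[2]) : blob.size - 1;
  return new Response(blob.slice(start, end + 1), {
    status: 206,
    headers: {
      'Content-Type': 'video/mp4',
      'Content-Range': 'bytes ' + start + '-' + end + '/' + blob.size,
      'Accept-Ranges': 'bytes',
    },
  });
}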

Safari recorded video speed-up issue in Angular 8

I have implemented WebRTC in my Angular project to record video, and after saving we can send it as an attachment. This works properly on Windows, but in Safari on macOS the video is sped up and a 30-second video becomes only 3 seconds. This occurs only in Safari.
Here we start the video:
mediaDevices.getUserMedia({ video: true, audio: true })
  .then(webcamStream => {
    this.webcamStream = webcamStream;
  });
The MediaRecorder code:
this.recorder = new MediaRecorder(this.webcamStream, { mimeType: 'video/mp4' });
this.recorder.onstart = () =>
  this.zone.run(() => {
    this.behaviorService.isRecording(true);
  });
this.recorder.onstop = this.onRecorderStopped;
this.recorder.ondataavailable = (event) =>
  this.zone.run(() => {
    this.data = [...this.data, event.data];
  });
this.recorder.start();
When the video is stopped, it is saved with the video/webm;codecs=h264 mimeType.
I have also tried video/mp4, but that doesn't work either.
Can I get a solution that works on both OSes?
Safari is notoriously broken with respect to .getUserMedia() and the MediaRecorder class.
"Can I get a solution which works on both OSes?"
Not yet. Pester Apple. In the meantime use Chrome on macOS: it works.
There may be some tricks to recommend to make this better, but the place to look is the MediaRecorder configuration: that's where the stream is compressed.
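One defensive trick (a sketch, not a guaranteed fix for the speed-up): let the browser pick a container/codec it actually supports instead of hard-coding one, via MediaRecorder.isTypeSupported:

// Prefer mp4 for Safari, fall back to webm variants elsewhere.
const candidates = ['video/mp4', 'video/webm;codecs=h264', 'video/webm'];
const mimeType = candidates.find(t => MediaRecorder.isTypeSupported(t));
this.recorder = new MediaRecorder(this.webcamStream, mimeType ? { mimeType } : undefined);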

Capture from web camera in HTML

I want to capture video with the web camera.
Here is my current approach:
window.onload = function () {
  var video = document.getElementById('video');
  var videoStreamUrl = false;
  navigator.getUserMedia({ video: true }, function (stream) {
    videoStreamUrl = window.URL.createObjectURL(stream);
    video.src = videoStreamUrl;
  }, function () {
    console.log('error');
  });
};
but it produces a deprecation warning in the browser:
[Deprecation] URL.createObjectURL with media streams is deprecated and will be removed in M68, around July 2018. Please use HTMLMediaElement.srcObject instead. See https://www.chromestatus.com/features/5618491470118912 for more details.
How do I use HTMLMediaElement.srcObject for my purposes? Thanks for your time!
MediaElement.srcObject should allow Blobs, MediaSources and MediaStreams to be played in the MediaElement without the need to keep these sources bound in memory for the lifetime of the document, like blob URIs do.
(Currently no browser supports anything other than MediaStream, though...)
Indeed, when you call URL.createObjectURL(MediaStream), you are telling the browser that it should keep this source alive until you revoke the blob URI, or until the document dies.
In the case of a LocalMediaStream served from a capturing device (camera or microphone), this also means that the browser has to keep the connection to this device open.
Firefox initiated the deprecation of this feature a year or so ago, since srcObject can provide the same result in better ways that are easier for everyone to handle, and Chrome now seems to be following (I'm not sure what the spec's status on this is).
So to use it, simply do
MediaElement.srcObject = MediaStream;
Also note that the API you are using is itself deprecated (and not only in FF), and you shouldn't use it anymore. The correct API to capture MediaStreams from user media is MediaDevices.getUserMedia.
This API returns a Promise which resolves to the MediaStream.
So a complete correction of your code would be:
var video = document.getElementById('video');
navigator.mediaDevices.getUserMedia({
    video: true
  })
  .then(function (stream) {
    video.srcObject = stream;
  })
  .catch(function (error) {
    console.log('error', error);
  });

<video id="video"></video>
Or as a fiddle, since the StackSnippets® overprotected iframe may not deal well with gUM.

Sound analysis without getUserMedia

I am trying to analyse the audio output from the browser, but I don't want the getUserMedia prompt to appear (which asks for microphone permission).
The sound sources are SpeechSynthesis and an MP3 file.
Here's my code:
return navigator.mediaDevices.getUserMedia({
    audio: true
  })
  .then(stream => new Promise(resolve => {
    const track = stream.getAudioTracks()[0];
    this.mediaStream_.addTrack(track);
    this._source = this.audioContext.createMediaStreamSource(this.mediaStream_);
    this._source.connect(this.analyser);
    this.draw(this);
  }));
This code is working fine, but it's asking for permission to use the microphone! I am not interested in the microphone at all; I only need to gauge the audio output. If I check all available devices:
navigator.mediaDevices.enumerateDevices()
  .then(function (devices) {
    devices.forEach(function (device) {
      console.log(device.kind + ": " + device.label +
        " id = " + device.deviceId);
    });
  });
I get a list of available devices in the browser, including 'audiooutput'.
So, is there a way to route the audio output into a media stream that can then be used inside the 'createMediaStreamSource' function?
I have checked all the documentation for the Web Audio API but could not find it.
Thanks to anyone who can help!
There are various ways to get a MediaStream that does not originate from gUM, but you won't be able to catch all possible audio output...
For your mp3 file: if you play it through a MediaElement (<audio> or <video>), and if this file is served without breaking CORS, then you can use MediaElement.captureStream.
If you read it through the Web Audio API, or if you target browsers that don't support captureStream, then you can use AudioContext.createMediaStreamDestination.
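A minimal sketch of those two routes (the element id is an assumption; use one option or the other, not both at once):

const audioEl = document.getElementById('player'); // same-origin <audio> playing the mp3
const ctx = new AudioContext();
const analyser = ctx.createAnalyser();

// Option A: captureStream() on the media element (where supported).
const elementStream = audioEl.captureStream();
ctx.createMediaStreamSource(elementStream).connect(analyser);

// Option B: route a Web Audio source into a MediaStreamAudioDestinationNode,
// whose .stream property is a regular MediaStream:
// const source = ctx.createMediaElementSource(audioEl);
// const dest = ctx.createMediaStreamDestination();
// source.connect(dest);
// ctx.createMediaStreamSource(dest.stream).connect(analyser);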
For SpeechSynthesis, unfortunately, you will need gUM... and a Virtual Audio Device: first you would have to set your default output to VAB_out, then route VAB_out to VAB_in, and finally grab VAB_in from gUM...
Not an easy nor universally doable task, especially since, IIRC, SpeechSynthesis doesn't have any setSinkId method.

Stop/Close webcam using getUserMedia and RTCPeerConnection Chrome 25

I'm on Chrome 25, successfully using getUserMedia and RTCPeerConnection to connect audio from a web page to another party, but I'm unable to stop the red blinking indicator icon in the Chrome tab that signals media is being used on that page. My question is essentially a duplicate of Stop/Close webcam which is opened by navigator.getUserMedia, except that the resolution there isn't working. If I have a page that just uses getUserMedia with no remote media (no peer), then stopping the camera turns off the blinking tab indicator. Adding remote streams seems to be a/the issue. Here's what I currently have for my "close" code:
if (localStream) {
  if (peerConnection && peerConnection.removeStream) {
    peerConnection.removeStream(localStream);
  }
  if (localStream.stop) {
    localStream.stop();
  }
  localStream.onended = null;
  localStream = null;
}
if (localElement) {
  localElement.onerror = null;
  localElement.pause();
  localElement.src = undefined;
  localElement = null;
}
if (remoteStream) {
  if (peerConnection && peerConnection.removeStream) {
    peerConnection.removeStream(remoteStream);
  }
  if (remoteStream.stop) {
    remoteStream.stop();
  }
  remoteStream.onended = null;
  remoteStream = null;
}
if (remoteElement) {
  remoteElement.onerror = null;
  remoteElement.pause();
  remoteElement.src = undefined;
  remoteElement = null;
}
if (peerConnection) {
  peerConnection.close();
  peerConnection = null;
}
I've tried with and without the removeStream() call, with and without the stop() call, and with element.src = '' and element.src = null; I'm running out of ideas. Does anyone know if this is a bug, or my error in the use of the API?
EDIT: I set my default device (using Windows) to a camera that has a light when it's in use, and upon stopping, the camera light goes off, so perhaps this is a Chrome bug. I also discovered that if I use chrome://settings/content to change the microphone device to anything other than "Default", Chrome audio fails altogether. And finally, I realized that using element.src=undefined resulted in Chrome attempting to load a resource and throwing a 404 so that's clearly not correct... so back to element.src='' on that.
Ended up being my fault (yes, shocking). It turns out I wasn't saving localStream correctly in the onUserMediaSuccess callback of getUserMedia... once that was set, Chrome turned off the blinking recording icon. That doesn't explain the other anomalies, but it closes the main point of the question.
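For illustration, a sketch of the fix described above (the surrounding code wasn't shown, so the callback names and the prefixed Chrome-25-era API shape are my assumptions):

navigator.webkitGetUserMedia({ audio: true, video: true },
  function onUserMediaSuccess(stream) {
    localStream = stream; // saving this reference is the fix; without it the close code above is a no-op
  },
  function onUserMediaError(err) {
    console.log('getUserMedia error', err);
  });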
I just got this working yesterday after trawling through the WebRTC specification. I don't know if this is the "right" way to do it, but I found that renegotiating the PeerConnection with a new offer after removing the stream did the trick.
var pc = peerConnections[socketId];
pc.removeStream(stream);
pc.createOffer(
  function (session_description) {
    pc.setLocalDescription(session_description);
    _socket.send(JSON.stringify({
      "eventName": "send_offer",
      "data": {
        "socketId": socketId,
        "sdp": session_description
      }
    }));
  },
  function (error) {},
  defaultConstraints
);
