I used WebRTC, Node.js, and React to build a fully functional video-conferencing app that supports up to 4 users over a mesh architecture. I then added a record-meeting feature, but it only records my own microphone audio; the remote streams' audio never makes it into the MediaRecorder. Why is that?
Here is a simple code snippet showing how I get my tab screen stream:
const toBeRecordedStream = await navigator.mediaDevices.getDisplayMedia({
  video: {
    width: 1920,
    height: 1080,
    frameRate: {
      max: 30,
      ideal: 24,
    },
  },
  audio: true,
});
After receiving the tab stream, I used an AudioContext to combine the tab audio with my microphone audio and record the result:
const vp9Codec = "video/webm;codecs=vp9,opus";
const vp9Options = {
  mimeType: vp9Codec,
};
const audioCtx = new AudioContext();
const outputStream = new MediaStream();
const micStream = audioCtx.createMediaStreamSource(localStream);
const screenAudio = audioCtx.createMediaStreamSource(screenStream);
const destination = audioCtx.createMediaStreamDestination();
screenAudio.connect(destination);
micStream.connect(destination);
outputStream.addTrack(screenStream.getVideoTracks()[0]);
outputStream.addTrack(destination.stream.getAudioTracks()[0]);
if (MediaRecorder.isTypeSupported(vp9Codec)) {
  mediaRecorder = new MediaRecorder(outputStream, vp9Options);
} else {
  mediaRecorder = new MediaRecorder(outputStream);
}
mediaRecorder.ondataavailable = handleDataAvailable;
mediaRecorder.start();
All four video and audio streams are visible on screen, but only my voice and the tab's video end up in the recording. I am working in Chrome because I am aware that Firefox does not support tab audio capture, while Chrome and Edge do.
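This symptom usually means only the microphone and tab sources were ever connected to the AudioContext destination; audio arriving over the peer connections has to be routed in as well. Below is a minimal sketch of that idea, not the original code: the function name is hypothetical, `audioCtx` is an AudioContext, and `streams` is assumed to be an array holding the local, tab, and remote MediaStreams.

```javascript
// Sketch: mix every available audio source (mic, tab, remote peers) into
// one destination so the recorder hears them all. `audioCtx` is an
// AudioContext-like object; `streams` is an array of MediaStream-like
// objects (local mic, tab capture, and each remote peer's stream).
function mixStreamsForRecording(audioCtx, streams) {
  const destination = audioCtx.createMediaStreamDestination();
  for (const s of streams) {
    // Skip video-only streams; createMediaStreamSource throws on those.
    if (s.getAudioTracks().length > 0) {
      audioCtx.createMediaStreamSource(s).connect(destination);
    }
  }
  return destination.stream; // a single mixed audio track for the recorder
}
```

With real objects this would be called as `mixStreamsForRecording(audioCtx, [localStream, screenStream, ...remoteStreams])`, and the resulting stream's audio track added to `outputStream` in place of the two-source mix.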
I have an application that plays multiple web audio sources concurrently, and allows the user to record audio at the same time. It works fine if the physical input (e.g. webcam) cannot detect the physical output (e.g. headphones). But if the output can bleed into the input (e.g. using laptop speakers with a webcam), then the recording picks up the other audio sources.
My understanding is the echoCancellation constraint is supposed to address this problem, but it doesn't seem to work when multiple sources are involved.
I've included a simple example to reproduce the issue. JSfiddle seems to be too strictly sandboxed to allow user media otherwise I'd dump it somewhere.
Steps to reproduce
Press record
Make a noise, or just observe. The "metronome" should beep 5 times
After 2 seconds, the <audio> element source will be set to the recorded audio data
Play the <audio> element - you will hear the "metronome" beep. Ideally, the metronome beep would be "cancelled" via the echoCancellation constraint which is set on the MediaStream, but it doesn't work this way.
index.html
<!DOCTYPE html>
<html lang="en">
  <body>
    <button onclick="init()">record</button>
    <audio id="audio" controls="true"></audio>
    <script src="demo.js"></script>
  </body>
</html>
demo.js
let audioContext
let stream

async function init() {
  audioContext = new AudioContext()
  stream = await navigator.mediaDevices.getUserMedia({
    audio: {
      echoCancellation: true,
    },
    video: false,
  })
  playMetronome()
  record()
}

function playMetronome(i = 0) {
  if (i > 4) {
    return
  }
  const osc = new OscillatorNode(audioContext, {
    frequency: 440,
    type: 'sine',
  })
  osc.connect(audioContext.destination)
  osc.start()
  osc.stop(audioContext.currentTime + 0.1)
  setTimeout(() => {
    playMetronome(i + 1)
  }, 500)
}

function record() {
  const recorder = new MediaRecorder(stream)
  const data = []
  recorder.addEventListener('dataavailable', (e) => {
    console.log({ event: 'dataavailable', e })
    data.push(e.data)
  })
  recorder.addEventListener('stop', (e) => {
    console.log({ event: 'stop', e })
    const blob = new Blob(data, { type: 'audio/ogg; codecs=opus' })
    const audioURL = window.URL.createObjectURL(blob)
    document.getElementById('audio').src = audioURL
  })
  recorder.start()
  setTimeout(() => {
    recorder.stop()
  }, 2000)
}
Unfortunately this is a long standing issue in Chrome (and all its derivatives). It should work in Firefox and Safari.
Here is the ticket: https://bugs.chromium.org/p/chromium/issues/detail?id=687574.
It basically says that echo cancellation only works for audio that is coming from a peer connection. As soon as the audio is processed locally by the Web Audio API, it is no longer considered by the echo canceller.
Audio capture with getDisplayMedia does not work in Chrome on my MacBook: the screen-share prompt never offers an audio-share checkbox, so the MediaStream records video only. On my Windows computer, Chrome fully supports it, capturing both video and audio and asking the user whether to share audio. Is this a platform support limitation or a problem in my code? I am using the latest version of Chrome on the MacBook.
Below is my code:
navigator.mediaDevices
  .getDisplayMedia({
    video: true,
    audio: true
  })
  .then((Mediastream) => {
    vm.$set(vm, 'isRecording', true);
    if (vm.isInitiator || vm.isConnector) {
      if (localStream) {
        let localAudio = new MediaStream();
        localAudio.addTrack(localStream.getAudioTracks()[0]);
        if (Mediastream.getAudioTracks().length != 0) {
          let systemAudio = new MediaStream();
          systemAudio.addTrack(Mediastream.getAudioTracks()[0]);
          let audioContext = new AudioContext();
          let audioIn_01 = audioContext.createMediaStreamSource(localAudio);
          let audioIn_02 = audioContext.createMediaStreamSource(systemAudio);
          let dest = audioContext.createMediaStreamDestination();
          audioIn_01.connect(dest);
          audioIn_02.connect(dest);
          let finalAudioStream = dest.stream;
          Mediastream.removeTrack(Mediastream.getAudioTracks()[0]);
          Mediastream.addTrack(finalAudioStream.getAudioTracks()[0]);
        } else {
          Mediastream.addTrack(localStream.getAudioTracks()[0]);
        }
      }
    }
    this.createRecorder(Mediastream);
  })
  .catch((err) => {
    this.getUserMediaError(err);
  });
Unfortunately this is a limitation of Chrome on macOS. According to "caniuse.com",
On Windows and Chrome OS the entire system audio can be captured, but on Linux and macOS only the audio of a tab can be captured.
https://caniuse.com/mdn-api_mediadevices_getdisplaymedia_audio_capture_support
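Given that limitation, it can help to check at runtime whether the display capture actually came back with an audio track and fall back gracefully. A small hedged sketch (the function name and the MediaStream-like parameters are illustrative, not part of the original code):

```javascript
// Sketch: choose which audio tracks to record. On macOS/Linux, Chrome's
// display capture may come back with no audio track at all, so fall
// back to the microphone rather than assuming system audio is present.
function chooseAudioTracks(displayStream, micStream) {
  const displayAudio = displayStream.getAudioTracks();
  if (displayAudio.length > 0) {
    return displayAudio; // tab/system audio was actually granted
  }
  // Nothing granted by the screen share: record the microphone only.
  return micStream ? micStream.getAudioTracks() : [];
}
```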
I'm developing an app where users can capture a photo using the front/rear camera. It works perfectly, but when toggling between the front and rear cameras, var playPromise = videoStream.play() gets stuck in a pending state. Sometimes the promise resolves and the camera works, sometimes not.
This issue occurs only in the Chrome browser, not in Firefox.
try {
  stopWebCam(); // stop media stream when toggling the camera
  stream = await navigator.mediaDevices.getUserMedia({ video: true });
  /* use the stream */
  let videoStream = document.getElementById('captureCandidateId');
  videoStream.srcObject = stream;
  // videoStream.play();
  var playPromise = videoStream.play();
  if (playPromise !== undefined) {
    playPromise.then(_ => {
      // Automatic playback started!
      // Show playing UI.
    })
    .catch(error => {
      // Auto-play was prevented
      // Show paused UI.
    });
  }
} catch (err) {
  /* handle the error */
  console.log(err.name + ": " + err.message);
}
let stopWebCam = function (pictureType) {
  setTimeout(() => {
    let videoStream = document.getElementById('captureCandidateId');
    const stream = videoStream.srcObject;
    if (stream && stream.getTracks) {
      const tracks = stream.getTracks();
      tracks.forEach(function (track) {
        track.stop();
      });
    }
    videoStream.srcObject = null;
  }, 0)
}
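One pattern that tends to avoid the pending play() promise is to tear down the old stream synchronously (not inside a setTimeout) and await play() on the new one, so the two operations cannot interleave. A sketch under those assumptions; the function name is hypothetical, and `getMedia` is injected purely for illustration (in a browser it would be `navigator.mediaDevices.getUserMedia` bound to `navigator.mediaDevices`):

```javascript
// Sketch: stop the old tracks first, then attach and await the new stream.
// Chrome's stuck play() promise typically comes from play() racing against
// a srcObject change, so serialising the steps avoids the race.
async function switchCamera(videoEl, constraints, getMedia) {
  const old = videoEl.srcObject;
  if (old && old.getTracks) {
    old.getTracks().forEach((t) => t.stop()); // release the camera now
    videoEl.srcObject = null;
  }
  const stream = await getMedia(constraints);
  videoEl.srcObject = stream;
  await videoEl.play(); // await so failures surface instead of hanging
  return stream;
}
```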
Here, I drafted a piece of code for you; it is a much simpler and smaller approach than what you are trying to do. I just take the stream from the video element and draw it to a canvas. The image can be downloaded by right-clicking.
NOTE: the example does not work inside Stack Overflow's sandbox.
<video id="player" controls autoplay></video>
<button id="capture">Capture</button>
<canvas id="canvas" width=320 height=240></canvas>
<script>
  const player = document.getElementById('player');
  const canvas = document.getElementById('canvas');
  const context = canvas.getContext('2d');
  const captureButton = document.getElementById('capture');

  const constraints = {
    video: true,
  };

  captureButton.addEventListener('click', () => {
    // Draw the video frame to the canvas.
    context.drawImage(player, 0, 0, canvas.width, canvas.height);
  });

  // Attach the video stream to the video element and autoplay.
  navigator.mediaDevices.getUserMedia(constraints)
    .then((stream) => {
      player.srcObject = stream;
    });
</script>
If you want, you can also make some edits according to your needs, like:
Choose which camera to use
Hide the video stream
Add an easier method to download the photo to your device
You can also add functionality to upload the photo straight to a server, if you have one
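The "easier download" idea from the list above could be sketched like this, using the standard `canvas.toBlob` and `URL.createObjectURL` APIs. The helper name is illustrative, and it assumes it runs in a browser page:

```javascript
// Sketch: serialise the canvas to a Blob and click a temporary
// <a download> link so the browser saves the image as a file.
function downloadCanvas(canvas, filename) {
  canvas.toBlob((blob) => {
    const a = document.createElement("a");
    a.href = URL.createObjectURL(blob);
    a.download = filename || "photo.png";
    a.click();
    URL.revokeObjectURL(a.href); // free the object URL once clicked
  });
}
```

For the snippet above this would be wired up as a second button calling `downloadCanvas(canvas, "photo.png")`.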
If I use the following code to record a canvas animation:
streamInput = parent.document.getElementById('whiteboard');
stream = streamInput.captureStream();
const recorder = RecordRTC(stream, {
  // audio, video, canvas, gif
  type: 'video',
  mimeType: 'video/webm',
  recorderType: MediaStreamRecorder,
  disableLogs: false,
  timeSlice: 1000,
  ondataavailable: function (blob) {},
  onTimeStamp: function (timestamp) {},
  bitsPerSecond: 3000000,
  frameInterval: 90,
  frameRate: 60,
  bitrate: 3000000,
});
recorder.stopRecording(function () {
  getSeekableBlob(recorder.getBlob(), function (seekableBlob) {
    url = URL.createObjectURL(recorder.getBlob());
    $("#exportedvideo").attr("src", url);
    $("#exportedvideo").attr("controls", true);
    $("#exportedvideo").attr("autoplay", true);
  });
});
The video plays fine and I can seek it in Chrome/Edge/Firefox etc.
When I download the video using the following code:
getSeekableBlob(recorder.getBlob(), function (seekableBlob) {
  var file = new File([seekableBlob], "test.webm", {
    type: 'video/webm'
  });
  invokeSaveAsDialog(file, file.name);
});
The video downloads and plays fine, and the seekbar updates like normal.
If I then move the seekbar to any position, as soon as I move it I get a media player message:
Can't play.
Can't play because the item's file format isn't supported. Check the store to see if this item is available here.
0xc00d3e8c
If I use Firefox and download the file, it plays perfectly and I can seek.
Do I need to do anything else to fix the Chromium WebM?
I've tried using the following code to download the file:
var file = new File([recorder.getBlob()], "test.webm", {
  type: 'video/webm'
});
invokeSaveAsDialog(file, file.name);
However, the file plays and I can move the seekbar, but the video screen is black. Yet Firefox works fine.
Here are the outputted video files:
The first set was created without ts-ebml intervention:
1: https://lnk-mi.app/uploads/chrome.webm
2: https://lnk-mi.app/uploads/firefox.webm
The second set was created using ts-ebml:
1: https://lnk-mi.app/uploads/chrome-ts-ebm.webm
2: https://lnk-mi.app/uploads/firefox-ts-ebml.webm
Both were created exactly the same way, using ts-ebml.js to write the metadata:
recorder.addEventListener("dataavailable", async (e) => {
  try {
    const makeMediaRecorderBlobSeekable = await injectMetadata(e.data);
    data.push(await new Response(makeMediaRecorderBlobSeekable).arrayBuffer());
    blobData = await new Blob(data, { type: supportedType });
  } catch (e) {
    console.error(e);
    console.trace();
  }
});
Is there a step I am missing?
Having tried all the plugins like ts-ebml and web-writer, I found the only reliable solution was to upload the video to my server and use ffmpeg with the following command
ffmpeg -i {$srcFile} -c copy -crf 20 -f mp4 {$destFile}
to convert the video to mp4.
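The underlying issue is that MediaRecorder streams out a WebM with no seek cues or duration in the header, so only the container needs rewriting; no re-encoding is involved with `-c copy`. If staying with WebM is acceptable, a plain stream copy performs the same metadata rewrite (file names here are placeholders):

```shell
# Rewrites the container (duration, cues) without touching the encoded
# streams, which is what makes the output seekable.
ffmpeg -i recording.webm -c copy recording-seekable.webm
```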
I'm playing HLS audio in the Chrome browser on Android using an HTML5 <audio> element.
How can I get Chrome to keep playing an HLS stream when the screen is locked or the browser tab is inactive?
On iOS the stream continues to play when the browser tab is changed or the screen is locked, but on Android the audio stops.
<audio controls preload="metadata">
  <source
    src="http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"
    type="application/x-mpegURL"
  />
</audio>
I even added mediaSession info to try and help it along.
var audio = document.getElementById("audio");

if ("mediaSession" in navigator) {
  audio.onplay = function () {
    navigator.mediaSession.metadata = new MediaMetadata({
      title: "TEST",
      artist: "ARTIST"
    });
  };
}
Android will play HLS streams natively, but if you want the behaviour where playback continues on the lock screen, use hls.js:
let audio = document.getElementById("hls-audio");
let source = "http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8";

if (Hls.isSupported()) {
  console.log("hello hls.js!");
  let hls = new Hls();
  hls.attachMedia(audio);
  hls.on(Hls.Events.MEDIA_ATTACHED, () => {
    console.log("video and hls.js are now bound together !");
    hls.loadSource(source);
  });
}

audio.onplay = () => {
  console.log("we can track events the same way with hls.js");
};