There is a very silly mistake somewhere in my code that I can't find. Basically, I'm using two separate buttons to start and stop recording the stream I get from WebRTC getUserMedia() (I'm using RecordRTC for recording). My stop function stops the recording but does not release the camera.
<script type="text/javascript">
    $(document).ready(function () {
        var recorder;
        var video = document.getElementById("video");
        var videoConstraints = {
            video: {
                mandatory: {
                    minWidth: 1280,
                    minHeight: 720,
                    maxWidth: 1920,
                    maxHeight: 1080,
                    minFrameRate: 29.97,
                    maxFrameRate: 60,
                    minAspectRatio: 1.77
                }
            },
            audio: true
        };
        function captureCamera(callback) {
            navigator.mediaDevices.getUserMedia(videoConstraints).then(function (camera) {
                callback(camera);
            }).catch(function (error) {
                alert('Unable to capture your camera. Please check console logs.');
                console.error(error);
            });
        }
        function stopRecordingCallback() {
            video.srcObject = null;
            video.src = URL.createObjectURL(recorder.getBlob());
            video.play();
            //recorder.camera.stop(); // the deprecated way
            recorder.camera.getTracks().forEach(track => track.stop()); // modern way, as per the documentation
            recorder.destroy();
            recorder = null;
        }
        function hasGetUserMedia() {
            return (navigator.getUserMedia ||
                navigator.webkitGetUserMedia ||
                navigator.mozGetUserMedia ||
                navigator.msGetUserMedia);
        }
        $('#startRecord').on("click", function () {
            if (hasGetUserMedia()) {
                /*----------------recording process start----------------*/
                this.disabled = true;
                captureCamera(function (camera) {
                    setSrcObject(camera, video);
                    video.play();
                    var options = {
                        recorderType: MediaStreamRecorder,
                        mimeType: 'video/webm;codecs=h264',
                        audioBitsPerSecond: 128000,
                        videoBitsPerSecond: 2097152 // 2 Mbps
                    };
                    recorder = RecordRTC(camera, options);
                    recorder.startRecording();
                    // keep a reference so the camera can be released on stopRecording
                    recorder.camera = camera;
                    document.getElementById('stopRecord').disabled = false;
                });
                /*----------------recording process end----------------*/
            }
            else {
                alert('getUserMedia() is not supported by your browser');
            }
        });
        $('#stopRecord').on("click", function () {
            this.disabled = true;
            document.getElementById('startRecord').disabled = false;
            recorder.stopRecording(stopRecordingCallback);
        });
    });
</script>
So I can't find the reason the camera isn't released when the $('#stopRecord').on("click", function (){}) handler runs. Any help?
You can stop your stream's tracks, like this:
navigator.getUserMedia({ audio: false, video: true },
    function (stream) {
        // can also use getAudioTracks() or getVideoTracks()
        var track = stream.getTracks()[0]; // if only one media track
        // ...
        track.stop();
    },
    function (error) {
        console.log('getUserMedia() error', error);
    });
So, in your case, I believe you can do something like this:
var track = recorder.camera.getTracks()[0]; // if only one media track
// ...
track.stop();
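Stopping only the first track leaves any remaining tracks (for example the audio track) holding the device. As a sketch, a small helper, here a hypothetical `stopAllTracks` that is not part of RecordRTC, can stop every track and report how many it stopped; in the browser you would also clear the element's `srcObject` so it drops its reference to the stream:

```javascript
// Hypothetical helper (not part of RecordRTC): stop every track on a stream.
// Returns the number of tracks stopped, or 0 if there is no stream.
function stopAllTracks(stream) {
  if (!stream) return 0;
  var tracks = stream.getTracks();
  tracks.forEach(function (track) { track.stop(); });
  return tracks.length;
}

// Browser usage (sketch):
// stopAllTracks(recorder.camera);
// video.srcObject = null; // detach the stream so the camera light goes off
```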
Using the code below I am recording audio in the browser, but when I download the audio I get:
sample rate: 16000 Hz
sample bits: 16
bitrate: 512 kb/s
I want to reduce the bitrate from 512 kb/s to 256 kb/s. Any help will be appreciated.
const startRecording = () => {
    regenerateImageButton.disabled = true;
    let constraints = {
        audio: true,
        video: false,
    };
    recordButton.disabled = true;
    stopButton.disabled = false;
    pauseButton.disabled = false;
    audioContext = new window.AudioContext({
        sampleRate: 16000,
        //bufferLen: 4096
    });
    console.log("sample rate: " + audioContext.sampleRate);
    navigator.mediaDevices
        .getUserMedia(constraints)
        .then(function (stream) {
            console.log("initializing Recorder.js ...");
            gumStream = stream;
            let input = audioContext.createMediaStreamSource(stream);
            recorder = new window.Recorder(input, {
                numChannels: 1,
                sampleBits: 16, // 8 or 16
                //bufferLen: 4096,
                mimeType: "audio/wav",
            });
            recorder.record();
            if (stoptime == true) {
                stoptime = false;
                timerCycle();
            }
        })
        .catch(function (err) {
            // re-enable the record button if getUserMedia() fails
            recordButton.disabled = false;
            stopButton.disabled = true;
            pauseButton.disabled = true;
        });
};
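For uncompressed PCM WAV, which is what Recorder.js produces, the bitrate is fixed by the format: bitrate = sampleRate × sampleBits × numChannels. At 16000 Hz × 16 bits × 1 channel that is 256 kb/s, so a reported 512 kb/s at these settings usually means two channels ended up in the file. A quick sanity check (the function name here is illustrative, not part of Recorder.js):

```javascript
// PCM WAV bitrate in bits per second: sampleRate * sampleBits * numChannels.
function wavBitrate(sampleRate, sampleBits, numChannels) {
  return sampleRate * sampleBits * numChannels;
}

console.log(wavBitrate(16000, 16, 1)); // 256000 b/s = 256 kb/s
console.log(wavBitrate(16000, 16, 2)); // 512000 b/s = 512 kb/s (stereo doubles it)
console.log(wavBitrate(16000, 8, 2));  // 256000 b/s (8-bit samples halve it again)
```

So to reach 256 kb/s, make sure the recording is really mono (as `numChannels: 1` requests), or drop to 8-bit samples.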
The following code works fine on an Android mobile device; however, on iOS (iPhone) in the Chrome browser it does not work as expected. I have seen a lot of reports of this issue on iOS devices. Does anyone have an idea how to fix it?
customCamera.js:
this.Play = function (_this, stream) {
    _this.stream = stream;
    console.log(stream);
    videoElement.srcObject = stream;
    videoElement.onloadedmetadata = function () {
        // console.log("mediadataloaded");
        try {
            videoElement.play();
            window.videoElement = videoElement;
            animate();
        }
        catch (error) {
            console.error("Failed to acquire camera feed: " + error.name);
            alert("Failed to acquire camera feed: " + error.name);
        }
    };
};
async start() {
    var _this = this;
    (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) || alert("No navigator.mediaDevices.getUserMedia exists.");
    try {
        const stream = await navigator.mediaDevices.getUserMedia({
            // audio: false,
            video: {
                facingMode: _this.options.facingMode,
                width: _this.options.width,
                height: _this.options.height,
                frameRate: _this.options.frameRate
            }
        });
        this.currentStream = stream;
        window.stream = stream;
        _this.Play(_this, stream);
    } catch (err) {
        if (err.name === "OverconstrainedError") {
            alert("Device doesn't have a back camera!");
        }
    }
}
I use this customCamera.js in my main.js file with a regular video tag:
main.js
import Camera from './customCamera.js';
const deviceCamera = new Camera(videoElement, {
    onFrame: async () => { /* per-frame processing */ },
    facingMode: "environment"
});
deviceCamera.start();
On iOS 12, Chrome version 19.0.4515 and below doesn't support the camera, but it works fine in Safari.
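One likely cause: before iOS 14.3, WebKit exposed getUserMedia only in Safari, so third-party browsers such as Chrome on iOS simply did not have it. A defensive check, sketched as a small testable helper (the function name is mine, not from the code above):

```javascript
// Hypothetical helper: true only if mediaDevices.getUserMedia actually exists.
function canUseGetUserMedia(nav) {
  return !!(nav && nav.mediaDevices && typeof nav.mediaDevices.getUserMedia === "function");
}

// Browser usage (sketch):
// if (!canUseGetUserMedia(navigator)) {
//   alert("Camera access is not available in this browser; try Safari.");
// }
```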
I am doing screen recording using RecordRTC.
How do I include my mic audio when recording?
My code, using Angular, is below:
async startRecording() {
    let mediaConstraints = {
        video: {
        },
        audio: true
    };
    await this.mediaDevices.getDisplayMedia(mediaConstraints).then(this.successCallback.bind(this), this.errorCallback.bind(this));
}
successCallback(stream: MediaStream) {
    this.recording = true;
    var options = {
        mimeType: 'video/webm', // or video/webm;codecs=h264 or video/webm;codecs=vp9
        audioBitsPerSecond: 128000,
        videoBitsPerSecond: 128000,
        bitsPerSecond: 128000 // if this line is provided, skip the above two
    };
    this.stream = stream;
    this.recordRTC = RecordRTC(stream, options);
    this.recordRTC.startRecording();
    let video: HTMLVideoElement = this.rtcvideo.nativeElement;
    video.srcObject = stream; // createObjectURL(MediaStream) is no longer supported
    this.toggleControls();
}
You need to attach an audio track to the stream before recording starts; if the recorder starts first, the microphone track won't be captured:
successCallback(stream) {
    // your other code here
    // ...
    navigator.mediaDevices.getUserMedia({ audio: true }).then((mic) => {
        stream.addTrack(mic.getTracks()[0]);
        // start recording only once the mic track has been added
        this.recordRTC = RecordRTC(stream, options);
        this.recordRTC.startRecording();
    });
}
This should be helpful. https://www.webrtc-experiment.com/RecordRTC/
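An alternative to mutating the display stream is to build a fresh stream from both sources. Sketched with a hypothetical helper that merges the track lists; the browser-only `new MediaStream(tracks)` step is shown as a comment:

```javascript
// Hypothetical helper: collect all display tracks plus the mic's audio tracks.
function mergedTrackList(displayStream, micStream) {
  return displayStream.getTracks().concat(micStream.getAudioTracks());
}

// Browser usage (sketch):
// const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
// const combined = new MediaStream(mergedTrackList(screenStream, mic));
// this.recordRTC = RecordRTC(combined, options);
```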
I can't figure out where this error comes from, or what I missed.
Here is my code:
function mediaDeviceInit(deviceId) {
    // quick version; see the W3C spec for audio
    navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia;
    console.log('IpCodec : Get user permissions for Media Access.');
    let audioConstraints = {};
    // check for default value
    if (deviceId) {
        audioConstraints = {
            audio: { deviceId: deviceId, echoCancellation: false, sampleRate: defaultSampleRate }, video: false
        };
    } else {
        audioConstraints = { audio: { echoCancellation: false, sampleRate: defaultSampleRate }, video: false };
    }
    if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
        navigator.mediaDevices.getUserMedia(audioConstraints)
            .then(function (stream) {
                //console.log(navigator.mediaDevices.getSupportedConstraints());
                userMediaSuccess(stream);
            })
            .catch(function (error) {
                userMediaError(error);
            });
    } else {
        console.log('IpCodec : Browser does not support getUserMedia.');
    }
    // enumerate all input audio devices
    function enumAudioInput() {
        // some code
    }
    // callback on success
    function userMediaSuccess(stream) {
        let audioSrc = audioMixer.audioContext.createMediaStreamSource(stream); // --> error here
        // some init code
        console.log('IpCodec : Media permission granted by user.');
        if (!deviceId) {
            enumAudioInput();
        }
    }
    // callback on error
    function userMediaError(error) {
        console.log('IpCodec : ' + error);
    }
}
with an error like:
Connecting AudioNodes from AudioContexts with different sample-rate is currently not supported.
and this is the AudioMixer class that owns the AudioContext:
class AudioMixer {
    constructor(type, sRate, latency) {
        this.audioContext = null;
        // handle browser prefixes
        window.AudioContext = window.AudioContext || window.webkitAudioContext || window.mozAudioContext;
        console.log('IpCodec : Initialize audio mixer success.');
        if (window.AudioContext) {
            this.audioContext = new window.AudioContext({ sampleRate: sRate, latencyHint: latency });
            //console.log(this.audioContext);
        } else {}
    }
}
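The error means the MediaStream was produced at one sample rate while the AudioContext it is connected to was created with another. One way to avoid the mismatch, sketched with a hypothetical `buildAudioConfig` helper, is to derive both the getUserMedia constraint and the AudioContext options from a single rate; note that the sampleRate constraint is a request, not a guarantee, so the most robust fix is often to drop the explicit sampleRate from the AudioContext constructor and let it default to the hardware rate:

```javascript
// Hypothetical helper: one sample rate drives both the capture constraint
// and the AudioContext options, so the two cannot be configured to disagree.
function buildAudioConfig(sampleRate, deviceId) {
  var audio = { echoCancellation: false, sampleRate: sampleRate };
  if (deviceId) {
    audio.deviceId = deviceId;
  }
  return {
    constraints: { audio: audio, video: false },
    contextOptions: { sampleRate: sampleRate }
  };
}

// Browser usage (sketch):
// var cfg = buildAudioConfig(48000, someDeviceId);
// var ctx = new AudioContext(cfg.contextOptions);
// navigator.mediaDevices.getUserMedia(cfg.constraints).then(/* ... */);
```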
I am reverse-engineering a project and am running into some perplexing problems. The project is in Meteor, which I like, but it doesn't seem to follow Meteor's conventions.
It is essentially a JavaScript file that lets users take a selfie with the laptop's camera. However, after taking the photo, the camera does not turn off.
Having tried a number of suggestions online, I am asking: how does one turn off the camera?
Thank you for your help!
Template.newSelfie.rendered = function () {
    // Grab elements, create settings, etc.
    var canvas = document.getElementById("canvas"),
        context = canvas.getContext("2d"),
        video = document.getElementById("video"),
        videoObj = { "video": true },
        errBack = function (error) {
            console.log("Video capture error: ", error.code);
        };
    // Put video listeners into place
    if (navigator.getUserMedia) { // Standard
        navigator.getUserMedia(videoObj, function (stream) {
            video.srcObject = stream;
            video.play();
        }, errBack);
    } else if (navigator.webkitGetUserMedia) { // WebKit-prefixed
        navigator.webkitGetUserMedia(videoObj, function (stream) {
            video.src = window.webkitURL.createObjectURL(stream);
            video.play();
        }, errBack);
    } else if (navigator.mozGetUserMedia) { // Firefox-prefixed
        navigator.mozGetUserMedia(videoObj, function (stream) {
            video.src = window.URL.createObjectURL(stream);
            video.play();
        }, errBack);
    }
    // Converts canvas to an image
    function convertCanvasToImage(canvas) {
        var image = new Image();
        image.src = canvas.toDataURL("image/png");
        return image.src;
    }
    $('#take-selfie').click(function () {
        context.drawImage(video, 0, 0, 450, 350);
        var selfieImg = convertCanvasToImage(canvas);
        Posts.insert({
            ownerId: Meteor.userId(),
            userWallId: Meteor.userId(),
            content: '<img src="' + selfieImg + '">',
            postedOn: new Date()
        }, function (err, res) {
            console.log(err || res);
        });
        Selfies.insert({
            ownerId: Meteor.userId(),
            image: selfieImg,
            postedOn: moment().format('MM/DD/YYYY hh:mm a'),
            createdAt: moment().format('YYYY-MM-DD')
        }, function (err, res) {
            console.log(err || res);
            if (err) {
                console.log(err);
            } else {
                Router.go('profileSelfies');
            }
        });
    });
};
const video = document.querySelector('video');
// A video's MediaStream object is available through its srcObject attribute
const mediaStream = video.srcObject;
// Through the MediaStream, you can get the MediaStreamTracks with getTracks():
const tracks = mediaStream.getTracks();
// Tracks are returned as an array, so if you know you only have one, you can stop it with:
tracks[0].stop();
// Or stop all like so:
tracks.forEach(track => track.stop());
https://dev.to/morinoko/stopping-a-webcam-with-javascript-4297