Get ReadableStream from Webcam in Browser - javascript

I would like to get webcam input as a ReadableStream in the browser to pipe to a WritableStream. I have tried using the MediaRecorder API, but that stream is chunked into separate blobs while I would like one continuous stream. I'm thinking the solution might be to pipe the MediaRecorder chunks to a unified buffer and read from that as a continuous stream, but I'm not sure how to get that intermediate buffer working.
mediaRecorder = new MediaRecorder(stream, recorderOptions);
mediaRecorder.ondataavailable = handleDataAvailable;
mediaRecorder.start(1000);

async function handleDataAvailable(event) {
  if (event.data.size > 0) {
    const data: Blob = event.data;
    // I think I need to pipe to an intermediate stream? Not sure how tho
    data.stream().pipeTo(writable);
  }
}
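The intermediate-buffer idea can be sketched as follows (a sketch only, not tied to the recorder setup above: the `enqueueChunk`/`closeStream` names and the simulated chunks are hypothetical). One continuous ReadableStream is created once, and each MediaRecorder chunk is enqueued into it from `ondataavailable`:

```javascript
// Bridge chunked events into one continuous ReadableStream.
// In the browser you would call enqueueChunk(event.data) from
// ondataavailable and closeStream() from onstop; the chunks below
// are simulated so the sketch is self-contained.
let enqueueChunk, closeStream;
const continuous = new ReadableStream({
  start(controller) {
    enqueueChunk = (chunk) => controller.enqueue(chunk);
    closeStream = () => controller.close();
  }
});

// simulated MediaRecorder chunks
enqueueChunk(new Uint8Array([1, 2]));
enqueueChunk(new Uint8Array([3]));
closeStream();

// Consume the continuous stream (continuous.pipeTo(writable) would also work).
async function collect(stream) {
  const reader = stream.getReader();
  const bytes = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    bytes.push(...value);
  }
  return bytes;
}
```

Note that this only concatenates the encoded chunks; it does not give you raw frames (see the answer below).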

Currently we can't really access the raw data of a MediaStream. The closest we have for video is the MediaRecorder API, but this encodes the data and works by chunks, not as a stream.
However, the new W3C MediaCapture Transform group is working on a MediaStreamTrackProcessor interface that does exactly what you want, and it is already available in Chrome behind the chrome://flags/#enable-experimental-web-platform-features flag.
When reading the resulting stream, and depending on which kind of track you passed, you'll gain access to VideoFrames or AudioFrames, which are being added by the new WebCodecs API.
if (window.MediaStreamTrackProcessor) {
  const track = getCanvasTrack();
  const processor = new MediaStreamTrackProcessor(track);
  const reader = processor.readable.getReader();
  readChunk();
  function readChunk() {
    reader.read().then(({ done, value }) => {
      // value is a VideoFrame
      // we can read the data in each of its planes into an ArrayBufferView
      const channels = value.planes.map((plane) => {
        const arr = new Uint8Array(plane.length);
        plane.readInto(arr);
        return arr;
      });
      value.close(); // close the VideoFrame when we're done with it
      log.textContent = "planes data (15 first values):\n" +
        channels.map((arr) => JSON.stringify([...arr.subarray(0, 15)])).join("\n");
      if (!done) {
        readChunk();
      }
    });
  }
} else {
  console.error("your browser doesn't support this API yet");
}

function getCanvasTrack() {
  // just some noise...
  const canvas = document.getElementById("canvas");
  const ctx = canvas.getContext("2d");
  const img = new ImageData(300, 150);
  const data = new Uint32Array(img.data.buffer);
  const track = canvas.captureStream().getVideoTracks()[0];
  anim();
  return track;

  function anim() {
    for (let i = 0; i < data.length; i++) {
      data[i] = Math.random() * 0xFFFFFF + 0xFF000000;
    }
    ctx.putImageData(img, 0, 0);
    if (track.readyState === "live") {
      requestAnimationFrame(anim);
    }
  }
}
<pre id="log"></pre>
<p>
Source<br>
<canvas id="canvas"></canvas>
</p>

Related

How to save object with large binary data and other values?

I am currently trying to save an js object with some binary data and other values. The result should look something like this:
{
"value":"xyz",
"file1":"[FileContent]",
"file2":"[LargeFileContent]"
}
Until now I had no binary data, so I saved everything as JSON. With the binary data I am starting to run into problems with large files (>1GB).
I tried this approach:
JSON.stringify or how to serialize binary data as base64 encoded JSON?
This worked for smaller files of around 20MB. However, with these large files the result of the FileReader is always an empty string.
The result would look like this:
{
"value":"xyz",
"file1":"[FileContent]",
"file2":""
}
The code that is reading the blobs is pretty similar to the one in the other post:
const readFiles = async (measurements: FormData) => {
  setFiles([]); // This is where the result is being stored
  let promises: Array<Promise<string>> = [];
  measurements.forEach((value) => {
    let dataBlob = value as Blob;
    console.log(dataBlob); // Everything is fine here
    promises.push(
      new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.readAsDataURL(dataBlob);
        reader.onloadend = function () {
          resolve(reader.result as string);
        };
        reader.onerror = function (error) {
          reject(error);
        };
      })
    );
  });
  let result = await Promise.all(promises);
  console.log(result); // large file shows empty
  setFiles(result);
};
Is there something else I can try?
Since you have to share the data with other computers, you will have to generate your own binary format.
Obviously you can design it as you wish, but given your simple case of just storing Blob objects alongside a JSON string, we can come up with a very simple schema: first some metadata about the stored Blobs, then the JSON string in which each Blob has been replaced with a UUID.
This works because the limitation you hit is actually the maximum length of a string, and we can .slice() our binary file to read only part of it. Since we never read the binary data as a string we're fine; the JSON only holds a UUID in the places where we had Blobs, so it shouldn't grow much.
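As a tiny illustration of that point (a sketch; the Blob global is available in browsers and in Node 18+): slicing a Blob only decodes the requested byte range, so the bulk of the payload never has to exist as a string at all.

```javascript
// Only the sliced range is decoded to text; the rest of the payload
// (which could exceed the engine's max string length) stays binary.
const payload = new Blob(["HEADER", "rest-of-a-potentially-huge-payload"]);

async function readHeader(blob) {
  return blob.slice(0, 6).text(); // decode just the first 6 bytes
}
```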
Here is one such implementation I made quickly as a proof of concept:
/*
 * Stores JSON data along with Blob objects in a binary file.
 * Schema:
 *   first 4 bytes            = number of Blobs stored in the file
 *   next 4 * number of Blobs = size of each Blob
 *   remaining                = JSON string
 */
const hopefully_unique_id = "_blob_"; // <-- change that

function generateBinary(JSObject) {
  let blobIndex = 0;
  const blobsMap = new Map();
  const JSONString = JSON.stringify(JSObject, (key, value) => {
    if (value instanceof Blob) {
      if (blobsMap.has(value)) {
        return blobsMap.get(value);
      }
      // store and return the same placeholder id
      // (ids are 1-based; readBinary subtracts 1 to get the blob's index)
      const id = hopefully_unique_id + (++blobIndex);
      blobsMap.set(value, id);
      return id;
    }
    return value;
  });
  const blobsArr = [...blobsMap.keys()];
  const data = [
    new Uint32Array([blobsArr.length]),
    ...blobsArr.map((blob) => new Uint32Array([blob.size])),
    ...blobsArr,
    JSONString
  ];
  return new Blob(data);
}
async function readBinary(bin) {
  const numberOfBlobs = new Uint32Array(await bin.slice(0, 4).arrayBuffer())[0];
  let cursor = 4 * (numberOfBlobs + 1);
  const blobSizes = new Uint32Array(await bin.slice(4, cursor).arrayBuffer());
  const blobs = [];
  for (let i = 0; i < numberOfBlobs; i++) {
    const blobSize = blobSizes[i];
    blobs.push(bin.slice(cursor, cursor += blobSize));
  }
  const pattern = new RegExp(`^${hopefully_unique_id}\\d+$`);
  const JSObject = JSON.parse(
    await bin.slice(cursor).text(),
    (key, value) => {
      if (typeof value !== "string" || !pattern.test(value)) {
        return value;
      }
      const index = +value.replace(hopefully_unique_id, "") - 1;
      return blobs[index];
    }
  );
  return JSObject;
}
// demo usage
(async () => {
  const obj = {
    foo: "bar",
    file1: new Blob(["Let's pretend I'm actually binary data"]),
    // This one is 512MiB, which is bigger than the max string size in Chrome,
    // i.e. it can't be stored in a JSON string in Chrome
    file2: new Blob([Uint8Array.from({ length: 512 * 1024 * 1024 }, () => 255)]),
  };
  const bin = generateBinary(obj);
  console.log("as binary", bin);
  const back = await readBinary(bin);
  console.log({ back });
  console.log("file1 read as text:", await back.file1.text());
})().catch(console.error);

Why is AudioBufferSourceNode not consistent

I am rendering a music visualizer in multiple chunks and am having a hard time getting one chunk to transition into the next one gracefully.
I am looking for a way to get frequency data based on a specific time or frame and have it return the same buffer deterministically.
const render = () => {
  return new Promise((resolve, reject) => {
    try {
      if (audioCtxRef.current) {
        const bufferSource: AudioBufferSourceNode = audioCtxRef.current.createBufferSource();
        bufferSource.buffer = sourceRef.current.buffer;
        bufferSource.connect(analyzerRef.current);
        bufferSource.onended = () => {
          analyzerRef.current.getByteFrequencyData(fftRef.current);
          analyzerRef.current.getFloatTimeDomainData(tdRef.current);
          // See screenshots for this log, you will notice they are never the same values
          console.log({
            frameData: fftRef.current
          });
          logger({
            frame,
            frameData: fftRef.current
          });
          // put on UI
          drawCanvas(
            {
              fft: fftRef.current
            },
            canvasRef.current,
            background,
            type
          );
          // finished
          bufferSource.disconnect();
          resolve("");
        };
        bufferSource.start(0, Number((frame / 60).toFixed(2)), 1);
      } else {
        reject("AudioCtx is missing");
        onReady("visualizer");
      }
    } catch (e) {
      reject(e);
      onReady("visualizer");
    }
  });
};
This is the analyzer data from the bufferSource
This is a new result of the same analyzer data from the bufferSource with different values even though the time is the same

I'm capturing the screen using MediaRecorder and making a video from the blob, but that video is not showing its duration [duplicate]

I am in the process of replacing RecordRTC with the built in MediaRecorder for recording audio in Chrome. The recorded audio is then played in the program with audio api. I am having trouble getting the audio.duration property to work. It says
If the video (audio) is streamed and has no predefined length, "Inf" (Infinity) is returned.
With RecordRTC, I had to use ffmpeg_asm.js to convert the audio from wav to ogg. My guess is somewhere in the process RecordRTC sets the predefined audio length. Is there any way to set the predefined length using MediaRecorder?
This is a chrome bug.
FF does expose the duration of the recorded media, and if you do set the currentTime of the recorded media to more than its actual duration, then the property becomes available in Chrome...
var recorder,
  chunks = [],
  ctx = new AudioContext(),
  aud = document.getElementById('aud');

function exportAudio() {
  var blob = new Blob(chunks);
  aud.src = URL.createObjectURL(blob);
  aud.onloadedmetadata = function() {
    // it should already be available here
    log.textContent = ' duration: ' + aud.duration;
    // handle chrome's bug
    if (aud.duration === Infinity) {
      // set it to bigger than the actual duration
      aud.currentTime = 1e101;
      aud.ontimeupdate = function() {
        this.ontimeupdate = () => {
          return;
        };
        log.textContent += ' after workaround: ' + aud.duration;
        aud.currentTime = 0;
      };
    }
  };
}
function getData() {
  var request = new XMLHttpRequest();
  request.open('GET', 'https://upload.wikimedia.org/wikipedia/commons/4/4b/011229beowulf_grendel.ogg', true);
  request.responseType = 'arraybuffer';
  request.onload = decodeAudio;
  request.send();
}

function decodeAudio(evt) {
  var audioData = this.response;
  ctx.decodeAudioData(audioData, startRecording);
}

function startRecording(buffer) {
  var source = ctx.createBufferSource();
  source.buffer = buffer;
  var dest = ctx.createMediaStreamDestination();
  source.connect(dest);
  recorder = new MediaRecorder(dest.stream);
  recorder.ondataavailable = saveChunks;
  recorder.onstop = exportAudio;
  source.start(0);
  recorder.start();
  log.innerHTML = 'recording...';
  // record only 5 seconds
  setTimeout(function() {
    recorder.stop();
  }, 5000);
}

function saveChunks(evt) {
  if (evt.data.size > 0) {
    chunks.push(evt.data);
  }
}

// we need user-activation
document.getElementById('button').onclick = function(evt) {
  getData();
  this.remove();
};
<button id="button">start</button>
<audio id="aud" controls></audio><span id="log"></span>
So the advice here would be to star the bug report so that the Chromium team takes some time to fix it, even if this workaround can do the trick...
Thanks to @Kaiido for identifying the bug and offering the working fix.
I prepared an npm package called get-blob-duration that you can install to get a nice Promise-wrapped function to do the dirty work.
Usage is as follows:
// Returns Promise<Number>
getBlobDuration(blob).then(function(duration) {
console.log(duration + ' seconds');
});
Or ECMAScript 6:
// yada yada async
const duration = await getBlobDuration(blob)
console.log(duration + ' seconds')
A bug in Chrome, detected in 2016, but still open today (March 2019), is the root cause behind this behavior. Under certain scenarios audioElement.duration will return Infinity.
Chrome Bug information here and here
The following code provides a workaround to avoid the bug.
Usage: Create your audioElement and call this function a single time, providing a reference to it. When the returned promise resolves, the audioElement.duration property should contain the right value. (It also fixes the same problem with videoElements.)
/**
 * calculateMediaDuration()
 * Force media element duration calculation.
 * Returns a promise that resolves when the duration is calculated.
 **/
function calculateMediaDuration(media) {
  return new Promise((resolve, reject) => {
    media.onloadedmetadata = function() {
      // set the mediaElement.currentTime to a high value beyond its real duration
      media.currentTime = Number.MAX_SAFE_INTEGER;
      // listen to time position change
      media.ontimeupdate = function() {
        media.ontimeupdate = function() {};
        // setting player currentTime back to 0 can be buggy too, set it first to .1 sec
        media.currentTime = 0.1;
        media.currentTime = 0;
        // media.duration should now have its correct value, return it...
        resolve(media.duration);
      };
    };
  });
}

// USAGE EXAMPLE:
calculateMediaDuration(yourAudioElement).then(() => {
  console.log(yourAudioElement.duration);
});
Thanks @colxi for the actual solution; I've added some validation steps (the solution was working fine but had problems with long audio files).
It took me about 4 hours to get it to work with long audio files; it turns out validation was the fix.
function fixInfinity(media) {
  return new Promise((resolve, reject) => {
    // Wait for media to load metadata
    media.onloadedmetadata = () => {
      // Changing the current time triggers ontimeupdate
      media.currentTime = Number.MAX_SAFE_INTEGER;
      // Check if the duration is Infinity, NaN or undefined
      if (ifNull(media)) {
        media.ontimeupdate = () => {
          // If it is no longer null, resolve the promise with the duration
          if (!ifNull(media)) {
            resolve(media.duration);
          }
          // The second ontimeupdate is a fallback if the first one fails
          media.ontimeupdate = () => {
            if (!ifNull(media)) {
              resolve(media.duration);
            }
          };
        };
      } else {
        // If the media duration was never Infinity, return it directly
        resolve(media.duration);
      }
    };
  });
}

// Check if the duration is unusable
// (note: `x === NaN` is always false, so use Number.isNaN instead)
function ifNull(media) {
  return media.duration === Infinity || Number.isNaN(media.duration) || media.duration === undefined;
}
// USAGE EXAMPLE
// Get the audio player from the HTML
const AudioPlayer = document.getElementById('audio');
const getInfinity = async () => {
  // Await the promise
  await fixInfinity(AudioPlayer).then(val => {
    // Reset audio current time
    AudioPlayer.currentTime = 0;
    // Log duration
    console.log(val);
  });
};
I wrapped the webm-duration-fix package to solve the webm length problem. It can be used in Node.js and web browsers, and supports video files over 2GB without using too much memory.
Usage is as follows:
import fixWebmDuration from 'webm-duration-fix';

const mimeType = 'video/webm;codecs=vp9';
let blobSlice: BlobPart[] = []; // `let`, since it is reset after each recording

mediaRecorder = new MediaRecorder(stream, {
  mimeType
});

mediaRecorder.ondataavailable = (event: BlobEvent) => {
  blobSlice.push(event.data);
};

mediaRecorder.onstop = async () => {
  // fix blob, supports fixing webm files larger than 2GB
  const fixBlob = await fixWebmDuration(new Blob([...blobSlice], { type: mimeType }));
  // to write locally, it is recommended to use fs.createWriteStream to reduce memory usage
  const fileWriteStream = fs.createWriteStream(inputPath);
  const blobReadstream = fixBlob.stream();
  const blobReader = blobReadstream.getReader();
  while (true) {
    let { done, value } = await blobReader.read();
    if (done) {
      console.log('write done.');
      fileWriteStream.close();
      break;
    }
    fileWriteStream.write(value);
    value = null;
  }
  blobSlice = [];
};
If you want to modify the video file itself, you can use the webmFixDuration package; the other methods above are applied only at the display level, on the video tag, whereas with this method the complete video file is modified.
webmFixDuration github example
mediaRecorder.onstop = async () => {
  const duration = Date.now() - startTime;
  const buggyBlob = new Blob(mediaParts, { type: 'video/webm' });
  const fixedBlob = await webmFixDuration(buggyBlob, duration);
  displayResult(fixedBlob);
};

Web audio API, problem using the panNode, the sounds only play once

I am trying to integrate some panning effects to some sounds in a small testing app. It works fine except for one important issue: each sound only plays once!
I have tried several ways to bypass that issue without any success. The thing is, I can't pinpoint where the problem comes from. Here is my code, with a little explanation below.
const audio = new Audio('audio/background.mp3');
const footstep = new Audio('audio/footstep1.mp3');
const bumpWall1 = new Audio(`audio/bump-wall1.mp3`);
const bumpWall2 = new Audio(`audio/bump-wall2.mp3`);
const bumpWall3 = new Audio(`audio/bump-wall3.mp3`);
const bumpWall4 = new Audio(`audio/bump-wall4.mp3`);
const bumpWallArray = [bumpWall1, bumpWall2, bumpWall3, bumpWall4];
audio.volume = 0.5;

function play(sound, dir) {
  let audioContext = new AudioContext();
  let pre = document.querySelector('pre');
  let myScript = document.querySelector('script');
  let source = audioContext.createMediaElementSource(sound);
  let panNode = audioContext.createStereoPanner();
  source.connect(panNode);
  panNode.connect(audioContext.destination);
  if (dir === "left") {
    panNode.pan.value = -0.8;
  } else if (dir === "right") {
    panNode.pan.value = 0.8;
  } else {
    panNode.pan.value = 0;
  }
  sound.play();
}
So basically, when you call the play() function it plays the sound either on the left, the right, or the middle. But each sound is only played once. For example, if the footstep was played one time, it is never played again if I call the play() function on it.
Can anyone help me with that?
In your developer console, you should have a message stating something along the lines of
Uncaught InvalidStateError: Failed to execute 'createMediaElementSource' on 'AudioContext': HTMLMediaElement already connected previously to a different MediaElementSourceNode.
(At least in Chrome,) you can't connect a MediaElement to a MediaElementSourceNode more than once.
To avoid this, you would have to disconnect the MediaElement from the MediaElementSourceNode, but this isn't possible...
The best option in your situation is probably to use AudioBuffers directly rather than HTMLAudioElements, especially since you don't append them to the document.
let audio;
const sel = document.getElementById('sel');
// create a single AudioContext, these are not small objects
const audioContext = new AudioContext();

fetch('https://dl.dropboxusercontent.com/s/agepbh2agnduknz/camera.mp3')
  .then(resp => resp.arrayBuffer())
  .then(buf => audioContext.decodeAudioData(buf))
  .then(audioBuffer => audio = audioBuffer)
  .then(() => sel.disabled = false)
  .catch(console.error);

function play(sound, dir) {
  let source = audioContext.createBufferSource();
  source.buffer = sound;
  let panNode = audioContext.createStereoPanner();
  source.connect(panNode);
  panNode.connect(audioContext.destination);
  if (dir === "left") {
    panNode.pan.value = -0.8;
  } else if (dir === "right") {
    panNode.pan.value = 0.8;
  } else {
    panNode.pan.value = 0;
  }
  source.start(0);
}

sel.onchange = evt => play(audio, sel.value);
<select id="sel" disabled>
  <option>left</option>
  <option>center</option>
  <option>right</option>
</select>

Using DataView with nodejs Buffer

I'm trying to read/write some binary data using the DataView object. It seems to work correctly when the view is initialized from a Uint8Array, but if I pass it a Node.js Buffer object the results seem to be off. Am I using the API incorrectly?
import { expect } from 'chai';

describe('read binary data', () => {
  it('test buffer', () => {
    let arr = Buffer.from([0x55, 0xaa, 0x55, 0xaa, 0x60, 0x00, 0x00, 0x00, 0xd4, 0x03, 0x00, 0x00, 0x1c, 0xd0, 0xbb, 0xd3, 0x00, 0x00, 0x00, 0x00]);
    let read = new DataView(arr.buffer).getUint32(0, false);
    expect(read).to.eq(0x55aa55aa);
  });
  it('test uint8array', () => {
    let arr = new Uint8Array([0x55, 0xaa, 0x55, 0xaa, 0x60, 0x00, 0x00, 0x00, 0xd4, 0x03, 0x00, 0x00, 0x1c, 0xd0, 0xbb, 0xd3, 0x00, 0x00, 0x00, 0x00]);
    let read = new DataView(arr.buffer).getUint32(0, false);
    expect(read).to.eq(0x55aa55aa);
  });
});
The one with the buffer fails with
AssertionError: expected 1768779887 to equal 1437226410
+ expected - actual
-1768779887
+1437226410
Try using buf.copy:
const buf = fs.readFileSync(`...`);
const uint8arr = new Uint8Array(buf.byteLength);
buf.copy(uint8arr, 0, 0, buf.byteLength);
const v = new DataView(uint8arr.buffer);
A Node.js Buffer is just a view over an underlying allocated ArrayBuffer that can be a lot larger (small Buffers share a pooled allocation). This is how to get the ArrayBuffer out of a Buffer:
function getArrayBufferFromBuffer(buffer) {
  return buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength);
}
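For example (a sketch applied to a shortened version of the bytes from the failing test): reading through the Buffer's byteOffset window gives the expected big-endian value, and passing byteOffset/byteLength straight to the DataView constructor avoids the copy entirely.

```javascript
const buf = Buffer.from([0x55, 0xaa, 0x55, 0xaa]);

// Option 1: copy the Buffer's window out of the (possibly shared) pool
const ab = buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
const copied = new DataView(ab).getUint32(0, false); // big-endian read

// Option 2: no copy, just point the DataView at the Buffer's window
const direct = new DataView(buf.buffer, buf.byteOffset, buf.byteLength)
  .getUint32(0, false);
```

Both reads yield 0x55aa55aa regardless of where in the pool the Buffer happens to live.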
This helped (not sure if it's the most elegant solution):
const buff = Buffer.from(msgBody, 'base64');
let uint8Array = new Uint8Array(buff.length);
for (let counter = 0; counter < buff.length; counter++) {
  uint8Array[counter] = buff[counter];
  //console.debug(`uint8Array[${counter}]=${uint8Array[counter]}`);
}
let dataview = new DataView(uint8Array.buffer, 0, uint8Array.length);
