Turn chrome.tabCapture.capture into an audio file? - javascript

I'm trying to use the Chrome API https://developers.chrome.com/extensions/tabCapture.
How do I get an audio file out of it? For example, if I'm watching YouTube and want to export whatever song I'm listening to into an audio file (MP3, WAV, etc.). I know there are some Chrome extensions out there that do this, but I want to know if there is an API.
Thanks!

Use the desktopCapture API with DesktopCaptureSourceType "audio"
Saving the stream may be more challenging. You could create a Web Audio API AudioContext and call decodeAudioData() to get raw PCM data, which you could then write out as a .wav file. For MP3 encoding, you might be able to find some kind of Emscripten module.
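For the decode step described above, a minimal sketch (assuming the captured audio is already available as an ArrayBuffer, e.g. via blob.arrayBuffer() on a recorded Blob) could look like this; writing the resulting PCM out as a .wav still requires building the RIFF/WAVE header yourself, which is omitted here:

// Decode captured/encoded audio into raw PCM with the Web Audio API.
// `encodedBuffer` is assumed to be an ArrayBuffer of encoded audio data.
async function decodeToPcm(encodedBuffer) {
  const audioCtx = new AudioContext();
  const audioBuffer = await audioCtx.decodeAudioData(encodedBuffer);
  // One Float32Array of samples per channel, values in the range [-1, 1].
  const channels = [];
  for (let i = 0; i < audioBuffer.numberOfChannels; i++) {
    channels.push(audioBuffer.getChannelData(i));
  }
  return { sampleRate: audioBuffer.sampleRate, channels };
}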
Alternatively, it looks like there is a MediaRecorder API (demo). I'm not sure how stable it is across browsers yet, but I just tried it in Chrome and it seemed to work.
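As a rough illustration of that MediaRecorder route, here is a minimal sketch assuming an extension context with the tabCapture permission (the same recording step applies to a stream obtained through desktopCapture):

// Capture the current tab's audio and record it into a WebM blob.
chrome.tabCapture.capture({ audio: true, video: false }, (stream) => {
  const recorder = new MediaRecorder(stream, { mimeType: "audio/webm" });
  const chunks = [];

  recorder.ondataavailable = (event) => chunks.push(event.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: "audio/webm" });
    // Offer the recording as a download; MP3/WAV would need re-encoding.
    const a = document.createElement("a");
    a.href = URL.createObjectURL(blob);
    a.download = "tab-audio.webm";
    a.click();
  };

  recorder.start();
  // Stop after 10 seconds, just for the sake of the example.
  setTimeout(() => recorder.stop(), 10000);
});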

Related

Access browser audio output with getUserMedia()?

I'm trying to capture audio output from the browser and save a recorded file in JavaScript (without using 3rd party apps or browser extensions).
After reviewing the examples at WebRTC samples this task seems to be relatively straightforward when capturing audio from a user's microphone using the MediaStream output of getUserMedia().
Is there a way to capture a MediaStream that is just the browser's audio output? Or is there some better way to access the browser's audio output in a way that can be recorded to a file?
For context, my audio output in the browser may originate from one of several audio libraries (Tone.js, for example) so I'd rather not rely on generating the audio file from the JS library that is generating the audio. I've looked into writing a file from the AudioContext, but I am trying to find some solution that would be audio-source-agnostic.
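For reference, the microphone case the question calls straightforward is roughly the following sketch (getUserMedia plus MediaRecorder, as in the WebRTC samples); capturing the page's own audio output is the part this does not cover:

// Record a few seconds of microphone audio to a Blob using
// getUserMedia() + MediaRecorder.
async function recordMicrophone(durationMs = 5000) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (event) => chunks.push(event.data);

  const done = new Promise((resolve) => {
    recorder.onstop = () => resolve(new Blob(chunks, { type: recorder.mimeType }));
  });

  recorder.start();
  setTimeout(() => recorder.stop(), durationMs);
  return done;
}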

Javascript Video Blob URL and MPEG-Dash

I am switching our video player over from normal video sources to chunked progressive video streaming using MPEG-DASH. With MPEG-DASH, rather than linking to an actual video source, you link to an MPEG-DASH manifest file, which has all the information about each chunk and allows your player to swap chunks in and out as bandwidth changes. All of that seems pretty straightforward; however, I am also working on setting up blob URLs to obscure our source file location, and I am running into issues. How does this work, given that all the documentation I can find on blob responses is either entirely in JS or returns an XMLHttpRequest?
You can see an example of exactly what I want within the dash.js documentation: http://mediapm.edgesuite.net/dash/public/nightly/samples/dash-if-reference-player/index.html. If you inspect the element, it has a blob URL and loads in chunks, but I cannot find any docs on how to do this.
So essentially my question is: how can you get an MPEG-DASH manifest file to work in conjunction with the blob URL system to obscure source URLs?
I am also working on setting up blob urls to obscure our source file location
I assure you that you're not usefully obscuring anything. The data has to come from somewhere, and it's trivial to determine where from, no matter how you think you're obscuring it client-side.
Don't bother with this.
How does this work, given that all the documentation I can find on blob responses is either entirely in JS or returns an XMLHttpRequest?
The reason you see blob with web-based DASH players is that they're using Media Source Extensions (MSE) to get the data in the first place. The video player effectively has a blob source that is managed by the browser. Your JavaScript downloads the chunks and hands them to the browser to be run through the codec and output to the video element.
There is a decent MSE example on MDN: https://developer.mozilla.org/en-US/docs/Web/API/MediaSource
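A stripped-down sketch of that MSE flow (the chunk URL and codec string below are placeholders, just to show where the blob: URL on the video element comes from):

// Minimal Media Source Extensions flow: the <video> element gets a blob: URL
// pointing at a MediaSource object that JavaScript feeds with segments.
const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource); // this is the blob: URL you see

mediaSource.addEventListener("sourceopen", async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
  const response = await fetch("/chunks/segment-0.webm"); // placeholder URL
  const data = await response.arrayBuffer();
  sourceBuffer.addEventListener("updateend", () => {
    // A real player keeps appending segments; this example stops after one.
    mediaSource.endOfStream();
  });
  sourceBuffer.appendBuffer(data);
});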

HTML5 generating video from images

I'm wondering, since HTML5 and JavaScript work so well together, whether there is a way in HTML5 to generate a video file from many images.
For example, you're able to load a video into a canvas and make it appear as a greyscaled video by manipulating the canvas. However, I would like to know
if there is some method to generate a video file out of that greyscaled version. That would make sense if you want to send the video via WhatsApp etc.
Thank you
Here we go:
Article: http://techslides.com/convert-images-to-video-with-javascript
Demo: http://techslides.com/demos/image-video/create.html (select multiple images at once)
Code: [just view the source]
You can download the resulting .webm video file.
@K3N's answer mentions building an encoder. Luckily there is one - https://github.com/antimatter15/whammy - a snippet from the article:
You need a video encoder and today I just happened to stumble on Whammy, a real time JavaScript WebM Encoder.
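For orientation, Whammy's frame-by-frame usage looks roughly like the sketch below (based on the project's README; the exact compile() signature has changed between versions, so treat this as an approximation):

// Assumes whammy.js is loaded on the page, and that `frames` is an array of
// images with a helper that paints each one onto `canvas`.
const encoder = new Whammy.Video(15); // 15 frames per second

for (let i = 0; i < frames.length; i++) {
  drawFrameToCanvas(frames[i], canvas); // hypothetical helper
  encoder.add(canvas);                  // Whammy grabs the canvas as a WebP frame
}

const webmBlob = encoder.compile();     // older versions return the Blob directly
document.querySelector("video").src = URL.createObjectURL(webmBlob);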
There is currently no built-in API to do video encoding in HTML5. There is work in progress, though, to allow basic video and audio recording - but it's not available at this time (audio recording is available in Firefox - it is also limited to streams).
If you are OK with a GIF animation, you can encode the frames as a GIF using one of the encoders out there (see below).
For video there have been attempts, more or less successful (the project I had in mind does not seem to be available anymore), but there have been issues from one browser to another.
There is the option of building an encoder yourself, low-level style, following the video encoding and file format specifications. It's doable, but it's not a small project.
In any case, encoding video is a pretty performance-hungry task even for natively compiled applications. Running such a task in the browser will be an even slower process and probably not practical for many users (and mobile devices will chew through their batteries).
The better approach IMO (at the moment at least, until the aforementioned API becomes available) is to send the images to a server and have a backend handle the encoding jobs, then send the result back to the client; there is a rough sketch of the upload side after the resource list below. This way you can use multi-threading, offload the client, use natively compiled encoders such as ffmpeg, and the resulting video can be streamed back.
Some resources
MediaStream Recording API
Gif encoder 1
Gif encoder 2 (NodeJS)
HTML5 Video recording information and status
Realtime video encoder (NodeJS/ffmpeg)
libvpx (requires emscripten/asm.js)
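The upload side of that server-based approach could look something like this minimal sketch (the /encode endpoint and its response are assumptions; the actual encoding would happen server-side with something like ffmpeg):

// Send captured canvas frames to a hypothetical /encode endpoint, which is
// assumed to run ffmpeg (or similar) and respond with the finished video.
async function encodeOnServer(canvases, fps = 25) {
  const form = new FormData();
  form.append("fps", String(fps));

  for (let i = 0; i < canvases.length; i++) {
    // toBlob is asynchronous, so wrap it in a promise per frame.
    const blob = await new Promise((resolve) =>
      canvases[i].toBlob(resolve, "image/png")
    );
    form.append("frames", blob, `frame-${i}.png`);
  }

  const response = await fetch("/encode", { method: "POST", body: form });
  return response.blob(); // e.g. an MP4 produced by the server
}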
Hi, I have built it using the code provided by techslides.
I also made a template application where you can take a list of images and turn them into video format. You have to edit the code according to your own needs. It is only supported in Chrome (and relies on YouTube for the MP4 step). Basically, in a JavaScript file you draw the images onto a canvas, then turn the canvas frames into a video using the whammy.js function. You need to set an event listener and load the video into a video tag. Whammy.js only produces a WebM file. To turn it into MP4:
Load it into YouTube, then download it from YouTube as MP4. Hope it helps.
Just a follow-on from @michal's answer: whammy is no longer maintained; however, there's a modern fork of the whammy encoder at ts-whammy.
See this answer to get a data URL for an image
import tsWhammy from "ts-whammy/src/libs";
// images can come from: canvas.toDataURL(type, encoderOptions)
const images = [
"data:image/webp;base64,UklGRkZg....",
"data:image/webp;base64,UklGRkZg....",
];
// Make a 5 second video
const blob = tsWhammy.fromImageArrayWithOptions(images, { duration: 5 });
console.log(blob.type, blob.size);
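To actually see or save the result, the returned Blob can be turned into an object URL (a small usage sketch, not part of the ts-whammy snippet above):

// Play the generated WebM in a <video> element (or point a download link at it).
const url = URL.createObjectURL(blob);
document.querySelector("video").src = url;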

Is it possible to play this stream using HTML5/javascript?

Basically trying to play some live audio streams in an app I'm porting to the browser.
Stream example: http://kzzp-fm.akacast.akamaistream.net/7/877/19757/v1/auth.akacast.akamaistream.net/kzzp-fm/
I have tried the HTML5 audio tag and jPlayer with no luck. I know next to nothing about streaming audio; however, when I examine the HTTP response header, the specified content type is "audio/aacp" (not sure if that helps).
I'm hoping someone with more knowledge of audio formats could point me in the right direction here.
The problem isn't with AAC+ being playable; the issue is with decoding the streaming AAC wrapper called ADTS. The Audio Data Transport Stream [pdf], or "MP4-contained AAC streamed over HTTP using the SHOUTcast protocol", can be decoded, and therefore played, by only a couple of media players (e.g., foobar2000, Winamp, and VLC).
I had the same issue while trying to work with the SHOUTcast API to get HTML5 Audio playback for all the listed stations. Unfortunately it doesn't look like there's anything that can be done from our perspective, only the browser vendors can decide to add support for ADTS decoding. It is a documented issue in Chrome/WebKit. There are 60+ people (including myself) following the issue, which is marked as "WontFix".

Is there any way to stream binary content with JavaScript?

I am hoping to make an example using the developer build of Chrome, using Subsonic to stream a binary audio file. So far I have not had any luck, though.
Granted, my next option will be to try to load the audio files into windowStorage and toss some magic dust on them.
Does anybody know a way to stream an audio file to the audio tag as binary?
