VP8 video rendering in JavaScript

I'm currently working on an interactive web application in JavaScript that renders, in real time, a video received on a webpage and lets you send keyboard inputs.
The catch is that I can only receive VP8 video streams (not WebM, just raw VP8 video without the Matroska container). I've managed to decode the video on the client side using the Dixie decoder (https://github.com/dominikhlbg/vp8-webm-javascript-decoder/), but the problem is that it seems to add buffering of some kind: there is a lag of almost 2 seconds between when I receive the stream and when I render it. Is there a way I can decode the stream natively? That would speed up performance.
I thought of wrapping the received VP8 stream in a Matroska container and feeding it to the video tag, but I don't know how to create such a container.
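For what it's worth, the playback half of that idea maps onto the Media Source Extensions API; the muxing half is the hard part. A minimal sketch, in which makeWebmInit, makeWebmCluster, and onVp8Frames are all hypothetical placeholders rather than real APIs:

```js
// A minimal sketch of the playback half only. makeWebmInit() and
// makeWebmCluster() are hypothetical placeholders for the Matroska/WebM
// muxing step, which is the part this question is actually missing.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
  // The init segment: EBML header + Segment Info + Tracks.
  sb.appendBuffer(makeWebmInit());
  // Then one WebM Cluster per batch of raw VP8 frames.
  onVp8Frames((frames) => {               // hypothetical stream callback
    if (!sb.updating) sb.appendBuffer(makeWebmCluster(frames));
  });
});
```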

OK, after days of trying to figure out how to solve this, I finally found the bug. It wasn't in the Dixie decoder; the server needed a flag to stop buffering the video.

Related

Can JavaScript MSE play segmented MP4 from the middle?

In my current project I have a video stream that ffmpeg encodes to a segmented MP4. That encoded data is piped into an application that sends it to whoever connects to that application through a WebSocket. When a client connects, I make sure to send the ftyp and moov boxes first, and then send the most recent segments received from ffmpeg.
On the client side I just pass all binary data from the WebSocket to MSE.
The problem I am facing is that this works if the client is connected from the very start and gets all the fragments that ffmpeg pipes out, but it does not work if the client connects after ffmpeg has sent its first few fragments.
My questions are:
Is it possible for MSE to play a fragmented MP4 from the middle when it is also provided the init segments?
If it is possible, how would that need to be implemented?
If it isn't possible, what format would allow me to stream live video over a WebSocket?
Is it possible for MSE to play a fragmented MP4 from the middle when it is also provided the init segments?
Yes. This is exactly what fragmented (segmented) MP4 was designed to do.
If it is possible then how would that need to be implemented?
The implementation you describe is correct: send the init segment, followed by the most recent AV fragments. Since that is what you are doing, you most likely have a different problem, or a bug, somewhere in your implementation. For reference, the client side of that flow looks roughly like the sketch below.
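A minimal client-side sketch, assuming the server sends the ftyp+moov init segment first and then moof+mdat fragments as binary WebSocket messages; the codec string and URL are assumptions and must match your ffmpeg output.

```js
// Hypothetical endpoint and codec string; adjust to your encoder settings.
const mime = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer(mime);
  const queue = [];

  // A SourceBuffer accepts only one append at a time; queue the rest.
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length) sourceBuffer.appendBuffer(queue.shift());
  });

  const ws = new WebSocket('wss://example.com/stream'); // hypothetical URL
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    // Note: when joining mid-stream, the first media fragment appended
    // must begin with a keyframe, or playback will stall.
    if (sourceBuffer.updating || queue.length) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});
```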

How to append multiple audio files to currently playing audio using JavaScript?

I have a program that plays songs from the server. To make it more efficient, I split each audio file on the server into segments and send them to the client via Ajax, base64-encoded. The HTML5 native audio player plays each base64 audio segment, but when switching to the next segment it pauses briefly before playing. The retrieved segments are stored in IndexedDB for quick access, but that still doesn't remove the pause in playback. How can I make the program more efficient and fix the pause that happens when switching segments?
Is there any other way of appending an audio file to a currently playing audio source, without any pause, using JavaScript?
The Media Source Extensions API can do that, but note that you are just reinventing Range requests and caching, which are exactly what browsers already do when fetching media, and they do it better, since they don't add the overhead of base64.
So the "other way" is to configure your server to accept Range requests, to serve your file as plainly as possible as a single file, and to let the browser do its job.

RTSP stream: possible ways to display it in the browser with an ever-changing, stream-timestamp-dependent overlay image?

The scenario is as follows:
- I receive an RTSP stream.
- Every second I receive a bunch of additional data corresponding to the exact timestamp of the stream.
- The stream needs to be displayed in the web browser. For each new portion of data, a new overlay image needs to be shown at exactly the right timestamp of the stream.
- The stream needs to be saved for VOD functionality.
It seems RTSP is not natively supported even in HTML5. I'm thinking about the following implementation:
1. Receive the RTSP stream and the data on a .NET server.
2. Each second, compose a 1-second chunk of the video.
3. Save this chunk into the DB along with the corresponding data.
4. Send this chunk and the data to the web browser.
5. Use JS to play this chunk and draw an overlay.
I currently don't know how to implement steps 2 and 5 (for step 5, see the sketch below). Are there any (C#?) libraries to discretize the stream? And are there any JS libraries that can play the chunks we compose in step 2? I'm a complete newbie when it comes to video processing, so any directions would be appreciated.
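A minimal sketch of the overlay half of step 5, assuming the video is played through a <video> element (via MSE or otherwise) and that the per-second data is collected into a map as it arrives; the map and its payload shape are hypothetical. requestVideoFrameCallback is not supported in every browser; the coarser 'timeupdate' event is a fallback.

```js
const video = document.querySelector('video');
const canvas = document.querySelector('canvas'); // positioned absolutely over the video
const ctx = canvas.getContext('2d');
const overlays = new Map(); // stream second -> overlay payload (hypothetical)

function drawOverlay(now, metadata) {
  // metadata.mediaTime is the presentation timestamp of the frame, in seconds.
  const data = overlays.get(Math.floor(metadata.mediaTime));
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  if (data) {
    ctx.fillText(data.label, 10, 20); // stand-in for the real overlay drawing
  }
  video.requestVideoFrameCallback(drawOverlay);
}
video.requestVideoFrameCallback(drawOverlay);
```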

Is it possible to play this stream using HTML5/JavaScript?

Basically trying to play some live audio streams in an app I'm porting to the browser.
Stream example: http://kzzp-fm.akacast.akamaistream.net/7/877/19757/v1/auth.akacast.akamaistream.net/kzzp-fm/
I have tried the HTML5 audio tag and jPlayer, with no luck. I know next to nothing about streaming audio; however, when I examine the HTTP response headers, the specified content type is "audio/aacp" (not sure if that helps).
I'm hoping someone with more knowledge of audio formats could point me in the right direction here.
The problem isn't with AAC+ being playable; the issue is decoding the streaming AAC wrapper called ADTS. The Audio Data Transport Stream, or "MP4-contained AAC streamed over HTTP using the SHOUTcast protocol", can be decoded, and therefore played, by only a couple of media players (e.g., foobar2000, Winamp, and VLC).
I had the same issue while trying to work with the SHOUTcast API to get HTML5 audio playback for all the listed stations. Unfortunately, it doesn't look like there's anything that can be done from our side; only the browser vendors can decide to add support for ADTS decoding. It is a documented issue in Chrome/WebKit. There are 60+ people (including myself) following the issue, which is marked as "WontFix".

Live video broadcasting

I am going to develop a chat-based application for mobile that allows video chat. I am using HTML5, JavaScript, and PhoneGap. Using PhoneGap, I am able to access the mobile camera, capture a video, save it, and upload it to the server. I have done this for Android, but I also need live broadcasting of the video. Is there any solution for that?
Note: it is not an Android native app.
You didn't specify what facility you're currently using for video capture. AFAIK, the current WebView doesn't yet support WebRTC, the W3C standard that will eventually let you access video frames from your HTML5 code, so I'm assuming you're using PhoneGap's navigator.device.capture.captureVideo facility.
On Android, captureVideo creates 3gp files. The problem with 3gp is that the files cannot be streamed or played while still being captured: the MOOV atom of the file is required for parsing the video frames in it, and it is written only after all the frames in the file have been encoded. So you must stop the recording before you can make any use of the file.
Your best shot in HTML5 is to implement a loop that captures a short clip (3-5 seconds?) of video, then sends it to a server while the next chunk is being captured; a rough sketch follows below. The server will need to concatenate the clips into a single file that can be broadcast by a streaming server. This will add several seconds to the latency of the broadcast, and you are quite likely to lose frames in the gap between two separate chunk captures. That might be sufficient for some use cases (security cameras, for example).
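A rough sketch of that capture-and-upload loop, assuming the PhoneGap capture and FileTransfer APIs mentioned above; the ingest URL is a placeholder, and 'duration' support varies by platform. Note that captureVideo opens the native camera UI each time, which is exactly where the frame gap comes from.

```js
function captureLoop() {
  navigator.device.capture.captureVideo(
    function (mediaFiles) {
      captureLoop(); // start recording the next clip right away
      var ft = new FileTransfer();
      ft.upload(
        mediaFiles[0].fullPath,
        encodeURI('https://example.com/ingest'), // hypothetical endpoint
        function () { /* uploaded; the server concatenates the clips */ },
        function (err) { console.log('upload failed: ' + err.code); }
      );
    },
    function (err) { console.log('capture failed: ' + err.code); },
    { limit: 1, duration: 5 } // one clip of ~5 seconds per iteration
  );
}
captureLoop();
```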
If your application is such that you cannot afford to lose frames, I see no other option but to implement the video capture and streaming in Java, as a PhoneGap Plugin.
See Spydroid: http://code.google.com/p/spydroid-ipcamera/
It uses the solution with the special FileDescriptor you found. Basically, they let the video encoder write an MP4 with H.264 to the special file descriptor, which calls your code on every write; they then strip off the MP4 header and turn the H.264 NALUs into RTP packets.
