Streaming video from DirectX to HTML5 - javascript

I have a dynamic animated DirectX scene showing on screen. I want to stream that sequence of images to the browser, for an HTML5 client.
I'm currently encoding each frame as JPEG/base64 and sending it over a WebSocket. In the browser, I'm setting the img's src to that encoded frame, and the image gets updated.
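For reference, here is a minimal sketch of that client-side approach, assuming the server sends one base64-encoded JPEG per WebSocket message (the endpoint URL and element id are placeholders):

var img = document.getElementById("frame"); // hypothetical <img id="frame">
var ws = new WebSocket("ws://example.com:8080/stream"); // placeholder endpoint

ws.onmessage = function (event) {
    // each message is assumed to carry the base64 payload of one JPEG frame
    img.src = "data:image/jpeg;base64," + event.data;
};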
Local desktop browsers show the changes in real time, but performance is not great on Android devices. I'm not sure if that's due to frequent DOM updates or WebSocket latency, but the image is laggy.
I thought of optimizing it by creating a video stream to load in the browser, so that I'd get a performance boost from 1. hardware video decoding, 2. avoiding DOM manipulation, and 3. avoiding WebSocket overhead.
I'm not sure if that's the way to go, or how to implement it (I'm a video novice). Can anyone help?

lol - I fiddled with this issue once and it took me ages to understand the following:
If you need streaming, do streaming, but do not fiddle with images, for the following reasons:
browsers are allowed to load as many images in parallel as they want. While this serves HTML pages well, you can end up with extremely desynchronized image sequences if you try to simulate streaming.
videos gain a factor of 2-100 in compression by encoding frames relative to each other, which means you need far more bandwidth if you simulate streaming based on individual images.
HTML is not optimized for streaming, which means lots of extra code if the user is to get even close to a real streaming experience.
most of what you need for streaming is already there: protocols are supported by browsers, codecs are installed on clients, and browsers deal with all sorts of plugins that feed data to codecs and make use of DirectX and OpenGL.
The problem is that the whole story is something to be covered by a book, which would overshoot the format of this post - however, here are some starting points:
Google "source forge video streaming server" to find open source streaming servers
-> your task now is to find one that satisfies your needs and is compatible with your skills
look at Flash with its ActionScript as an alternative:
-> http://en.wikipedia.org/wiki/ActionScript
-> http://www.m2osw.com/sswf
-> http://www.opensourcealternative.org/alternatives/graphic-editors/open-source-alternative-to-flash-professional/

If you already have each frame encoded as jpeg/base64, you can easily create an mp4 using the free ffmpeg tool. All modern browsers support H.264, most if not all with hardware acceleration, so you can start there:
If your images are named like:
image001.jpeg, image002.jpeg, ...
To create an mp4 from the images:
ffmpeg \
    -r 1/10 \
    -i image%03d.jpeg \
    -c:v libx264 \
    -r 25 \
    -pix_fmt yuv420p \
    newVideo.mp4
The first -r 1/10 reads a different image every 10 seconds; the second -r 25 makes the output play at 25 frames per second.
Note that ffmpeg also lets you capture directly from the screen or framebuffer, from a pipe, or from many other sources, as well as encode to different bitrates and screen sizes, apply filters (for example, mixing two videos), add audio tracks, and more.
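For instance, a minimal Node.js sketch (my assumption, not part of the original answer) that pipes JPEG frames into ffmpeg instead of writing numbered files first; frames is a hypothetical array of JPEG Buffers, e.g. decoded from the base64 payloads mentioned above, and ffmpeg is assumed to be on the PATH:

const { spawn } = require("child_process");

const ffmpeg = spawn("ffmpeg", [
    "-f", "image2pipe",   // read a sequence of images from a pipe
    "-framerate", "25",   // treat the input as 25 frames per second
    "-i", "-",            // ...arriving on stdin
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",
    "piped.mp4",
]);

for (const frame of frames) { // frames: hypothetical array of JPEG Buffers
    ffmpeg.stdin.write(frame);
}
ffmpeg.stdin.end();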

Related

Encoding raw H.264 data to browser via DASH

I have a live stream of raw H.264 (no container) coming from a remote webcam. I want to stream it live in the browser using DASH. DASH requires creating an MPD file (and segmentation). I found tools (such as MP4Box) that accomplish that for static files, but I'm struggling to find a solution for live streams. Any suggestions - preferably using Node.js modules?
Threads I have checked:
MP4Box - on the one hand, I saw this comment that states "You cannot feed MP4Box with some live content. You need to feed MP4Box -live with pre-segmented chunks." On the other hand, a lot of people point to this Bitmovin tutorial, which does implement a solution using MP4Box. In the tutorial they use MP4Box (which has a Node.js API implementation) and x264 (which doesn't have a Node.js module? Or is it contained in ffmpeg/MP4Box?)
nginx - nginx has a module that supports streaming to DASH using RTMP, for example in this tutorial. I prefer not to go down this path - as mentioned, I'm trying to do it all in Node.js.
Although I read a couple of posts with a similar problem, I couldn't find a suitable solution. Help would be much appreciated!
The typical architecture is to send your live stream to a streaming server, which will then do the heavy lifting to make the stream available to other devices using streaming protocols such as HLS and DASH.
So the client devices connect to the server rather than to your browser.
This allows the video to be encoded and packaged to reach as many devices as possible, with the server doing any transcoding necessary and potentially also creating different bitrate versions of your stream to allow for different network conditions, if you want to provide this level of service.
The typical structure is an encoded stream (e.g. H.264 video) packaged into a container (e.g. fragmented MP4) and delivered via a streaming protocol such as HLS or DASH.
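As one concrete route (a sketch under my own assumptions, not the only way to do it): a reasonably recent ffmpeg build with the dash muxer can repackage the raw H.264 into a live MPD manifest plus segments, driven from Node.js, and any plain web server can then serve the result; webcamStream is a hypothetical readable stream carrying the raw H.264:

const { spawn } = require("child_process");

const ffmpeg = spawn("ffmpeg", [
    "-f", "h264", "-i", "-",  // raw Annex B H.264 on stdin
    "-c:v", "copy",           // repackage only, no transcoding
    "-f", "dash",
    "-seg_duration", "4",     // 4-second segments
    "-window_size", "5",      // keep the last 5 segments in the manifest
    "-streaming", "1",
    "stream.mpd",
]);

webcamStream.pipe(ffmpeg.stdin); // webcamStream: your live H.264 source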

HTML5 generating video from images

I'm wondering, since HTML5 and JavaScript work so well together, whether there is a solution in HTML5 to generate a video file from many images?
For example, you're able to load a video into a canvas and make it appear as a greyscaled video by manipulating the canvas. However, I would like to know if there is some method to generate a video file out of that greyscaled version. That would make sense if you want to send the video via WhatsApp etc.
Thank you
Here we go:
Article: http://techslides.com/convert-images-to-video-with-javascript
Demo: http://techslides.com/demos/image-video/create.html (select multiple images at once)
Code: [just view the source]
You can download a .webm video file
@K3N's answer mentions building an encoder. Luckily there is one - https://github.com/antimatter15/whammy - a snippet from the article:
You need a video encoder and today I just happened to stumble on Whammy, a real time JavaScript WebM Encoder.
There is currently no built-in API to do video encoding in HTML5. There is work in progress, though, to allow basic video and audio recording - but it's not available at this time (audio recording is available in Firefox - it is also limited to streams).
If you are OK with a GIF animation, you can encode the frames as a GIF using one of the encoders out there (see below).
For video there have been attempts, more or less successful (the project I had in mind does not seem to be available anymore), but there have been issues from one browser to another.
There is the option of building an encoder yourself, low-level style, following the video encoding and file format specifications. It's doable, but it's not a small project.
In any case, encoding video is a pretty performance-hungry task even for natively compiled applications. Running such a task in the browser will be an even slower process and probably not practical for many users (and it will suck mobile devices' batteries dry).
The better approach IMO (at the moment at least, until the aforementioned API becomes available) is to send the images to a server and have the back end handle the encoding jobs, then send the result back to the client. This way you can use multi-threading, offload the client, use natively compiled encoders such as ffmpeg, and the resulting video can be streamed back.
Some resources
MediaStream Recording API
Gif encoder 1
Gif encoder 2 (NodeJS)
HTML5 Video recording information and status
Realtime video encoder (NodeJS/ffmpeg)
libvpx (requires emscripten/asm.js)
Hi, I have built it using the code provided by techslides.
I also made a template application that takes a list of images and turns them into video format. You have to edit the code according to your own needs. It is only supported in Chrome and YouTube though. So basically, in a JavaScript file you draw the images onto a canvas, then turn the canvas frames into a video using the whammy.js function, as in the sketch below. You need to set an event listener and load the video into a video tag. Whammy.js only produces a WebM file. To turn it into mp4:
Load it into YouTube, then download it from YouTube as mp4. Hope it helps.
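A minimal sketch of that Whammy flow, assuming whammy.js is loaded and images is a list of already-loaded Image objects (Whammy relies on canvas.toDataURL("image/webp"), which is why it's Chrome-only):

var encoder = new Whammy.Video(15); // 15 frames per second
var canvas = document.createElement("canvas");
var ctx = canvas.getContext("2d");

images.forEach(function (img) {
    canvas.width = img.width;
    canvas.height = img.height;
    ctx.drawImage(img, 0, 0);
    encoder.add(canvas); // Whammy accepts a canvas (or a WebP data URL)
});

var blob = encoder.compile(); // a WebM Blob
document.querySelector("video").src = URL.createObjectURL(blob);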
Just a follow-on from @michal's answer: Whammy is no longer maintained; however, there's a modern fork of the Whammy encoder at ts-whammy.
See this answer to get a data URL for an image
import tsWhammy from "ts-whammy/src/libs";
// images can come from: canvas.toDataURL(type, encoderOptions)
const images = [
    "data:image/webp;base64,UklGRkZg....",
    "data:image/webp;base64,UklGRkZg....",
];
// Make a 5 second video
const blob = tsWhammy.fromImageArrayWithOptions(images, { duration: 5 });
console.log(blob.type, blob.size);

Live video broadcasting

I am going to develop a chat-based application for mobile which allows video chat. I am using HTML5, JavaScript and PhoneGap. Using PhoneGap, I am able to access the mobile camera, capture a video, save the video and upload it to a server. I have done it for Android. But I need live broadcasting of the video. Is there any solution for that?
Note: It is not an Android native app.
You didn't specify what facility you're currently using for the video capture. AFAIK, the current WebView doesn't yet support WebRTC, the W3C standard that will soon enable you to access video frames in your HTML5 code. So I'm assuming you're using PhoneGap's navigator.device.capture.captureVideo facility.
On Android, captureVideo creates 3gp files. The problem with 3gp is that it cannot be streamed or played while capturing: the MOOV atom of the file is required for parsing the video frames in it, and it is written only after all frames in the file have been encoded. So you must stop the recording before you can make any use of the file.
Your best shot in HTML5 is to implement a loop that captures a short clip (3-5 seconds?) of video, then sends it to a server while the next chunk is being captured (a rough sketch follows below). The server will need to concatenate the clips into a single file that can be broadcast with a streaming server. This will add several seconds to the latency of the broadcast, and you are quite likely to suffer from lost frames in the gap between two separate chunk captures. That might be sufficient for some use cases (security cameras, for example).
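A rough sketch of such a loop, assuming PhoneGap's media capture plugin (uploadClip is a hypothetical helper that POSTs the finished file to your server, e.g. via FileTransfer; note that the duration option is not honored on every platform):

function captureLoop() {
    navigator.device.capture.captureVideo(
        function (mediaFiles) {
            captureLoop();             // start recording the next chunk right away
            uploadClip(mediaFiles[0]); // upload the finished clip in parallel
        },
        function (err) {
            console.log("capture error: " + err.code);
        },
        { limit: 1, duration: 5 }      // one clip of at most ~5 seconds
    );
}
captureLoop();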
If your application is such that you cannot afford to lose frames, I see no other option but to implement the video capture and streaming in Java, as a PhoneGap Plugin.
See Spydroid: http://code.google.com/p/spydroid-ipcamera/
It uses the solution with the special FileDescriptor you found. Basically, they let the video encoder write an .mp4 with H.264 to the special file descriptor that calls your code on write. Then they strip off the MP4 header and turn the H.264 NALUs into RTP packets.

HTML5 Video: Detecting Bandwidth

I have a 1080p video that I'm displaying in an HTML5 <video> tag on my page.
Is there a simple(ish) javascript method of detecting bandwidth so I can switch out the video for lower quality versions if the user's connection is too slow to stream the video? Similar to the logic behind YouTube's 'auto' video size chooser.
Depending on what player and encoding platform you are using, you may be able to use HLS encoding for your videos. HLS stands for HTTP Live Streaming, a protocol developed by Apple primarily to solve this problem (among others).
HLS basically breaks your video file into multiple small files so they can be "pseudo" streamed using a simple web server. With HLS you can also encode in multiple resolutions, and a player may be able to switch to a lower or higher bitrate to match the available bandwidth.
The only downside is that most of the players use Flash to play HLS encoded content. Check it out in action here: http://www.flashls.org/latest/examples/chromeless/
Here's HLS demo for flowplayer:
http://demos.flowplayer.org/basics/hls.html
And here is a plugin for VideoJS:
https://github.com/videojs/videojs-contrib-hls
To encode in HLS, you can either use ffmpeg for free and upload the files to your server (see the sketch after these links):
https://www.ffmpeg.org/ffmpeg-formats.html#hls-1
Or, you can use a cloud-based solution like AWS Elastic Transcoder or Brightcove:
https://aws.amazon.com/elastictranscoder/
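For the ffmpeg route, a minimal sketch (my own example; shown here from Node.js to match the rest of this page, though the equivalent shell command works just as well). It transcodes input.mp4, a hypothetical source file, into a playlist plus ~10-second .ts segments you can upload to any web server:

const { spawn } = require("child_process");

spawn("ffmpeg", [
    "-i", "input.mp4",         // hypothetical source file
    "-c:v", "libx264", "-c:a", "aac",
    "-hls_time", "10",         // target segment length in seconds
    "-hls_list_size", "0",     // keep every segment in the playlist (VOD)
    "playlist.m3u8",
], { stdio: "inherit" });

For the bitrate switching discussed above, you would run this once per rendition (different resolutions/bitrates) and reference the resulting playlists from a master playlist.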
In Google Chrome, at least, there are these properties on a video element:
webkitVideoDecodedByteCount: 0
webkitAudioDecodedByteCount: 0
These should be enough to determine how fast the client can decode the video. As the video plays, you keep track of the delta of these byte counts, which gives you the bytes per second at which the client is processing the video.
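A sketch of that sampling loop (Chrome/WebKit only, since the properties are prefixed; the one-second window is an arbitrary choice):

var video = document.querySelector("video");
var lastBytes = 0;

setInterval(function () {
    var bytes = (video.webkitVideoDecodedByteCount || 0) +
                (video.webkitAudioDecodedByteCount || 0);
    var bytesPerSecond = bytes - lastBytes; // delta over the 1 s window
    lastBytes = bytes;
    console.log("decoding ~" + bytesPerSecond + " bytes/s");
}, 1000);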
For a more up-to-date answer: MPEG-DASH is in the process of replacing HLS. HLS is used mainly in iOS land; most desktop browsers do not plan to support it, and DASH is the standard they are moving toward. (However, there are plenty of players designed to let you use HLS with the HTML5 video element, like hls.js.) DASH players include Bitmovin, Google's Shaka, and more. Many people currently encode for both HLS and DASH. HLS also supports fragmented mp4. Please note that you will need to encode your videos correctly server-side. Additional resource: http://www.streamingmedia.com/Articles/Articles/Editorial/Featured-Articles/The-State-of-MPEG-DASH-2016-110099.aspx
I hate that feature! It's usually wrong, and if I want to wait 2 hours to load my dang video, then wait I shall! There's no reliable way to accurately measure this without sending a large dummy file to the user and measuring the time it takes to reach him.
You should count on user input (and remember it correctly! Also unlike YouTube!).
In short, don't take YouTube as an example.
There are paid services that may give you an indication of what bandwidth the other party is on, like netspeed.
The data accuracy may be enough for you, but I haven't had the chance to test it for myself.

Possible to stream live video without using RTSP?

Is it possible to live stream video (and audio) without using the RTSP protocol? Today I tried out Adobe's Flash Media Server and the free alternative Red5. Both seemed like a bit of overkill (plus I had issues with Red5 not supporting AAC audio).
Basically, I'm looking for a way to upload live video to my server so it can be viewed using jwplayer, and then stored so it can be viewed later. Does MP4 support live streaming, so that I can record it client-side and then upload it for viewing?
I've been experimenting with uploading JPEG images and using an HTML5 canvas to display them so it appears like a video.
Here's my code (using only a few images):
http://jsfiddle.net/QM5EV/
There are several things wrong with it. For one, it's not efficient, because it requires huge numbers of JPEGs to be uploaded. And most importantly, there's no audio.
What would be best to do? Is RTSP the only sensible choice? Thanks. :)
Live via HTTP servers is, for the most part, not an option. But there is Apple's "HTTP Live Streaming" (HLS), which delivers MPEG-TS segments, although that limits your clients to iOS devices. It uses a plain ol' web server. (This seems to be changing: increasingly, desktop browsers are supporting it, but it will probably take some time before it is commonplace.)
For online streaming, RTSP is the best solution. There are other protocols, such as RTMP (http://en.wikipedia.org/wiki/Real_Time_Messaging_Protocol), but you can transmit any multimedia content using RTSP.
Another thing is that you can make a streaming server accept HTTP redirect requests. Thus, instead of URLs like rtsp://mydomain.com:554/myfile.mp4 you can have URLs like http://mydomain.com/myfile.mp4.
Regards!
