How to stream a raw H.264 video in jPlayer? - javascript

jPlayer supports MP4. But I have a server that streams raw H.264 video. Is it possible to stream it directly on the client side using jPlayer? If yes, please tell me how I should do it.
If not, how do I put the video into an MP4 container?
Or is there any other JS library or jQuery plugin that can be used to display the H.264 stream?

You'd be better off putting your video stream into a container.
If your video stream is already recorded, then MP4 is a good choice.
You can wrap your video stream using ffmpeg or maybe MP4Box; see the example invocations below.
For playback in the browser you can use the HTML5 video tag or your jPlayer.
If you are live streaming, wrap your stream into MPEG-DASH and use dash.js for playback.
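For the remux step, two example invocations (assuming a raw Annex B H.264 elementary stream at 25 fps; a raw stream carries no timing information, so the frame rate has to be supplied by hand):

ffmpeg -framerate 25 -i input.h264 -c copy output.mp4
MP4Box -add input.h264:fps=25 output.mp4

The -c copy flag tells ffmpeg to rewrap the stream without re-encoding, so the operation is fast and lossless.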

There is a sample showing how to generate MPEG-DASH content using open-source tools like x264 and MP4Box: http://www.dash-player.com/blog/2014/11/mpeg-dash-content-generation-using-mp4box-and-x264/

Related

How do I play a stream of H.264 NAL units in a video tag with MSE?

The situation is pretty straightforward; I am receiving a stream of NAL units via WebSockets. How do I feed them into an HTML5 video tag using MSE?
Research indicates that I should mux the data into a fragmented MP4, but I haven't found any specifics on how to accomplish that. Does anyone have specifics?
If you receive stream data such as HLS or raw H.264 NAL units, you can transmux it into a fragmented MP4 (fMP4). Then hook the HTML5 video tag up to MSE: create a MediaSource, attach it to the video element, add a buffer with mediaSource.addSourceBuffer(), and push segments with sourceBuffer.appendBuffer(). The video will play as long as valid fMP4 segments keep being fed into the buffer.
You may check out https://github.com/ChihChengYang/wfs.js which demonstrates transmuxing NALU H.264 streams from a WebSocket. It works directly on top of a standard HTML5 video element and MSE.
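As an illustration, a minimal MSE sketch assuming fMP4 segments arrive over a WebSocket (the codec string and the WebSocket URL are placeholders, not taken from wfs.js):

const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  // The codec string must match the actual stream; "avc1.42E01E" is baseline H.264
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  const ws = new WebSocket("wss://example.com/stream"); // hypothetical endpoint
  ws.binaryType = "arraybuffer";
  const queue = [];

  // appendBuffer() throws if a previous append is still pending,
  // so queue incoming segments and drain the queue on "updateend"
  ws.onmessage = (e) => {
    if (sourceBuffer.updating || queue.length > 0) queue.push(e.data);
    else sourceBuffer.appendBuffer(e.data);
  };
  sourceBuffer.addEventListener("updateend", () => {
    if (queue.length > 0) sourceBuffer.appendBuffer(queue.shift());
  });
});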

manipulating binaryjs video with node-canvas

Is it possible to live-modify a video stream with node-canvas?
I'm using BinaryJS to send binary frames over a WebSocket, and I want to modify the video live with node-canvas.
This is the code I'm using: Webcam Binary.JS Demo,
and this is the library I want to use to modify the video: node-canvas.
Is that possible? If not, what is the best solution for combining multiple video streams (with Node.js) into one video?

HTML5 generating video from images

I'm wondering, since HTML5 and JavaScript work so well together, whether there is a solution in HTML5 to generate a video file from many images?
For example, you can load a video into a canvas and make it appear as a greyscaled video by manipulating the canvas. However, I would like to know if there is a method to generate a video file out of that greyscaled version. That would make sense if you want to send the video via WhatsApp etc.
Thank you
Here we go:
Article: http://techslides.com/convert-images-to-video-with-javascript
Demo: http://techslides.com/demos/image-video/create.html (select multiple images at once)
Code: [just view the source]
You can download the resulting .webm video file.
@K3N's answer mentions building an encoder. Luckily there is one - https://github.com/antimatter15/whammy - a snippet from the article:
You need a video encoder, and today I just happened to stumble on Whammy, a real-time JavaScript WebM encoder.
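For reference, a minimal sketch of the Whammy API as documented in its README (the frames come from a canvas whose toDataURL("image/webp") is supported, which in practice means Chrome):

// Encode canvas frames into a WebM blob at 15 fps
const encoder = new Whammy.Video(15);
encoder.add(canvas);                        // call once per frame
const webmBlob = encoder.compile();         // returns a Blob of type video/webm
video.src = URL.createObjectURL(webmBlob);  // play it back, or offer it as a download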
There is currently no built-in API to do video encoding in HTML5. There is work in progress, though, to allow basic video and audio recording - but it's not available at this time (audio recording is available in Firefox, and it is limited to streams).
If you are OK with a GIF animation, you can encode the frames as a GIF using one of the encoders out there (see the resource list below and the sketch that follows).
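For instance, with gif.js (one in-browser encoder of that kind; the API below is from its README, and the worker script path is an assumption about where you host it):

// Encode canvas frames into an animated GIF in the browser
const gif = new GIF({ workers: 2, quality: 10, workerScript: "gif.worker.js" });
gif.addFrame(canvas, { delay: 100, copy: true }); // call once per frame, 100 ms apart
gif.on("finished", (blob) => {
  window.open(URL.createObjectURL(blob)); // show the finished GIF
});
gif.render();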
For video, there have been attempts, more or less successful (the project I had in mind does not seem to be available anymore), but there have been issues from one browser to another.
There is the option of building an encoder yourself, low-level style, following the video encoding and file format specifications. It's doable, but it's not a small project.
In any case, encoding video is a pretty performance-hungry task even for natively compiled applications. Running such a task in the browser will be an even slower process, probably not practical for many users (and hard on mobile batteries).
The better approach IMO (at the moment at least, until the aforementioned API becomes available) is to send the images to a server and have the server handle the encoding jobs, then send the result back to the client. This way you can use multi-threading, offload the client, use natively compiled encoders such as ffmpeg, and the resulting video can be streamed back.
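A minimal sketch of the client side of that server-based approach, assuming a hypothetical /encode endpoint that accepts the frames and returns the finished video:

// Capture each canvas frame as a PNG blob and POST the batch to the server,
// where a natively compiled encoder (e.g. ffmpeg) assembles them into a video
async function sendFramesForEncoding(canvases) {
  const form = new FormData();
  for (let i = 0; i < canvases.length; i++) {
    const blob = await new Promise((resolve) =>
      canvases[i].toBlob(resolve, "image/png"));
    form.append("frames", blob, "frame-" + String(i).padStart(5, "0") + ".png");
  }
  const response = await fetch("/encode", { method: "POST", body: form }); // hypothetical endpoint
  return response.blob(); // the encoded video sent back by the server
}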
Some resources
MediaStream Recording API
Gif encoder 1
Gif encoder 2 (NodeJS)
HTML5 Video recording information and status
Realtime video encoder (NodeJS/ffmpeg)
libvpx (requires emscripten/asm.js)
Hi, I have built this using the code provided by techslides.
I also made a template application that takes a list of images and turns them into a video; you have to edit the code according to your own needs. It is only supported in Chrome, though. Basically, in a JavaScript file you draw the images onto a canvas, then turn the canvas frames into a video using the whammy.js function. You need to set an event listener and load the result into a video tag. Whammy.js only produces a WebM file. To turn it into MP4:
Load it into YouTube, then download it from YouTube as MP4. Hope it helps.
Just a follow-on from @michal's answer: whammy is no longer maintained, but there is a modern fork of the Whammy encoder at ts-whammy.
See this answer for how to get a data URL for an image.
import tsWhammy from "ts-whammy/src/libs";

// The images can come from canvas.toDataURL(type, encoderOptions)
const images = [
  "data:image/webp;base64,UklGRkZg....",
  "data:image/webp;base64,UklGRkZg....",
];

// Make a 5-second video
const blob = tsWhammy.fromImageArrayWithOptions(images, { duration: 5 });
console.log(blob.type, blob.size);

HTML5 Video tag format

So I'm trying to write a server to stream video to a client in HTML5/JavaScript, and I'd like to use the already existing framework of the HTML5 video tag if that's possible. That being said, I can't find a good source for what the format of the stream is. I think it works using progressive downloading (is this how YouTube works as well?), but I can't find what the header of any given progressive-download packet should look like.
Can someone point me in the direction of some information about the actual format of the video tag stream? I'm also not 100% devoted to the idea of using the video tag, so if someone has a better alternative, that'd also be great!
You just need an HTTP server such as Apache or Nginx to serve the progressive download of MP4 video files. There is no special packet format: the browser simply issues HTTP Range requests to buffer and seek, so the server only needs to support byte ranges, and the MP4's MOOV atom should sit at the front of the file (for example via ffmpeg's -movflags faststart) so playback can start before the download finishes.
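To make the mechanics concrete, here is a minimal sketch of the same behavior in Node.js (the file path and port are examples):

// Serve a single MP4 file with support for HTTP Range requests,
// which is what the video tag uses to seek and buffer
const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  const path = "./video.mp4"; // example file
  const { size } = fs.statSync(path);
  const range = req.headers.range;
  if (range) {
    // e.g. "bytes=1000-" : answer with a 206 and the requested slice
    const [startStr, endStr] = range.replace(/bytes=/, "").split("-");
    const start = parseInt(startStr, 10);
    const end = endStr ? parseInt(endStr, 10) : size - 1;
    res.writeHead(206, {
      "Content-Range": "bytes " + start + "-" + end + "/" + size,
      "Accept-Ranges": "bytes",
      "Content-Length": end - start + 1,
      "Content-Type": "video/mp4",
    });
    fs.createReadStream(path, { start, end }).pipe(res);
  } else {
    res.writeHead(200, { "Content-Length": size, "Content-Type": "video/mp4" });
    fs.createReadStream(path).pipe(res);
  }
}).listen(8080);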

Live video broadcasting

I am going to develop a chat-based application for mobile that allows video chat. I am using HTML5, JavaScript and PhoneGap. Using PhoneGap, I am able to access the mobile camera, capture a video, save it and upload it to a server. I have done this for Android. But I need live broadcasting of the video. Is there any solution for that?
Note: it is not an Android native app.
You didn't specify what facility you're currently using for the video capture. AFAIK, the current WebView doesn't yet support WebRTC, the W3C standard that will eventually let you access video frames in your HTML5 code. So I'm assuming you're using PhoneGap's navigator.device.capture.captureVideo facility.
On Android, captureVideo creates 3GP files. The problem with 3GP is that the file cannot be streamed or played while capturing: the MOOV atom of the file is required for parsing the video frames in it, and it is written only after all frames in the file have been encoded. So you must stop the recording before you can make any use of the file.
Your best shot in HTML5 is to implement a loop that captures a short clip (3-5 seconds?) of video, then sends it to a server while the next chunk is being captured; see the sketch below. The server will need to concatenate the clips into a single file that can be broadcast with a streaming server. This will add several seconds to the latency of the broadcast, and you are quite likely to lose frames in the gap between two separate chunk captures. That might be sufficient for some use cases (security cameras, for example).
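A rough sketch of that capture-and-upload loop, using PhoneGap's media capture and file transfer plugins (the upload URL is hypothetical, the duration option is not honored on every platform, and the camera UI may reappear for each clip, so treat this as an outline rather than a drop-in solution):

function captureLoop() {
  navigator.device.capture.captureVideo(
    function (mediaFiles) {
      captureLoop();              // start recording the next chunk right away
      uploadClip(mediaFiles[0]);  // upload the finished clip in parallel
    },
    function (err) { console.log("capture error: " + err.code); },
    { limit: 1, duration: 5 }     // one clip of about 5 seconds
  );
}

function uploadClip(mediaFile) {
  var ft = new FileTransfer();    // cordova-plugin-file-transfer
  ft.upload(mediaFile.fullPath, encodeURI("https://example.com/upload"),
    function () { console.log("clip uploaded"); },
    function (err) { console.log("upload error: " + err.code); });
}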
If your application is such that you cannot afford to lose frames, I see no other option but to implement the video capture and streaming in Java, as a PhoneGap Plugin.
See Spydroid: http://code.google.com/p/spydroid-ipcamera/
It uses the solution with the special FileDescriptor you found: they let the video encoder write an MP4 with H.264 to a special file descriptor that calls your code on write, then strip off the MP4 header and turn the H.264 NALUs into RTP packets.
