Encoding raw H.264 data to the browser via DASH - javascript

I have a live stream of raw H.264 (no container) coming from a remote webcam. I want to stream it live in the browser using DASH. DASH requires creating an MPD file (and segmentation). I found tools (such as MP4Box) that accomplish that for static files, but I'm struggling to find a solution for live streams. Any suggestions - preferably using Node.js modules?
Threads I have checked:
MP4Box - on the one hand I saw this comment, which states "You cannot feed MP4Box with some live content. You need to feed MP4Box -live with pre-segmented chunks." On the other hand there are a lot of people pointing to this Bitmovin tutorial, which does implement a solution using MP4Box. In the tutorial they are using MP4Box (which has a Node.js API implementation) and x264 (which doesn't have a Node.js module? Or is it contained in ffmpeg/MP4Box?)
nginx - nginx has a module that supports streaming to DASH using RTMP, for example in this tutorial. I prefer not to go down this path - as mentioned, I'm trying to do it all in Node.js.
Although I have read a couple of posts with a similar problem, I couldn't find a suitable solution. Help would be much appreciated!

The typical architecture is to send your live stream to a streaming server, which then does the heavy lifting to make the stream available to other devices using streaming protocols such as HLS and DASH.
The client devices then connect to the server rather than to your browser.
This allows the video to be encoded and packaged to reach as many devices as possible, with the server doing any transcoding necessary and potentially also creating different bit rate versions of your stream to allow for different network conditions, if you want to provide that level of service.
The typical structure is an encoded stream (e.g. H.264 video), packaged into a container (e.g. fragmented MP4) and delivered via a streaming protocol such as HLS or DASH.
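If you want to keep everything inside a Node.js process, one common workaround is to spawn ffmpeg as a child process and let its dash muxer do the live segmentation and MPD generation. A minimal sketch, assuming ffmpeg is installed on the server and that your raw H.264 stream can be piped to stdin; getCameraStream is a placeholder for your actual source, and the muxer flags may vary between ffmpeg versions:

```js
const { spawn } = require('child_process');

// Placeholder: however you receive raw Annex-B H.264 from the remote webcam.
const camera = getCameraStream();

// ffmpeg reads raw H.264 on stdin, copies the video (no re-encode) and
// writes a live MPD plus .m4s segments that dash.js can play from ./public.
const ffmpeg = spawn('ffmpeg', [
  '-f', 'h264',          // input is raw H.264, no container
  '-framerate', '30',    // raw H.264 carries no timing, so declare a rate
  '-i', 'pipe:0',        // read from stdin
  '-c:v', 'copy',        // repackage only, no transcoding
  '-f', 'dash',          // DASH muxer: emits .mpd manifest + segments
  '-seg_duration', '4',  // 4-second segments
  '-window_size', '5',   // keep 5 segments in the live manifest window
  '-streaming', '1',     // write segments incrementally for lower latency
  'public/live.mpd',
]);

camera.pipe(ffmpeg.stdin);
ffmpeg.stderr.on('data', (d) => process.stderr.write(d));
```

The resulting manifest and segments can then be served by any static file server (Express, for instance) and played in the browser with dash.js.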

Related

Can javascript MSE play segmented mp4 from the middle?

In my current project I have a video stream that ffmpeg encodes to a segmented MP4. That encoded data is piped into an application that sends it to whoever connects to that application through a WebSocket. When a client connects I make sure to send the ftyp and moov boxes first, and then send the most recent segments received from ffmpeg.
On the client side I just pass all binary data from the WebSocket to MSE.
The problem I am facing is that this works if the client is connected from the very start and gets all the fragments that ffmpeg pipes out, but it does not work if the client connects after ffmpeg has sent its first few fragments.
My question is:
Is it possible for MSE to play a fragmented mp4 from the middle when it is also provided the init segments?
If it is possible then how would that need to be implemented?
If it isn't possible, then what format would allow me to stream live video over a WebSocket?
Is it possible for MSE to play a fragmented mp4 from the middle when it is also provided the init segments?
Yes, this is exactly what fragmented (segmented) MP4 was designed to do.
If it is possible then how would that need to be implemented?
The way you describe your implementation is correct: send the init segment, followed by the most recent AV fragments. One thing worth checking is that the first media fragment a late joiner receives starts with a keyframe, or the decoder has nothing to build on. If that is in order, then you have a different problem or a bug in your implementation.
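For reference, a minimal sketch of the relay side of this design, assuming the ws module and an fMP4 byte stream from ffmpeg on stdin. The box handling is deliberately simplified: it assumes the whole init segment arrives in the first chunk and that later chunks align to fragment boundaries, which a real implementation must enforce by parsing box headers:

```js
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

let initSegment = null; // ftyp + moov, cached for late joiners
const recent = [];      // sliding window of the latest moof+mdat fragments

process.stdin.on('data', (chunk) => {
  if (!initSegment) {
    // Simplifying assumption: the first chunk is the complete init segment.
    initSegment = chunk;
  } else {
    recent.push(chunk);
    if (recent.length > 10) recent.shift(); // keep a short live window
  }
  // Relay every chunk to currently connected clients.
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(chunk);
  }
});

wss.on('connection', (client) => {
  // Late joiners get the init segment first, then the newest fragments,
  // so MSE can initialize its SourceBuffer and start from the live edge.
  if (initSegment) client.send(initSegment);
  for (const frag of recent) client.send(frag);
});
```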

Encrypting the Video file in SD-CARD in react native

I was able to successfully encrypt and decrypt the videos using AES encryption. It worked well for smaller videos, while for the bigger files it gave us an out-of-memory exception. Is there a better way to safeguard the video files so that only my application has access to them?
I am using the "node-forge" library.
How are video streaming apps like Netflix and Amazon Prime securing their videos locally, so that they are accessible only through their apps? If they are decrypting the whole file, how is the process so fast?
I was just wondering whether we could just corrupt the file and then un-corrupt it while converting to base64?
EDIT:
This is an e-learning application where videos are accessed securely from the SD card. These videos should be secured and playable only in our app.
You need to design your security measures based on your requirements, which is a very complex process with a lot of details to consider. On the one hand you need to design a suitable protocol for your application, and on the other hand you should try to make it secure.
As for suitability of the design: for example, you need to consider how you are going to play back your video and how much disk/memory you have. Services like Netflix, which play back video while downloading, probably use streaming modes of encryption algorithms. But as I said, without understanding the complete design of your application, recommending specific encryption methods would be irresponsible.
Update:
If simple encryption is what you need, I suggest you use a streaming mode (like CTR). In that case, you can decrypt your content on the fly rather than completely decrypting your files first. But you need to feed this content into your player, which may be a problem if you have not written your own player. I once did this by hooking the file read/write APIs to do exactly what you need, so it is possible.
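To illustrate the CTR idea (using Node's built-in crypto module for brevity; React Native would need a native crypto module or node-forge, but the principle is identical): because CTR is a stream mode, an arbitrarily large file can be encrypted and decrypted chunk by chunk without ever being held in memory whole.

```js
const crypto = require('crypto');
const fs = require('fs');

// In practice the key/IV would come from your app's secure storage.
const key = crypto.randomBytes(32); // AES-256 key
const iv = crypto.randomBytes(16);  // initial counter block for CTR

// Phase 1 - encrypt once, streamed chunk by chunk.
fs.createReadStream('video.mp4')
  .pipe(crypto.createCipheriv('aes-256-ctr', key, iv))
  .pipe(fs.createWriteStream('video.enc'))
  .on('finish', () => {
    // Phase 2 - later, decrypt on the fly: plaintext is produced as a
    // stream and can be fed straight to a player, so the full file never
    // sits decrypted in memory or on disk.
    const plain = fs
      .createReadStream('video.enc')
      .pipe(crypto.createDecipheriv('aes-256-ctr', key, iv));
    plain.on('data', (chunk) => {
      // hand each decrypted chunk to the player
    });
  });
```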
As you have said that you successfully encrypted the smaller video files with AES, I suggest you break all your files down into small chunks (for example 512 KB/1 MB parts, named file1.part1 and so on, or even custom names so that only your app knows which part is which) and then encrypt each one. During decryption, decrypt each part one after the other to reassemble the whole file, or, if you can create a custom player, do this on the fly.
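A rough sketch of that chunking approach using node-forge (which the question already uses). The chunk size and naming scheme are arbitrary choices here, and each chunk gets its own IV so parts can be decrypted independently:

```js
const forge = require('node-forge');
const fs = require('fs');

const CHUNK_SIZE = 512 * 1024;             // 512 KB per part - arbitrary
const key = forge.random.getBytesSync(16); // AES-128 key

// Split the source file into parts and encrypt each one separately,
// so no single encryption call ever touches the whole file at once.
function encryptInChunks(srcPath, destPrefix) {
  const data = fs.readFileSync(srcPath); // use a read stream for huge files
  let part = 1;
  for (let off = 0; off < data.length; off += CHUNK_SIZE, part++) {
    const iv = forge.random.getBytesSync(16); // fresh IV per chunk
    const cipher = forge.cipher.createCipher('AES-CBC', key);
    cipher.start({ iv });
    cipher.update(
      forge.util.createBuffer(
        data.slice(off, off + CHUNK_SIZE).toString('binary')
      )
    );
    cipher.finish();
    // Store the IV next to the ciphertext so each part is self-contained.
    fs.writeFileSync(
      `${destPrefix}.part${part}`,
      Buffer.concat([
        Buffer.from(iv, 'binary'),
        Buffer.from(cipher.output.getBytes(), 'binary'),
      ])
    );
  }
}
```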

How to get audio metadata using a MEAN.js application

I want to upload an audio file (either .wav or .flac) only if it matches a certain sample rate, channel count, and bit rate, but I have a few doubts about implementing this functionality:
1) Is it possible with client-side scripting like AngularJS?
2) If it's not possible with AngularJS, then is it possible to get the metadata on the server with Node.js first and upload only if it matches the criteria?
Please let me know if you need more information.
Have you tried working with the Web Audio API? You don't necessarily need AngularJS for this, and it's supported in all major browsers (see Web Audio API browser support).
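A client-side sketch, assuming a file picked via an `<input type="file">`. One caveat: decodeAudioData resamples to the AudioContext's rate, so the decoded sampleRate reflects the context rather than the file; for a .wav file's true rate you can read the header bytes directly, as in the second function (which assumes the canonical 44-byte RIFF/fmt layout; FLAC would need its own STREAMINFO parser):

```js
// Decode the file with the Web Audio API to inspect channels/duration.
async function checkAudio(file) {
  const ctx = new AudioContext();
  const buf = await ctx.decodeAudioData(await file.arrayBuffer());
  return {
    channels: buf.numberOfChannels,
    duration: buf.duration,
    // Caveat: this is the AudioContext's rate, not necessarily the file's.
    contextSampleRate: buf.sampleRate,
  };
}

// Read the fmt chunk of a canonical .wav header for exact values.
async function wavHeader(file) {
  const v = new DataView(await file.slice(0, 44).arrayBuffer());
  return {
    channels: v.getUint16(22, true),      // NumChannels
    sampleRate: v.getUint32(24, true),    // SampleRate
    byteRate: v.getUint32(28, true),      // ByteRate (bit rate / 8)
    bitsPerSample: v.getUint16(34, true), // BitsPerSample
  };
}
```

With either result in hand, the upload can simply be skipped client-side when the values fail your criteria.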

Live video broadcasting

I am going to develop a chat-based application for mobile that allows video chat. I am using HTML5, JavaScript and PhoneGap. Using PhoneGap, I am able to access the mobile camera, capture a video, save it and upload it to a server. I have done this for Android. But I need live broadcasting of the video. Is there any solution for that?
Note: it is not a native Android app.
You didn't specify what facility you're currently using for the video capture. AFAIK, the current WebView doesn't yet support WebRTC, the W3C standard that will soon let you access video frames in your HTML5 code. So I'm assuming you're using PhoneGap's navigator.device.capture.captureVideo facility.
On Android, captureVideo creates 3gp files. The problem with 3gp is that they cannot be streamed or played while capturing: the MOOV atom of the file is required for parsing the video frames in it, and it is written only after all frames in the file have been encoded. So you must stop the recording before you can make any use of the file.
Your best shot in HTML5 is to implement a loop that captures a short clip (3-5 seconds?) of video, then sends it to a server while the next chunk is being captured (see the sketch below). The server will need to concatenate the clips into a single file that can be broadcast with a streaming server. This adds several seconds to the latency of the broadcast, and you are quite likely to lose frames in the gap between two separate chunk captures. That might be sufficient for some use cases (security cameras, for example).
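A rough sketch of that loop with PhoneGap's capture and FileTransfer APIs. Note that the duration option is not honored on all platforms, and the ingest URL is a placeholder:

```js
// Capture short clips in a loop and upload each while the next records.
// There will be a gap (lost frames) between consecutive captures.
function captureLoop() {
  navigator.device.capture.captureVideo(
    function onSuccess(mediaFiles) {
      uploadClip(mediaFiles[0]); // runs async while the next clip records
      captureLoop();             // immediately start the next clip
    },
    function onError(err) {
      console.log('capture failed: ' + err.code);
    },
    { limit: 1, duration: 5 }    // one clip of ~5 seconds per iteration
  );
}

// Upload helper using the FileTransfer plugin; the server must
// concatenate the received clips into one broadcastable stream.
function uploadClip(mediaFile) {
  var ft = new FileTransfer();
  ft.upload(
    mediaFile.fullPath,
    encodeURI('https://example.com/ingest'), // placeholder endpoint
    function () { console.log('clip uploaded'); },
    function (err) { console.log('upload failed: ' + err.code); }
  );
}
```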
If your application is such that you cannot afford to lose frames, I see no other option but to implement the video capture and streaming in Java, as a PhoneGap Plugin.
See Spydroid http://code.google.com/p/spydroid-ipcamera/
It uses the solution with the special FileDescriptor you found. Basically, they let the video encoder write an H.264 MP4 to a special file descriptor that calls their code on write. They then strip off the MP4 header and turn the H.264 NALUs into RTP packets.
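For a feel of what that last step involves, here is a minimal sketch of wrapping one H.264 NALU in an RTP packet (RFC 6184 single NAL unit mode; a real implementation must also fragment large NALUs into FU-A packets and handle RTCP):

```js
const dgram = require('dgram');
const sock = dgram.createSocket('udp4');

let seq = 0;
const SSRC = 0x12345678; // arbitrary stream identifier
const PT = 96;           // dynamic payload type conventionally used for H.264

// nalu: Buffer holding one NAL unit WITHOUT its Annex-B start code.
// timestamp: 90 kHz clock value of the frame this NALU belongs to.
function sendNalu(nalu, timestamp, isLastOfFrame) {
  const header = Buffer.alloc(12);
  header[0] = 0x80;                            // V=2, no padding/ext/CSRC
  header[1] = (isLastOfFrame ? 0x80 : 0) | PT; // marker bit + payload type
  header.writeUInt16BE(seq++ & 0xffff, 2);     // sequence number
  header.writeUInt32BE(timestamp >>> 0, 4);    // 90 kHz timestamp
  header.writeUInt32BE(SSRC, 8);               // SSRC
  sock.send(Buffer.concat([header, nalu]), 5004, '127.0.0.1');
}
```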

Possible to stream live video without using RTSP?

Is it possible to live stream video (and audio) without using the RTSP protocol? Today I tried out Adobe's Flash Media Server and the free alternative Red5. Both seemed like overkill (plus I had issues with Red5 not supporting AAC audio).
Basically I'm looking for a way to upload live video to my server so it can be viewed using jwplayer, and then stored so it can be viewed later. Does MP4 support live streaming, so that I can record it client-side and then upload it for viewing?
I've been experimenting with uploading JPEG images and using an HTML5 canvas to display them so it appears like a video.
Here's my code: (using only a few images)
http://jsfiddle.net/QM5EV/
There are several things wrong with it. For one, it's not efficient, because it requires huge numbers of JPEGs to be uploaded. And most importantly, there's no audio.
What would be best to do? Is RTSP the only sensible choice? Thanks. :)
Live streaming via plain HTTP servers is, for the most part, not an option. The exception is Apple's "HTTP Live Streaming" (HLS), which serves MPEG-TS segments from a plain old web server, although that limits your clients to iOS devices. (This seems to be changing; desktop browsers are increasingly adding support, but it will probably take some time before it is commonplace.)
For online streaming, RTSP is the best solution. There are other protocols, such as RTMP (http://en.wikipedia.org/wiki/Real_Time_Messaging_Protocol), but you can deliver just about any multimedia content using RTSP.
Another option is to have the streaming server accept HTTP requests and redirect them, so that instead of URLs like rtsp://mydomain.com:554/myfile.mp4 you can have URLs like http://mydomain.com/myfile.mp4.
Regards!
