How to stream video from WebRTC to a Facebook RTMP server directly? - javascript

I'm trying to develop a web application with WebRTC. I'm getting video from my webcam through WebRTC, and I want to live stream it to Facebook and YouTube from my browser. I have searched Python and Node.js libraries but haven't found any library for this. I want to build an application like streamyard.com.
I have also looked at ffmpeg.

You can do this using Pion WebRTC and ffmpeg!
I have created a demo here. If you have ffmpeg and the Go compiler installed, this should just work!
This takes audio/video from the browser and constructs a WebM in memory. It then passes this WebM to ffmpeg via a stdin pipe, which transcodes it and sends it to Twitch!
There are a lot of optimizations we could make here (like taking H264 from the browser directly), but H264 isn't supported everywhere, so this just makes the sample easier to reason about.
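If you'd rather stay in Node.js, the same idea can be sketched roughly as below: pipe the browser's WebM into ffmpeg's stdin and let ffmpeg push RTMP. The ingest URL, stream key and the onWebmChunk hook are placeholders, not part of any particular library.

// Hypothetical Node.js sketch: pipe browser-recorded WebM into ffmpeg,
// which transcodes it and pushes it to an RTMP ingest (Facebook/YouTube/Twitch).
const { spawn } = require('child_process');

// Placeholder ingest URL and stream key -- replace with your own.
const RTMP_URL = 'rtmps://live-api-s.facebook.com:443/rtmp/YOUR_STREAM_KEY';

const ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',      // read the WebM from stdin
  '-c:v', 'libx264',   // transcode VP8/VP9 to H.264 for RTMP
  '-preset', 'veryfast',
  '-c:a', 'aac',       // transcode Opus to AAC
  '-ar', '44100',
  '-f', 'flv',         // RTMP expects an FLV container
  RTMP_URL,
]);

ffmpeg.stderr.pipe(process.stderr);

// Whatever receives the WebM chunks from the browser (e.g. a WebSocket
// handler) just writes them into ffmpeg's stdin:
function onWebmChunk(chunk) {
  ffmpeg.stdin.write(chunk);
}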

Related

Is it possible to save a video call via WebRTC or a web socket?

I am learning JavaScript by developing some cool stuff like a video conferencing app. I don't have much understanding of this WebRTC technology, so I am wondering: is it also possible to save to the server a video call that has taken place on a WebRTC-based video conferencing app?
Yes, it is possible. You can record each stream to blobs and push them to your server with a WebSocket. You can then convert the blobs to a WebM file.
Demo: https://webrtc.github.io/samples/src/content/getusermedia/record/
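A minimal browser-side sketch of that approach, assuming a WebSocket server is already listening at ws://localhost:8080/record (the URL and the one-second timeslice are placeholders):

// Record the local stream with MediaRecorder and push each chunk
// to the server over a WebSocket (endpoint is an assumption).
const ws = new WebSocket('ws://localhost:8080/record');

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => {
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

    recorder.ondataavailable = (event) => {
      if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
        ws.send(event.data); // each blob is a WebM fragment
      }
    };

    recorder.start(1000); // emit a chunk roughly every second
  });

// On the server, appending the received chunks in order yields a playable .webm file.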

Streaming video in real-time using a gstreamer-rtsp server to a web page

I am trying to build an application that can consume a video source (either a webcam or an offline video) and stream it in real time to a web page. I have been successful in creating an RTSP stream using GStreamer, but I am unable to receive this stream on the web page without an intermediate step, i.e. converting the stream to a playlist.m3u8 using hlssink or ffmpeg.
I want the stream to be consumed directly by the web page. Also, is using the VLC plugin my only option?
Any help would be much appreciated.
RTSP is not going to work in the browser because most browsers do not support direct RTP streaming. If for some reason HTTP adaptive streaming protocols like HLS do not satisfy your requirements (e.g. latency not low enough), you can try WebRTC, which is, among other things, built on top of secure RTP (SRTP). It probably has a more involved setup than an RTSP server, but it is nowadays supported by all major browsers. You can check out the webrtcbin element for a GStreamer implementation.
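On the browser side, consuming such a stream only needs a plain RTCPeerConnection. A rough sketch, assuming you already have a signaling channel (the WebSocket URL and JSON message shape here are made up) that exchanges the SDP offer/answer and ICE candidates with the webrtcbin pipeline:

// Minimal receiving peer: attach whatever remote track arrives to a <video> element.
// The signaling WebSocket and its message format are assumptions, not part of GStreamer.
const signaling = new WebSocket('ws://localhost:8443/signaling');
const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

pc.ontrack = (event) => {
  document.querySelector('video').srcObject = event.streams[0];
};

pc.onicecandidate = (event) => {
  if (event.candidate) {
    signaling.send(JSON.stringify({ ice: event.candidate }));
  }
};

signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.sdp) {
    // The GStreamer side sends the offer; we answer.
    await pc.setRemoteDescription(data.sdp);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify({ sdp: pc.localDescription }));
  } else if (data.ice) {
    await pc.addIceCandidate(data.ice);
  }
};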
I don't think it's possible, since RTSP is not supported by any browser directly, and plugin support has been removed by most modern browsers.
So the only solution is to convert RTSP to a format supported by browsers.
Thanks for the comments! I was able to make this work using GStreamer's WebRTC example from https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc.

Video streaming to browser on iOS

I have implemented video streaming from a Java server to a website using WebSockets and Media Source Extensions (JavaScript). This works fine in nearly every browser on several operating systems except iOS. I am aware of the fact that MSE is not supported on iOS (yet).
Is there any way to easily enable video streaming for iOS clients using the same (already existing) technology via WebSockets?
I am thinking of something similar to Media Source Extensions, so that I only have to reimplement the client side.
My workflow is:
Create an HTML5 video element and a MediaSource
Create a new web socket and request video data from the server
Transcode video using FFmpeg and stream the result to stdout
Send the binary video data in chunks to the client
Add the binary video data to the SourceBuffer of the MediaSource linked to the HTML5 <video> element (a rough client-side sketch of this is below).
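A minimal sketch of the client side of this workflow (steps 1, 2 and 5), assuming the server sends WebM fragments as binary WebSocket messages; the WebSocket URL and the codec string are placeholders:

// Feed binary WebSocket messages into a MediaSource/SourceBuffer.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  const queue = [];

  const ws = new WebSocket('ws://localhost:8080/stream');
  ws.binaryType = 'arraybuffer';

  // appendBuffer is asynchronous, so queue chunks while an append is in flight.
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  ws.onmessage = (event) => {
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});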
Hoping for any advice.
If needed, you can use the <video> tag. Look under "Provide Alternate Sources"; you can use an HTTP Live Stream.
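A rough sketch of what that fallback could look like, assuming the server also exposes an HLS playlist next to the existing WebSocket path; the paths and the startWebSocketMseStream helper are placeholders:

// Pick a source the current browser can actually play.
const video = document.querySelector('video');
if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = '/stream/playlist.m3u8'; // Safari/iOS: native HTTP Live Streaming
} else {
  startWebSocketMseStream(video);      // other browsers: your existing MSE + WebSocket path
}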

Real-time streaming from Android camera to browser

We are working on an IP camera Android app that should stream the video taken in real time by the Android camera to a web page served by the same app and accessed through WiFi only.
The app currently uses a pseudo-streaming method (an image sent over HTTP with no-store), but it is not robust enough, so we need to replace it with a better streaming method. We also need to support multicast (or at least an optimized "multi-unicast"), and if possible use a UDP protocol (or at least a low-latency TCP protocol).
We cannot use any intermediary server (so no Wowza or the like, unless it is also served by the app) or any browser plugin (so no VLC or the like, unless it is served by the app too). The main browser it will be used on is Chromium.
We have searched for and tried a lot of methods, but none worked for us:
WebRTC sounds cool, but it uses an intermediary signaling server, it doesn't support multicast, and it is kind of heavy for what we want.
RTSP with libstreaming sounds cool too, but no browser seems to implement it, and we couldn't find a JavaScript library to do it.
RTMP works on most browsers, but we couldn't find a working Android library.
Which streaming method would be best for our needs, and do you know of JavaScript and Android libraries implementing them?
There is no way to stream multicast to a browser.

Using websocket to stream in video tag

I'm trying to stream a (WebM or MP4) video from Node.js to HTML5 using WebSockets (the WebSocket library is Socket.IO on both server and client). The browser in use is the latest version of Chrome (version 26.0.1410.64 m).
I saw here that it's possible to push a video stream into the video tag from a file using the MediaSource object.
My idea is to read chunks of data from the WebSocket instead of a file.
Can someone please post an example of using WebSockets to accomplish that, or explain how to do it?
Thanks in advance.
In addition to text (string) messages, the WebSocket API allows you to send binary data, which is especially useful for implementing binary protocols. Such binary protocols can be standard Internet protocols typically layered on top of TCP, where the payload can be either a Blob or an ArrayBuffer.
// Send a Blob (the Blob constructor takes an array of parts)
var blob = new Blob(["blob contents"]);
ws.send(blob);
// Send an ArrayBuffer
var a = new Uint8Array([8, 6, 7, 5, 3, 0, 9]);
ws.send(a.buffer);
Blob objects are particularly useful when combined with the JavaScript File API for sending and receiving files, mostly multimedia files, images, video, and audio.
Also, I suggest looking at WebRTC (a technology associated with WebSockets). Web Real-Time Communication (WebRTC) is another effort to enhance the communication capabilities of modern web browsers. WebRTC is a peer-to-peer technology for the Web, and its first applications are real-time voice and video chat. WebRTC is already a compelling technology for media applications, and there are many sample applications available online that let you test it out with video and audio over the Web. Please check this link.
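For the Node.js side of the original question, a rough Socket.IO sketch that reads a WebM file and emits it in binary chunks; the file name, event names and chunk size are placeholders, and the client would append the chunks to a SourceBuffer much like the MSE example earlier on this page:

// Node.js + Socket.IO: stream a WebM file to connected clients in chunks.
const fs = require('fs');
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  const stream = fs.createReadStream('video.webm', { highWaterMark: 64 * 1024 });

  stream.on('data', (chunk) => {
    socket.emit('video-chunk', chunk); // Socket.IO transmits Buffers as binary frames
  });

  stream.on('end', () => socket.emit('video-end'));
});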
