WebRTC video streaming through a server - javascript

I want to capture a stream on the client side and then relay it from a server to other clients. How can I stream the video through a server to other viewers? Is this possible?

I would like to try and point you in the right direction.
First, let's understand a little more about how WebRTC works.
In WebRTC you typically have a WebSocket-based signaling server, sometimes called the bridge, whose role is to help broker a connection between two or more peers.
Generally speaking, the bridge relays SDP offers and answers between the peers, and the peers use STUN/TURN servers to help establish the connection.
STUN servers are used to establish p2p UDP connections by punching holes through NAT.
If STUN fails to punch a hole (i.e. there is a firewall), a TURN server is used as a hub and spoke (i.e. the data is relayed through the TURN server).
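To make that concrete, here is a minimal sketch of pointing a browser peer at STUN/TURN servers; the URLs, credentials, and the sendOverSignalingChannel helper are placeholders, not real services:

```javascript
// Minimal sketch: configuring a peer with STUN/TURN (placeholder URLs/credentials).
const pc = new RTCPeerConnection({
  iceServers: [
    // STUN: lets the peer discover its public IP/port for hole punching.
    { urls: 'stun:stun.example.org:3478' },
    // TURN: fallback relay when a direct p2p path cannot be punched.
    {
      urls: 'turn:turn.example.org:3478',
      username: 'user',     // placeholder credential
      credential: 'secret', // placeholder credential
    },
  ],
});

// Each ICE candidate found (host, server-reflexive via STUN, or relay via TURN)
// must be sent to the other peer over the signaling channel (the "bridge").
pc.onicecandidate = ({ candidate }) => {
  if (candidate) sendOverSignalingChannel({ candidate }); // hypothetical helper
};
```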
The full WebRTC stack includes video/audio streaming with the VP8/VP9/H.264 codecs, and the data is packaged using RTP.
Luckily for you, there is a Node.js library that implements almost the entire stack:
https://github.com/js-platform/node-webrtc
The library essentially provides you a WebRTC data channel.
There is no support for "Media Streams" and thus I assume you need to build the encoding/decoding and RTP packaging yourself.
However, there is a discussion here on how to stream audio/video with the data channel:
https://github.com/js-platform/node-webrtc/issues/156
Now, to your specific question: how do you stream from a "server"?
Well, WebRTC is generally p2p; however, you could set up a "server peer" and designate it as having a source channel only (i.e. there is no input channel).
This peer then becomes the "server" and all the other peers can view its contents when they connect.
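As a rough sketch of that server-peer idea with node-webrtc, assuming some signaling transport already exists (sendSignal below is a hypothetical helper, and the chunk being sent is a stand-in for your encoded media):

```javascript
// Sketch of a "server peer" with node-webrtc; signaling helpers are hypothetical.
const { RTCPeerConnection } = require('wrtc');

function createServerPeer(sendSignal) {
  const pc = new RTCPeerConnection();

  // Source-only channel: the server sends, viewers only receive.
  const channel = pc.createDataChannel('source');
  channel.onopen = () => {
    channel.send('first media chunk'); // stand-in for encoded/RTP-packaged data
  };

  pc.onicecandidate = ({ candidate }) => {
    if (candidate) sendSignal({ candidate }); // relay via your signaling server
  };

  // Create and send the offer; the viewer answers over the same signaling path.
  pc.createOffer()
    .then((offer) => pc.setLocalDescription(offer))
    .then(() => sendSignal({ sdp: pc.localDescription }));

  return pc;
}
```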
Hope that helps.
Cheers!

Related

WebRTC fails to connect P2P even though peers can send UDP packets to each other

I was under the impression that WebRTC goes to great lengths to achieve P2P connectivity despite NATs. [1][2] That's why I was surprised to learn that WebRTC fails to connect peers in some situations where a P2P connection is easy to achieve. I would like to understand why this happens, and if there is anything I can do to improve the situation.
Why do I claim that P2P is easy to achieve in some of these situations? I claim this because I have set up an experiment with 2 devices on different networks:
Device F is connected to the internet behind Full-cone NAT
Device S is connected to the internet behind Symmetric NAT
I can easily achieve a P2P connection between these devices in the following manner:
Device F binds a connection to a random (unpredictable) port and sends a UDP packet through that port to anywhere on the internet. Because F is behind Full-cone NAT, this packet has "hole-punched" the port open, allowing any external addresses to now send packets through that port. In my case the port that I opened locally appears to be the same as the external port. (If the external port was different from the local port, we could use something like STUN to figure out the external port.)
Device S binds a connection to a random (unpredictable) port and sends a UDP packet through that port to the external IP and port of Device F. This packet can be delivered, because the port was hole-punched open in step 1. After this packet, Device S has hole-punched through its own port, allowing packets from Device F to be sent back through it. Device F will know where to send the packets, because the packet that S sent contains the external IP and port. Because Device S has Symmetric NAT, the hole-punching didn't open the port for all traffic, only for traffic from Device F (the external IP and port of Device F).
I used Python to verify that I am able to open a P2P connection between these devices and send messages in both directions, as described above. I don't understand why WebRTC is unable to achieve a P2P connection like this.
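For comparison, here is a minimal Node.js version of the same hole-punching experiment using the built-in dgram module; the peer address is a placeholder you would fill in from the other device:

```javascript
// Sketch of manual UDP hole punching with Node's dgram module.
// Run a variant of this on each device; the IP/port below are placeholders.
const dgram = require('dgram');

const socket = dgram.createSocket('udp4');

socket.on('message', (msg, rinfo) => {
  console.log(`got "${msg}" from ${rinfo.address}:${rinfo.port}`);
});

socket.bind(() => {
  const { port } = socket.address();
  console.log(`bound to local port ${port}`);

  // Device F: send anywhere to open the full-cone mapping.
  // Device S: send to F's external IP:port (learned out of band or via STUN).
  socket.send('hello', 40000, '203.0.113.1'); // placeholder peer address
});
```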
Why do I claim that WebRTC fails to connect in some of these situations where a connection should be easy? I claim this because I tried to achieve a WebRTC P2P connection with 3 different code examples. All of the examples worked when devices were in the same local network, but didn't work when the devices were in different networks (the setup described above). None of the libraries I tried provided any useful debugging information to figure out what went wrong, and chrome://webrtc-internals didn't provide any useful information either. I also tried in Firefox to verify that this issue is not implementation-specific.
To be specific, I tried the following code examples:
simple-peer, a WebRTC library. I tried the first example code in the README.
PeerJS, another WebRTC library. I tried the demo page they have set up.
A code snippet from a Stackoverflow answer. [3]
In all 3 experiments I also tried to switch which device initiates the connection and which device receives it.
Since this issue isn't specific to WebRTC implementation, and it isn't specific to any particular WebRTC JavaScript library, I'm beginning to suspect that the spec for WebRTC is broken in some fundamental way, preventing WebRTC from achieving P2P connections in situations where they would be easy to achieve. Am I missing something here?
[1] https://webrtc.org/getting-started/turn-server states: "For most WebRTC applications to function a server is required for relaying the traffic between peers, since a direct socket is often not possible between the clients". This gives the impression that WebRTC should be able to achieve a P2P connection in a scenario where it is easy to achieve.
[2] https://webrtcforthecurious.com/docs/03-connecting/ states: "WebRTC will gather all the information it can and will go to great lengths to achieve bi-directional communication between two WebRTC Agents." This also gives the impression that WebRTC should be able to achieve a P2P connection in a scenario where it is easy to achieve.
[3] WebRTC datachannel with manual signaling, example please?

Is there API that can make possible for a browser to PUSH media chunk to another browser

I am a beginner and I am trying to set up a peer-assisted media streaming system that will work in the web browser. I would like a server to 'push' media segments to a few clients, and then have any of these client browsers push media segments to other client browsers.
I learned that HTTP/2.0 can make this possible, but I only found examples of pushing from a server to a client browser.
I came across WebRTC technology; however, I could not find anything like a push technique between client browsers.
I came across WebSocket technology, but found that it only supports pushing from the server to the client.
Kindly direct me.
You should use WebRTC plus a WebSocket (or any other signaling mechanism) to do it.
WebRTC will send the media peer to peer, and the signaling server should manage which connections to make and when. I have already seen similar products on the web.
WebSocket is not fast enough to stream media. WebRTC is far better when it comes to media.
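As a minimal sketch of the peer-assisted part, a browser could receive segments from the server over a WebSocket and forward them to another browser over a WebRTC data channel (the peer connection and its signaling are assumed to be set up elsewhere; the URL is a placeholder):

```javascript
// Sketch: fan out server-pushed media segments to a peer over a data channel.
// Assumes `channel` is an already-open RTCDataChannel to the other browser
// (peer connection and signaling established elsewhere).
const ws = new WebSocket('wss://example.org/segments'); // placeholder server URL
ws.binaryType = 'arraybuffer';

ws.onmessage = (event) => {
  if (channel.readyState === 'open') {
    channel.send(event.data); // forward the segment to the next peer
  }
};

// On the receiving browser, segments could be fed into Media Source Extensions:
// channel.onmessage = (e) => sourceBuffer.appendBuffer(new Uint8Array(e.data));
```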

Is WebRTC without any server not even a signaling server possible?

I'm trying to set up a Cordova plugin for iOS which implements the WebRTC functions without using any server; it will only be used on a local network. I know there is this plugin, which looks promising, but I have some problems with it.
My plan is not to use a TURN, STUN, or any kind of signaling server.
Maybe you are thinking right now: "OK, this is not possible. No signaling equals no connection." But let me explain first. As pointed out here and
here, it's possible to avoid using a TURN, STUN, or ICE server. I think this is a good way to start my project, but there is still an open question: how shall the devices find each other if there isn't any kind of signaling (in the examples they use a Node.js server)? Right now I'm playing with the idea of a QR code which contains all the necessary information.
In the end it should look like this (the black arrows are more important):
The idea is that everyone who comes into a room has to scan a QR code on the Raspberry Pi (RP); the device then knows the IP, port, etc. of the RP, and a WebRTC connection with a DataChannel will be established.
I've been looking for an answer for days now, but because WebRTC is not even natively supported on iOS (or at least partly for that reason), there aren't many WebRTC examples out there which work on iOS, and none for a local network.
So my question is: am I on the right track, or is this not even possible? (I found no example of this anywhere, but if I put together all the posts I have read, I think it should be possible.)
First of all, TURN and STUN are not signaling servers. "Signaling server" is the term normally associated with the backend server that lets you relay messages between two peers before the connection is established; the signaling server is thus used to establish the connection. Once the connection is established, the signaling server plays no further role in the communication, unless you intend to make changes to the connection parameters.
TURN and STUN servers, on the other hand, are used during the connection establishment process. They help the two peers find a direct path to each other. So once the connection is established, the peers can talk directly with each other and no longer need the signaling server to relay messages.
Now coming to your question, short answer is, no, your plan is incomplete.
Here are some changes that you'd need in order to make it work:
A QR code is not adequate to convey all the required information. According to this answer, a QR code can store roughly 4 KB of data at most, which is not sufficient to pass all the candidates.
Not to mention that WebRTC requires both devices to share their candidates, so you'd need a display and a QR code scanner on the Raspberry Pi.
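To make the size and two-way-exchange problems concrete, here is a sketch of what a fully manual (QR-style) exchange would have to carry: each side must produce a complete SDP blob, including every gathered candidate, and hand it to the other side somehow. The transferOutOfBand helper is hypothetical:

```javascript
// Sketch of manual signaling: gather a complete offer, then transfer it
// out of band (the QR code / display / scanner part is the hard bit).
const pc = new RTCPeerConnection(); // no STUN/TURN: local-network candidates only

pc.createDataChannel('chat'); // ensure the offer negotiates a data channel

pc.createOffer()
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => new Promise((resolve) => {
    // Wait until ICE gathering finishes so the SDP contains every candidate.
    if (pc.iceGatheringState === 'complete') return resolve();
    pc.onicegatheringstatechange = () => {
      if (pc.iceGatheringState === 'complete') resolve();
    };
  }))
  .then(() => {
    const blob = JSON.stringify(pc.localDescription);
    console.log(blob.length, 'bytes to transfer'); // often more than a QR holds
    transferOutOfBand(blob); // hypothetical: QR code, display, or similar
  });
```

The other device would then have to produce an answer the same way and send it back, which is why a one-directional QR scan is not enough on its own.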
You might want to explore alternatives such as Wi-Fi to allow two-way data sharing between the device and the Raspberry Pi. Once set up, the Wi-Fi connection can carry the signaling.
I am not well versed in iOS or Raspberry Pi, though, so I would recommend that you ask a separate question about the choice of communication channel if you are unsure what to choose. Keep in mind that the Raspberry Pi needs to be able to communicate with the device for a short period of time in order for the WebRTC connection to be established.
Coming to STUN and TURN servers, you may be able to get away without using them. I have seen a few cases where my app was able to establish a connection to peers within the local network without STUN and TURN servers.
However, I would strongly recommend that you use at least a STUN server. They are often available free of charge; Google and Firefox also provide their own STUN servers that you can use in any of your WebRTC apps. You can search the internet for their details.
TURN servers are required only when the two peers are behind NATs. In such cases STUN servers are sometimes incapable of finding a direct route between them, and you need the TURN server to relay the audio/video/message stream.
Your plan to establish the WebRTC channel between the Raspberry Pi and the phones (the black arrows) seems fine to me. It would help you establish further connections between two phones whenever required.
However, if you eventually decide to implement something like Wi-Fi on your Raspberry Pi, the WebRTC connection may be redundant. After all, you could use Wi-Fi to pass the data back and forth and wouldn't really need an additional WebRTC channel on top of it.
Since you run your app on a local network, you don't need STUN and TURN servers, but you still need a signaling server. Signaling can't be done with QR codes; read more about WebRTC and you will understand why.
But a signaling server can be very simple. Since you have that Raspberry Pi on your local network, you can use it as your signaling server. Just install Node, Express, and Socket.IO on it; you need only one simple JavaScript file (mine is only 23 lines of code). Stop wasting your time with QR codes and you will have your signaling server up and running in no time. You can look at the Google Codelab for an example. Hope this helps you!
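For illustration, a minimal signaling server along those lines might look like this (a sketch, not the answerer's actual 23-line file; the event names are arbitrary):

```javascript
// Minimal Socket.IO signaling relay (sketch; event names are arbitrary).
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

io.on('connection', (socket) => {
  socket.on('join', (room) => socket.join(room));
  // Blindly relay offers, answers, and ICE candidates to the rest of the room.
  socket.on('signal', ({ room, data }) => {
    socket.to(room).emit('signal', data);
  });
});

server.listen(3000, () => console.log('signaling server on :3000'));
```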

WebRTC live video stream node.js

I am searching for a way to stream video with WebRTC. I am very new to WebRTC and have seen a lot of applications on the web that have p2p video chat. The tutorials I follow explain how WebRTC works for the client, but they do not show what backend script to use. And that's exactly what I'm looking for: a backend script (preferably Node.js) that ensures I can stream live video (getUserMedia) to the client.
It's really simple to get started; check out a simple demo here. You need:
1. A WebRTC-supported browser. Chrome and Firefox are the best options at the moment.
2. A signaling server to exchange the media options, e.g. Socket.IO with Node.js.
3. TURN and STUN servers to deal with NAT and symmetric NAT (only if you go public).
4. An MCU, if you want to limit bandwidth usage. It gives you the flexibility of a star network rather than the mesh network of normal p2p.
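To connect these pieces, the browser side of such a setup typically looks something like the sketch below, with Socket.IO as the signaling channel (the event name and server URL are assumptions, and in a real app only one side creates the initial offer):

```javascript
// Browser-side sketch: capture camera, signal over Socket.IO, stream p2p.
// The 'signal' event name and the server URL are placeholders.
const socket = io('https://your-signaling-server:3000'); // hypothetical URL
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
});

// Relay ICE candidates through the signaling server.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) socket.emit('signal', { candidate });
};

// Show the remote peer's video when it arrives.
pc.ontrack = ({ streams }) => {
  document.querySelector('#remoteVideo').srcObject = streams[0];
};

socket.on('signal', async (data) => {
  if (data.sdp) {
    await pc.setRemoteDescription(data.sdp);
    if (data.sdp.type === 'offer') {
      await pc.setLocalDescription(await pc.createAnswer());
      socket.emit('signal', { sdp: pc.localDescription });
    }
  } else if (data.candidate) {
    await pc.addIceCandidate(data.candidate);
  }
});

// Capture the local camera/microphone and offer it to the peer.
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then(async (stream) => {
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));
    await pc.setLocalDescription(await pc.createOffer());
    socket.emit('signal', { sdp: pc.localDescription });
  });
```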

How to implement video conferencing feature inside a website using webRTC?

Recently I was working on a WebRTC project that displays a media stream in the user's browser. However, this was only on the client side. What if I want to stream this media to other users' browsers? As I looked around, I found that it is possible by connecting to peers and setting up signalling servers (STUN & TURN). I went through all the details mentioned in one of the articles on the html5rocks website.
I am making use of simpleWebRTC, but that isn't enough; I have to set up my own signalling server in order to actually be able to video chat.
My question is: what is actually needed in order to implement a live video chat application embedded within a website, apart from the API provided by WebRTC, and how do I set up my own signalling server?
signalmaster was built as a signaling server for simplewebrtc and is used by talky.io. It's a Node application; start it with "node server.js" and then point simplewebrtc at the socket.io endpoint it provides.
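For example, the client-side hookup might look roughly like this (a sketch based on the SimpleWebRTC README; the signalmaster URL, element IDs, and room name are placeholders):

```javascript
// Sketch: pointing SimpleWebRTC at a self-hosted signalmaster instance.
const webrtc = new SimpleWebRTC({
  localVideoEl: 'localVideo',  // <video id="localVideo"> for your own camera
  remoteVideosEl: 'remotes',   // container element for the other participants
  autoRequestMedia: true,
  url: 'https://your-signalmaster.example.org:8888', // placeholder endpoint
});

// Join a room once the local media is ready.
webrtc.on('readyToCall', () => webrtc.joinRoom('my-room'));
```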
STUN and TURN servers are not signaling servers; they just help with punching a hole through NAT. The most popular option is rfc-5766-turn-server; restund performs quite well too.
You should provide more detail about your project to get a good answer. Are you planning on making only browser-to-browser calls? SIP calls? These would be a factor in the signalling server you choose. I went with a SIP signalling server (SIPML5.org) and integrated it with an Asterisk server for call control. This also let me integrate my existing corporate telepresence devices into the PBX. If you want to read up on the basics of signalling and on WebRTC in general, Muaz Khan has done some very good work on it:
https://github.com/muaz-khan/WebRTC-Experiment/blob/master/Signaling.md
