Is a WebRTC conference possible in star topology with an HTML5/JS client? - javascript

Connecting 10 people in a WebRTC audio/video conference requires 90 media streams in a mesh topology: each peer must connect to the 9 other peers, giving 45 peer connections that each carry media in both directions. The more participants there are, the more bandwidth every user consumes.
Is there any way to run a WebRTC conference in a star topology (i.e. conferencing 10 people with only 10 calls) from a browser client, without any hardware such as an MCU?
My requirement is to initiate an audio conference of n people with n calls:
The moderator initiates 3 calls from a WebRTC browser client to different users (A, B, C) over 3 different peer connections. The moderator can now hear and speak with all three. Next, the moderator wants to conference all three together (A's remote audio tracks to B & C, B's audio tracks to A & C, C's audio tracks to A & B), without any new peer connections from A, B, or C.
Is it possible to mix audio tracks in Chrome/Firefox?

Yes, exactly what you describe is possible in Firefox by using the Web Audio API. At the time of writing it is not possible in Chrome, because Chrome cannot create a MediaStreamAudioSourceNode from remote WebRTC streams (I hope this limitation will go away soon). The moderator's browser must therefore be Firefox; the other peers can use other browsers.
This way you can set up a conference call with 10 peers, all of them only connecting to the moderator, thus using only 10 WebRTC connections.
What you have forgotten to mention is that you also have to mix in the moderator's audio for each peer.
With the Web Audio API you can also do some fancy things like per-peer audio visualization, muting, volume control, mixing in audio files, etc.
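For illustration, here is a minimal sketch of the moderator-side mixing, written against the modern API (RTCRtpSender.replaceTrack is newer than this answer). It assumes a hypothetical peers map of peer id to { connection, remoteStream } and the moderator's getUserMedia microphone stream micStream:

// Build, for each peer, a mix of the moderator's mic plus all other peers' audio.
const audioCtx = new AudioContext();

function buildMixFor(targetId, peers, micStream) {
  // A MediaStreamAudioDestinationNode; its .stream is a plain MediaStream.
  const destination = audioCtx.createMediaStreamDestination();

  // Mix in the moderator's own microphone.
  audioCtx.createMediaStreamSource(micStream).connect(destination);

  // Mix in the audio of every other peer.
  for (const [id, peer] of Object.entries(peers)) {
    if (id === targetId) continue; // don't echo a peer's own audio back to them
    audioCtx.createMediaStreamSource(peer.remoteStream).connect(destination);
  }
  return destination.stream;
}

// Send each peer its personalised mix instead of the raw microphone track.
for (const [id, peer] of Object.entries(peers)) {
  const mixedTrack = buildMixFor(id, peers, micStream).getAudioTracks()[0];
  const sender = peer.connection.getSenders().find(s => s.track && s.track.kind === 'audio');
  sender.replaceTrack(mixedTrack);
}

Note that each peer gets a different mix (everyone except themselves), which is why one destination node per peer is needed rather than a single shared mix.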

WebRTC gives you the ability to stream audio somewhere. Whether that "somewhere" is a server or another client is up to you and does not depend on the protocol. For your scenario, what needs to be built is a central box that implements moderator control and performs the audio mixing.

I'm answering my old question (posted when I was an amateur WebRTC programmer).
The solution is to use the Web Audio API, which is available in both Chrome and Firefox.
Here is a great post by webrtcHacks: Mixing Audio in the browser.
It explains everything step by step.

Related

Live Audio Streaming over HTML5/NodeJS

I'm trying to make a website that will serve as a VoIP recorder app. It will take audio from the microphone, transmit the audio to the server and the server only, and the server will then handle redistributing the audio to its connected clients.
Here's what I've tried already:
WebRTC (from what I can tell, it's peer-to-peer only)
MediaRecorder with timeslice, sending the chunks over Socket.IO (only the first chunk is playable, because it alone contains the container header)
MediaRecorder, stopping every few milliseconds, transmitting the audio, and starting again (extremely choppy)
The stack I'm set on is Node.js with Express, but I'm very open to any packages that will help.
As far as feasibility goes, I know it is possible, because Discord wrote in their own blog that they explicitly do not send packets peer-to-peer, since they have large numbers of connected users.
Anyway, I hope someone can help; I've been stuck on this for a while. Thanks!
WebRTC is NOT only P2P. You can put a WebRTC peer on a server (and then have it do the fan-out). This is what all major conferencing solutions do. An SFU is a very popular deployment style; mesh isn't the only thing you can do.
You can go down the MediaRecorder path, but you are going to hit issues with congestion control and backpressure.
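For reference, a minimal sketch of the MediaRecorder-with-timeslice client (the Socket.IO endpoint and the audio-chunk event name are placeholders). The chunks only form a valid WebM stream when concatenated in order on the server, which is why individual chunks after the first are not independently playable:

// Stream microphone audio to the server in ~250 ms chunks over Socket.IO.
const socket = io('https://example.test'); // hypothetical endpoint

async function startStreaming() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });

  recorder.ondataavailable = (event) => {
    // Each chunk is a fragment of one continuous WebM stream, not a standalone file.
    if (event.data.size > 0) socket.emit('audio-chunk', event.data);
  };

  recorder.start(250); // emit a chunk roughly every 250 ms
}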

How does YouTube/Facebook live streaming from the web browser work?

I'm looking at ways to implement a video encoder using the web browser. YouTube and Facebook already allow you to go live directly from the browser; I'm wondering how they do that.
There are a couple of solutions I've researched:
Using WebSockets: the browser encodes the video (using the MediaRecorder API) and pushes the encoded video to the server to be broadcast.
Using WebRTC: the browser acts as a WebRTC peer, and a server acts as the other end, receiving the stream and re-broadcasting (transcoding) it by other means (RTMP, HLS).
Is there any other technique that those guys (YouTube, Facebook) are using, or do they also use one of these?
Thanks
webrtcHacks has a "how does YouTube use WebRTC" post which examines some of the technical details of their implementation.
In addition, one of their engineers gave a talk at WebRTC Boston describing the system, which is available on YouTube.
Correct, you've hit on the two main ways to do this. (Note that for the MediaRecorder method, you can use any mechanism to get the data to the server. WebSockets is one way; so is a regular HTTP PUT of segments. You could even use a data channel of a WebRTC connection to the server.)
Pretty much everyone uses the WebRTC method, as there are some nice built-in benefits:
Low latency (at the cost of some quality)
Dynamic bitrate
Well-optimized on the client
Able to automatically scale output if there are not enough system resources to continue encoding at a higher frame size
The downsides of the WebRTC method:
A ridiculously complicated stack to maintain server-side.
Lower quality (due to the emphasis on low latency, but you can tweak this by fiddling with the SDP yourself)
If you go the WebRTC route, consider GStreamer. If you want to go the WebSocket route, I've written a proxy that receives the data and sends it off to FFmpeg to be copied over to RTMP. You can find it here: https://github.com/fbsamples/Canvas-Streaming-Example
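The WebSocket route boils down to something like this simplified Node.js sketch (not the linked code itself; the port, FFmpeg arguments, and RTMP URL are placeholders):

// WebSocket-to-RTMP proxy: the browser sends MediaRecorder WebM chunks over
// the socket, and FFmpeg transcodes them to RTMP.
const { WebSocketServer } = require('ws');
const { spawn } = require('child_process');

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (socket) => {
  const ffmpeg = spawn('ffmpeg', [
    '-i', 'pipe:0',                           // read WebM from stdin
    '-c:v', 'libx264', '-preset', 'veryfast', // video to H.264
    '-c:a', 'aac',                            // audio to AAC
    '-f', 'flv', 'rtmp://live.example.test/app/streamkey'
  ]);

  socket.on('message', (chunk) => ffmpeg.stdin.write(chunk));
  socket.on('close', () => ffmpeg.stdin.end());
});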

WebRTC live video stream node.js

I am searching for a way to stream video with WebRTC. I am very new to WebRTC. I have seen a lot of applications on the web that have P2P video chat. The tutorials I follow explain how WebRTC works for the client, but they do not show what to use as a backend script. And that's exactly what I'm looking for: a backend script (preferably Node.js) that makes it possible to stream live video (getUserMedia) to clients.
It's really simple to get started; check out a simple demo here.
1. You need a WebRTC-supported browser. Chrome and Firefox are the best options right now.
2. A signalling server to exchange media options. Socket.IO with Node.js works well (see the sketch after this list).
3. TURN and STUN servers to get through NAT and symmetric NAT (only needed if you go public).
4. An MCU, if you want to limit bandwidth usage. It gives you the flexibility of a star network rather than the mesh network of normal P2P.
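As a rough illustration of point 2, a signalling server only needs to relay SDP offers/answers and ICE candidates between browsers; the event names in this minimal Socket.IO sketch are arbitrary:

// Relay signalling messages between peers in a room; the media itself
// flows peer-to-peer and never touches this server.
const { Server } = require('socket.io');
const io = new Server(3000, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  socket.on('join', (room) => socket.join(room));

  // Forward each signalling message to everyone else in the room.
  for (const event of ['offer', 'answer', 'ice-candidate']) {
    socket.on(event, ({ room, payload }) => {
      socket.to(room).emit(event, payload);
    });
  }
});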

Why does the PubNub JavaScript SDK choose XHR over WebSockets?

I'm developing a simple browser-based real-time multiplayer game (2 players per game at the moment). It involves fast, frequent player moves and changes of direction, so information must be exchanged very quickly. I decided to try WebSockets (and would be happy to use the PubNub service instead of self-hosting a socket server).
My problem is that PubNub always decides to use the XHR fallback instead of WebSockets, and I don't know why. Are there any specific requirements that must be fulfilled to run communication over WebSockets? Plain HTTP is obviously too slow and kills the experience. I'm using the latest Chrome on a Mac, so browser compatibility is not an issue.
Or maybe there are so many variables that determine the communication protocol that the question cannot be answered, and my only solution is to use a self-hosted socket server?
Realtime Protocol WebSockets and XHR with PubNub
Modern data stream networks and open source solutions start with XHR. For several reasons, this is optimal to start with, including speed. Performance depends on the speed of light and on how fast Ethernet frames can be transmitted between devices on the internet; this is the foundation of protocol independence and the core determinant of latency and message speed across the internet. The PubNub client SDKs, such as the JavaScript SDK, do not provide a setting to force a particular protocol.
How PubNub Works
See How PubNub Works and scroll down for the mouse demo.
PubNub is the fastest global data stream network available today, with 15 data centers worldwide to support your high-speed, low-latency requirements. Over a quarter of a billion devices connected to the PubNub data stream network experience send/receive speeds of 10 ms to 100 ms per message.
What is Protocol Independence?
The people behind the PubNub Data Stream Network believe in protocol independence and the open mobile web, meaning that we will use the best protocol to get connectivity through any environment. Protocols like WebSockets can get tripped up by cell tower switching, double-NAT environments, and even some anti-virus software or proxy border authorities.
PubNub provides client libraries specifically so we can auto-switch the protocol and remove socket-level complexities making it easy for developers to build apps that can communicate in realtime.
PubNub has used a variety of protocols over time, like WebSockets, MQTT, COMET, BOSH, and long polling, and we are currently prototyping future designs using SPDY, HTTP/2, and others. The bottom line is that PubNub will work in every network environment, with very low network bandwidth overhead as well as low battery drain on mobile devices.
You could also try other cloud services that use WebSockets as the first-option protocol (with XHR fallbacks), like Pusher and Realtime (the company I work for).

Can I use WebRTC to receive a non-standard RTP stream?

I have a piece of software running on a node in my network that generates RTP streams carried over UDP/IP. Those streams contain streaming data, but not in any standard audio/video format (like H.264, etc.). I would like to have a simple Web app that can hook into these streams, decode the payloads appropriately, and display their contents. I understand that it isn't possible to have direct access to a UDP socket from a browser.
Is there a way, using JavaScript/HTML5, to read an arbitrary RTP stream (i.e. given a UDP port number to receive the data on)? The software that sends the stream does not implement any of the signalling protocols specified by WebRTC, and I'm unable to change it. I would just like to get at the RTP packet contents; I can handle the decoding and display without much issue.
As far as I know, there is nothing in the set of WebRTC APIs that will allow you to do this. As you have pointed out, there also isn't a direct programmatic way to handle UDP packets in-browser.
You can use Canvas and the Web Audio API to effectively play back arbitrary video, but this takes a ton of CPU. The MediaSource Extensions can be used to run data through the browser's codecs, but you still have to get the data there somehow.
I think the best solution in your case is to make these connections server-side and use something like FFmpeg to output a stream in a codec and container that your browser can handle, and simply play back in a video element. Then, you can connect to whatever you want. I have done similar projects with Node.js which make it very easy to pipe streams through, and on out to the browser.
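If you do want the raw packet contents in the browser rather than a transcoded stream, the pipe-through idea looks roughly like this Node.js sketch (ports are placeholders, and the header handling ignores CSRC lists and header extensions):

// Receive RTP over UDP and forward the payloads to browsers over a WebSocket,
// where custom JavaScript can decode and display them.
const dgram = require('dgram');
const { WebSocketServer, WebSocket } = require('ws');

const wss = new WebSocketServer({ port: 8080 });
const rtpSocket = dgram.createSocket('udp4');

rtpSocket.on('message', (packet) => {
  const payload = packet.subarray(12); // strip the fixed 12-byte RTP header
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  }
});

rtpSocket.bind(5004); // the UDP port the RTP stream arrives on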
Another alternative is to use WASM and create your own player for your stream. It's pretty incredible technology from these recent years (2014 onwards). Also, as stated by @Brad, WebRTC doesn't support what you need, even as of this year (2020).
