Use node.js as a WebRTC peer?

What modules exist to use node.js as a peer in WebRTC? I'd like to use WebRTC in a more client/server fashion than P2P, for its apparent ability to send packets unreliably (i.e., I want to avoid the latency TCP introduces by guaranteeing ordered, reliable delivery).
If I have to use a stripped-down browser page as the server, that would perhaps work; however, it would really be sub-optimal. Node.js would make things much smoother, and probably more reliable too.
Thanks!

Have a look at the Erizo component of Licode (a WebRTC MCU). It has a stream controller and a WebRTC controller written in C++ with a JS interface. It might help you get an idea or two.

There is now a Node implementation of WebRTC, with the exception of MediaStreams.
https://github.com/js-platform/node-webrtc

There is a C++ interface for WebRTC. WebRTC is based on the libjingle project but uses JSEP (JavaScript Session Establishment Protocol) instead of XMPP for exchanging STUN/TURN information for NAT traversal. The two projects were in the process of being merged when I looked at this a while back, so compiling/linking it was a PITA. This may have improved over the last year.
The goal would be to expose the native WebRTC API as a Node module using the Node addon API and package it as an npm module that works like the in-browser API. The cross-browser polyfill will show you how it should look.
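For reference, the in-browser API such a module would mirror looks roughly like this; the unordered/no-retransmit options are what give you the UDP-like behaviour the question asks about (a rough sketch only, not tied to any particular module):

    // Browser side: a peer connection plus an unreliable, unordered data channel
    // (UDP-like semantics over SCTP/DTLS).
    var pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });
    var channel = pc.createDataChannel('data', { ordered: false, maxRetransmits: 0 });

    channel.onopen = function () { channel.send('hello'); };
    channel.onmessage = function (e) { console.log('got', e.data); };

    // The offer/answer SDP still has to reach the other peer over some
    // signalling channel of your choosing (e.g. a WebSocket).
    pc.createOffer().then(function (offer) { return pc.setLocalDescription(offer); });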
There's a lot of cool stuff you could do with this (call recording, SIP connectors, .torrent extensions for the browser, etc.). I really encourage you to try it!

The most relevant package I've found was http://js-platform.github.io/node-webrtc/. I managed to build it and play with it a little bit. The developer is very helpful; I think it's your best bet right now.

The solution is to use libjingle or Licode/Erizo. Both of them require compilation, but Erizo provides a Node.js interface. Libjingle was created by Google.
Unfortunately, you have to compile each library yourself, and there are no binary packages for Debian, Ubuntu or other platforms.

Take a look at PeerJS: Simple peer-to-peer with WebRTC.
You need PeerJS-server for signaling.
The guide: http://peerjs.com/
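Roughly, a PeerJS data connection looks like the sketch below; the peer ids and the host/port values are placeholders and assume you run your own PeerJS-server locally rather than the hosted cloud service:

    // Connect to your own PeerJS signalling server.
    var peer = new Peer('alice', { host: 'localhost', port: 9000, path: '/' });

    // Dial another peer by its id and exchange messages over a data channel.
    var conn = peer.connect('bob');
    conn.on('open', function () { conn.send('hi from alice'); });

    // Accept incoming connections.
    peer.on('connection', function (incoming) {
      incoming.on('data', function (data) { console.log('received', data); });
    });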

I used Node.js with Socket.IO and had success with it.
There are many tutorials online.
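For what it's worth, a minimal Socket.IO setup looks something like this (note that Socket.IO rides on WebSockets/polling, i.e. TCP, so it won't give you the unreliable delivery asked about above; the port and event names are placeholders):

    // server.js
    var io = require('socket.io')(3000);
    io.on('connection', function (socket) {
      socket.on('frame', function (buf) {
        socket.broadcast.emit('frame', buf); // relay to every other connected client
      });
    });

    // browser (with the socket.io client script loaded)
    var socket = io('http://localhost:3000');
    socket.emit('frame', new Uint8Array([1, 2, 3]));
    socket.on('frame', function (buf) { console.log('frame', buf); });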

Related

Is there a way to connect two browsers without WebRTC?

I want to create a Peer to Peer connection with two browsers without using existing code (mostly). I want to implement the server infrastructure by myself, as well as the client code.
There is just one issue: WebRTC seems to be everywhere.
Don't get me wrong, I'd use it, but since this is for a school project I have to implement almost everything by myself.
Looking at the WebRTC source code, I expected some JavaScript implementations of existing components; however, all I ended up seeing was very complex C++ code intended for web browser developers.
Is it possible to implement a Peer to Peer connection between two browsers without using WebRTC?
For security reasons, browsers do not allow you to make UDP and TCP requests yourself. You need to use one of the three protocols provided by browsers:
HTTP
WebSockets
WebRTC
The C++ code you saw is the underlying implementation that browsers use for WebRTC.
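So without WebRTC, the closest you can get is relaying everything through your own server, for example over WebSockets. A minimal relay in Node might look like the sketch below (the ws package is my assumption here; any WebSocket server library would do):

    var WebSocket = require('ws');
    var wss = new WebSocket.Server({ port: 8080 });

    // Every message from one browser is forwarded to all the others, so the
    // "peers" only ever talk to this server, never directly to each other.
    wss.on('connection', function (ws) {
      ws.on('message', function (msg) {
        wss.clients.forEach(function (client) {
          if (client !== ws && client.readyState === WebSocket.OPEN) {
            client.send(msg);
          }
        });
      });
    });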

Making a node.js application a PEER with WebRTC

So, I have a web app that generates large buffers of color information that I want to send to a node application running on another machine on my local network. WebSockets don't seem to be fast enough for me. I was looking to use UDP, and it seems WebRTC is the only way to do that from a browser. The caveat, it seems, is that WebRTC is only peer-to-peer (browser to browser). I figured I could use node-webkit to emulate being my other "peer". In my node app I could handle the "signaling" and have it set itself up in an RTCPeerConnection to my web app. That way, I could send my data from my web app to my node app (local network). For some context, I have one computer running native software to drive a light fixture, and I want to use a web app to control the lights.
To boil the question down: how can I make an RTCPeerConnection from a browser to a node-webkit app?
Any help would be greatly appreciated.
Thank you!
-Jake
Node-RTCPeerConnection is an attempt (currently a WIP) to create a spec-compliant implementation of RTCPeerConnection for Node.js entirely in JavaScript, with no native C or C++ code. This enables browser peers to speak to non-browser (Node.js) peers.
But you cannot use it in production yet.
Then we also have wrtc (node-webrtc), which provides a native module for Node.js that supports a subset of standards-compliant WebRTC features, specifically the PeerConnection and DataChannel APIs.
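Assuming wrtc builds on your platform, usage mirrors the browser API, so the Node side could be sketched roughly like this (signalling is left out; you would exchange the SDP with the browser over a WebSocket or similar, and the promise-based calls assume a reasonably recent version):

    var wrtc = require('wrtc');

    var pc = new wrtc.RTCPeerConnection();
    var channel = pc.createDataChannel('frames', { ordered: false, maxRetransmits: 0 });

    channel.onopen = function () { channel.send('hello from node'); };
    channel.onmessage = function (e) { console.log('browser sent', e.data); };

    // Create an offer and hand pc.localDescription to the browser peer
    // through whatever signalling channel you have set up.
    pc.createOffer().then(function (offer) { return pc.setLocalDescription(offer); });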
Too many people are having problems with wrtc, since it has to download a lot of source and build it, only to find out after a long while that it fails on certain platforms. Unfortunately, it doesn't come with any prebuilt packages, as described in this issue.
You can use either the Google implementation of WebRTC or a more recent implementation by Ericsson called OpenWebRTC. The developers of OpenWebRTC are very proud of running their implementation on various pieces of hardware, such as the Raspberry Pi and iOS devices.
The one that worked best for me was electron-webrtc (which in turn uses electron-prebuilt for better compatibility). It creates a hidden Electron process (which is based on Chromium, so WebRTC support is great) and communicates with that process to enable WebRTC in Node.js. This does add a lot of overhead, though.
It is intended for use with RTCDataChannels, so the MediaStream API is not supported.
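If I recall its README correctly, the entry point is a function that spawns the hidden Electron process and hands back browser-style constructors; treat the exact names below as an assumption and check the project docs:

    // Assumed API, from memory of the electron-webrtc README; verify against the project docs.
    var wrtc = require('electron-webrtc')();

    wrtc.on('error', function (err) { console.error(err); }); // errors talking to the Electron process

    var pc = new wrtc.RTCPeerConnection();
    var channel = pc.createDataChannel('data'); // DataChannels only, no MediaStreams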
Other resources:
https://github.com/webrtcftw/goals/issues/1
Update 2019
Currently, the best and easiest way to solve this problem is to use the webrtc module. Check the samples for inspiration. This module does what you were looking for; it is implemented with N-API and uses the Canvas module to compose new video from the client stream. Hopefully this will help those who face this problem in the future.

Audio manipulation using node.js

My team has been using the Web Audio API/getUserMedia in a product, and we are doing really well with our Chrome and Firefox users. But there is still a large base of users that we would love to reach and, due to technology barriers, still can't (mostly IE users), as their main browser does not support the technology and they will not or cannot change to a modern browser.
We are planning to reach those users, but we don't want to resort to Flash, Flex, Silverlight or anything similar.
So, thinking about solutions, I thought that maybe I could get around this difficulty if I moved the audio manipulation from the browser to the server. Node.js was the first answer when trying to figure out how to do it.
Would it be possible to be done using NodeJS? Are there any libraries available that would help us accomplish this? Are there any other technologies that would allow me to do this?
Thanks to anyone who can help.
It could easily be done. Node is simply an IO engine designed for rapid response. If it needs to happen in real time, then I imagine latency would be a usability-breaking issue due to networking constraints. If it doesn't, then I think it would be a great solution! :)
Either way, here are a couple of related resources:
https://www.npmjs.org/package/webrtc.io <- latency optimization library intended for work with media streams
http://wac.ircam.fr/ an upcoming conference (Jan 2015) dedicated to the types of problems you are dealing with.
http://www.sitepoint.com/5-libraries-html5-audio-api/ A few web libraries for use with audio. #3 and #4 look like they are related to what you are trying to do
You can try using this (it is in development):
Node Web Audio API
https://github.com/sebpiq/node-web-audio-api
Installation:
    npm install web-audio-api
Demo:
    node test/manual-testing/AudioContext-sound-output.js
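A rough usage sketch, based on how the project's README describes it at the time of writing (the speaker module for output and the file name are my assumptions, and details may change while the library is in development):

    var AudioContext = require('web-audio-api').AudioContext;
    var Speaker = require('speaker'); // assumption: any writable PCM stream should work here
    var fs = require('fs');

    var context = new AudioContext();
    // Audio rendered by the node graph is written to this stream.
    context.outStream = new Speaker({
      channels: context.format.numberOfChannels,
      bitDepth: context.format.bitDepth,
      sampleRate: context.sampleRate
    });

    // Decode a file and play it through the graph, just like in the browser.
    fs.readFile('sound.wav', function (err, buf) {
      if (err) throw err;
      context.decodeAudioData(buf, function (audioBuffer) {
        var source = context.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(context.destination);
        source.start(0);
      });
    });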

Streaming data using the LwIP server

I have an embedded system which is running the LwIP server (v1.2), and I need to be able to stream a data array into the JavaScript on the client side. I'm looking at using Chrome and some HTML5 features, and some people have suggested using WebSockets. Does anyone know where I need to start to use these with the LwIP framework? Any help at all would be much appreciated!
WebSockets is a relatively simple protocol, so you could work from the protocol spec and write your own server. Since LwIP offers a BSD sockets API, you could also search for existing open-source C servers. (A quick search turns up this candidate, for instance. BTW, note that this code is licensed under the GPL. You should only use it if you understand the requirements that using GPL'd code puts on your project.)
Note that while Chrome support for WebSockets is good, support is patchier if you later decide to target other browsers (and particularly if you want to allow users with older browsers). See here for details. If support for a variety of browsers matters to you, you'll probably have to include code in the client and server to fall back to long polling when a WebSocket handshake fails.
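On the client side, once the LwIP end speaks the WebSocket protocol, receiving the array in Chrome is straightforward; the URL and the way the bytes are interpreted below are placeholders for whatever your firmware actually sends:

    // Browser-side receiver for data streamed from the embedded device.
    var ws = new WebSocket('ws://192.168.0.42/stream'); // placeholder address and endpoint
    ws.binaryType = 'arraybuffer';                      // deliver binary frames as ArrayBuffers

    ws.onmessage = function (e) {
      var samples = new Uint16Array(e.data);            // interpret per your firmware's packing
      console.log('received', samples.length, 'values');
    };

    ws.onerror = function () { console.log('websocket error'); };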

Implement WebSockets in Android using Native Sockets

There is JavaScript code residing in my Android app's WebKit container. This code makes use of WebSockets to communicate with the server. The same JS code works on other platforms such as iOS, but in Android 2.3 it doesn't. I read somewhere that the WebKit in Android does not support WebSockets, and that support will only come from Jelly Bean onwards.
In view of this, I need to provide a wrapper from the native layer (in Java) making use of plain sockets. From the little I know about sockets, it seems straightforward to support the usual APIs such as open(), send(), receive(), etc.
Is there anything else I need to know that the WebSocket protocol needs, which I will need to provide from the wrapper code? After all, the server talks to the client (my android app) as if it is a WebSocket, and not a native socket.
Some notes to consider: a) I cannot make use of any third-party library - it will have to be developed in-house. b) There will not be any binary data transferred; only text.
Thanks,
Rajath
WebSockets are not raw sockets; they require an initial handshake and then simple per-message framing. See the protocol spec for details. The sections on the handshake and data framing will be most relevant.
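For a sense of scale, the whole handshake is just an HTTP Upgrade exchange (example values taken from the spec); after the 101 response, every message travels in small frames, and frames sent from the client must be masked:

    GET /chat HTTP/1.1
    Host: server.example.com
    Upgrade: websocket
    Connection: Upgrade
    Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
    Sec-WebSocket-Version: 13

    HTTP/1.1 101 Switching Protocols
    Upgrade: websocket
    Connection: Upgrade
    Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=

Your Java wrapper plays the client role, so it has to generate a random base64 Sec-WebSocket-Key and verify that the server's Sec-WebSocket-Accept equals the base64-encoded SHA-1 of that key concatenated with the fixed GUID 258EAFA5-E914-47DA-95CA-C5AB0DC85B11.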
I know you said you can't use third-party libraries, but be aware that projects like Java-Websocket might be interesting to you. It's liberally licensed, so it is suitable for inclusion in any closed-source app. Or you might find it useful as a reference while debugging your own code.
