browser push camera stream to media server - javascript

Hi everyone, I need some help with a streaming problem; here is my to-do list.
I want to stream my camera and microphone from the browser to a media server, and a Python server
needs to pull this stream to do some ASR and TTS things. After that, it generates a reply stream and pushes it back to the media server, and the browser pulls this stream. My plan is like this:
browser streams to the RTMP server using the RTMP protocol
python server pulls this stream using RTMP
python server pushes its reply to the RTMP server
browser pulls it
But my question is: how can the browser stream RTMP to an RTMP server? Because as far as I know, I can only use WebRTC to do this in the browser.
So maybe the process should be like this?
browser streams to Janus (or another WebRTC server) using WebRTC
python pulls this stream using WebRTC
python server pushes its reply to Janus
browser pulls this stream using WebRTC
But I'm not sure whether step 2 or step 3 can be done, because I don't know how to use WebRTC in a Python environment, without a browser, and with my own stream (not a camera stream).
Or can Janus convert the WebRTC stream and push it to an RTMP server?
Any help will be appreciated, thanks.

I know that with mediasoup you can send the camera to the server with WebRTC. Then you can use the server to retransmit it to your algorithms with RTP, and finally send the response back to the client using WebRTC again.
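For reference, here is a minimal sketch of the browser (publish) side with mediasoup-client; the signalling object is hypothetical glue you would implement yourself, since mediasoup leaves signalling entirely to the application:

// Sketch: publish camera + mic to a mediasoup server.
// 'signalling' is a hypothetical helper that talks to your own server.
import { Device } from 'mediasoup-client';

async function publishCamera(signalling) {
  const device = new Device();
  await device.load({ routerRtpCapabilities: await signalling.getRouterRtpCapabilities() });

  // transport params: { id, iceParameters, iceCandidates, dtlsParameters }
  const transport = device.createSendTransport(await signalling.createSendTransport());

  transport.on('connect', ({ dtlsParameters }, callback, errback) => {
    signalling.connectTransport(dtlsParameters).then(callback).catch(errback);
  });
  transport.on('produce', ({ kind, rtpParameters }, callback, errback) => {
    signalling.produce(kind, rtpParameters).then((id) => callback({ id })).catch(errback);
  });

  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  for (const track of stream.getTracks()) {
    await transport.produce({ track }); // one producer per track
  }
}

On the server side, a mediasoup PlainTransport can then forward those producers as plain RTP to the Python process.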

Related

Realtime WebSocket stream to RTSP

I've been looking for days for a solution to convert a video stream (video/webcam) from a web browser into a backend RTSP stream.
All I could find was the reverse: from RTSP to WebSockets (to display in a web page).
I want the user to choose, from a web browser client, a local video or webcam, and then send it to a Node.js server.
Opening a webcam and sending the chunks via WebSockets seems easy, but how do I "convert" these chunks of video for an RTSP server, and then connect via VLC to see the stream?
Thank you in advance.
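A minimal sketch of that conversion step, assuming the browser sends MediaRecorder WebM chunks over a WebSocket and an RTSP server (e.g. mediamtx) is already running; the port, path and encoding settings are assumptions:

// Sketch: pipe browser MediaRecorder chunks into ffmpeg, which publishes RTSP.
const { WebSocketServer } = require('ws');
const { spawn } = require('child_process');

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (ws) => {
  // ffmpeg reads the WebM stream from stdin and re-publishes it over RTSP.
  const ffmpeg = spawn('ffmpeg', [
    '-i', 'pipe:0',                  // stdin: WebM chunks from the browser
    '-c:v', 'libx264', '-preset', 'veryfast',
    '-c:a', 'aac',
    '-f', 'rtsp', '-rtsp_transport', 'tcp',
    'rtsp://localhost:8554/webcam',  // open this URL in VLC
  ]);
  ws.on('message', (chunk) => ffmpeg.stdin.write(chunk));
  ws.on('close', () => ffmpeg.stdin.end());
});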

Can MQTT stream audio directly to a web client?

I was able to set up an Arduino to stream audio from a microphone to a Linux server that's hosting an MQTT server. I then have a Golang script that subscribes to the MQTT server, saves the payload to disk as a binary file, and converts the binary file to a .WAV file with FFmpeg.
Is it possible to have a web browser use only client-side code to subscribe to the same MQTT server, receive the audio payload from the Arduino, and stream the audio in near-real-time to the human listener's computer speakers? I see a Paho JavaScript Client library that can help me connect to MQTT, but it seems to receive payloads as strings, and it isn't evident to me how I'd stream audio content with that. Hence why I'm asking if it's even practical/feasible.
Or will I need to build another server-side script to stream MQTT data as audio data for a web client?
To make sure it works in all environments, use MQTT over WebSockets to connect to the server.
Here is a discussion of this: Can a web browser use MQTT?
Look closer at the Paho doc: there is a way to get the message payload as binary data, using the message.payloadBytes field.
payloadBytes | ArrayBuffer | read only The payload as an ArrayBuffer
An example is described here:
https://www.hardill.me.uk/wordpress/2014/08/29/unpacking-binary-data-from-mqtt-in-javascript/
But basically you end up with an ArrayBuffer holding the binary data, which you can then convert to a Typed Array and read back values at whatever offset you need.
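A minimal sketch of that receive path with the Paho JavaScript client; the broker host, port and topic are placeholders:

// Sketch: subscribe over MQTT-over-WebSockets and read the payload as binary.
const client = new Paho.MQTT.Client('broker.example.com', 9001, 'web-' + Date.now());

client.onMessageArrived = (message) => {
  const bytes = new Uint8Array(message.payloadBytes); // ArrayBuffer -> Typed Array
  // ...queue these PCM samples into the Web Audio API for playback...
};

client.connect({
  onSuccess: () => client.subscribe('arduino/audio'),
});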

WebRTC live video stream node.js

I am searching for a way to stream video with WebRTC. I am very new to WebRTC. I have seen a lot of applications on the web that have p2p video chat. The tutorials I follow explain how WebRTC works for the client, but they do not show what to use as a backend script. And that's exactly what I'm looking for: a backend script (preferably node.js) that ensures that I can stream live video (getUserMedia) to the client.
It's really simple to get started; check out a simple demo here.
1. You need a WebRTC-supported browser. Chrome and Firefox are the best right now.
2. A signalling server to exchange media options. Socket.IO with Node.js works well (see the sketch after this list).
3. TURN and STUN servers to deal with NAT and symmetric NAT (only if you go public).
4. An MCU, if you want to limit bandwidth usage. It gives you the flexibility of a star network rather than the mesh network of normal p2p.
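A minimal sketch of such a signalling server with Socket.IO; the media itself never touches this server, and the room handling and event names are assumptions:

// Sketch: Socket.IO signalling relay for WebRTC.
// It only forwards SDP offers/answers and ICE candidates between browsers.
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.on('join', (room) => socket.join(room));

  // Relay any signalling payload (offer, answer or ICE candidate)
  // to the other peers in the same room.
  socket.on('signal', ({ room, data }) => {
    socket.to(room).emit('signal', data);
  });
});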

Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia

I am capturing audio data using getUserMedia() and I want to send it to my server so I can save it as a Blob in a MySQL field.
This is all I am trying to do. I have made several attempts to do this using WebRTC, but I don't even know at this point if this is right or even the best way to do this.
Can anybody help me?
Here is the code I am using to capture audio from the microphone:
var audioContext = new (window.AudioContext || window.webkitAudioContext)();

navigator.getUserMedia({
    video: false,
    audio: true,
}, function (mediaStream) {
    // output mediaStream to speakers:
    var mediaStreamSource = audioContext.createMediaStreamSource(mediaStream);
    mediaStreamSource.connect(audioContext.destination);
    // send mediaStream to server:
    // WebRTC code? not sure about this...
    var RTCconfig = {};
    var conn = new RTCPeerConnection(RTCconfig);
    // ???
}, function (error) {
    console.log('getUserMedia() fail.');
    console.log(error);
});
How can I send this mediaStream up to the server?
After Googling around I've been looking into WebRTC, but this seems to be just for peer-to-peer communication - actually, now that I'm looking into this more, I think this is the way to go. It seems to be the way to communicate from the client's browser up to the host webserver, but nothing I try even comes close to working.
I've been going through the W3C documentation (which I am finding way too abstract), and I've been going through this article on HTML5 Rocks (which is bringing up more questions than answers). Apparently I need a signalling method; can anyone advise which signalling method is best for sending mediaStreams: XHR, XMPP, SIP, Socket.io or something else?
What will I need on the server to support receiving WebRTC? My web server is running a basic LAMP stack.
Also, is it best to wait until the mediaStream is finished recording before I send it up to the server, or is it better to send the mediaStream as it's being recorded? I want to know if I am going about this the right way. I have written file uploaders in JavaScript and HTML5, but uploading one of these mediaStreams seems hellishly more complicated, and I'm not sure if I am approaching it right.
Any help on this would be greatly appreciated.
You cannot upload the live stream itself while it is running. This is because it is a LIVE stream.
So, this leaves you with a handful of options.
1. Record the audio stream using one of the many recorders out there (RecordRTC works fairly well), wait until the stream is complete, and then upload the file.
2. Send smaller chunks of recorded audio with a timer and merge them again server side. There is an example of this.
3. Send the audio packets as they occur over WebSockets to your server so that you can manipulate and merge them there. My version of RecordRTC does this.
4. Make an actual peer connection with your server so it can grab the raw RTP stream, and record the stream using some lower-level code. This can easily be done with Janus-Gateway.
As for waiting to send the stream vs. sending it in chunks, it all depends on how long you are recording. If it is for a longer period of time, I would say sending the recording in chunks or actively sending audio packets over WebSockets is the better solution, as uploading and storing larger audio files from the client side can be arduous for the client.
Firefox actually has its own solution for recording, but it is not supported in Chrome, so it may not work in your situation.
As an aside, the signalling method mentioned is for session build/destroy and really has nothing to do with the media itself. You would only really worry about this if you were using option 4 shown above.
A good API for you would be the MediaRecorder API, but it is less widely supported than the Web Audio API, so you can do it using a ScriptProcessorNode, or use Recorder.js (or build your own ScriptProcessorNode based on it).
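For the chunked approaches (options 2 and 3 above), here is a minimal sketch with the MediaRecorder API sending timed chunks over a WebSocket; the server URL and MIME type are placeholders:

// Sketch: record the microphone in 1-second chunks and stream each chunk
// to the server over a WebSocket as soon as it is available.
var ws = new WebSocket('wss://example.com/audio-upload'); // placeholder URL

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var recorder = new MediaRecorder(stream, { mimeType: 'audio/webm' });
  recorder.ondataavailable = function (event) {
    if (event.data.size > 0) ws.send(event.data); // one Blob per chunk
  };
  recorder.start(1000); // emit a dataavailable event every 1000 ms
});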
WebRTC is designed as peer-to-peer, but one peer can be a browser and the other a server. So it's definitely possible to push the stream over WebRTC to a server, then record the stream as a file.
The stream flow is:
Chrome ----WebRTC---> Server ---record---> FLV/MP4
There are lots of servers, like SRS, Janus or mediasoup, that accept a WebRTC stream. Please note that you might need to convert the WebRTC stream (H.264+Opus) to MP4 (H.264+AAC), or just choose SRS, which supports this feature.
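A minimal sketch of the browser (publish) side, assuming the media server exposes a WHIP-style HTTP endpoint that takes an SDP offer and returns an SDP answer; the URL shown is an assumption:

// Sketch: publish camera + mic to a media server over WebRTC, WHIP-style.
async function publish() {
  const pc = new RTCPeerConnection();
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // POST the SDP offer, get the SDP answer back (placeholder endpoint).
  const res = await fetch('http://localhost:1985/rtc/v1/whip/?app=live&stream=livestream', {
    method: 'POST',
    headers: { 'Content-Type': 'application/sdp' },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: 'answer', sdp: await res.text() });
}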
Yes, it is possible to send a MediaStream to your server, but the only way you can achieve it is by going through a WebSocket, which enables the client browser to send data to your server over a real-time connection. So I recommend you use WebSockets.

Can I use WebRTC to receive a standard RTP video stream?

I have two computers on the same network. One of them transmits a movie (H.264) over the RTP protocol. Is it possible to create a simple JavaScript app to receive this stream on the second computer and display it in a video tag?
So far my impression of WebRTC is that it's designed to be used between browsers (both using the WebRTC API), but I want to use it only on the receiving side.
Maybe this might help: Janus-Gateway.
It lists RTP in its dependencies.
It is possible to stream video using WebRTC; you can send only the data parts with the RTP protocol, and on the other side you should use the Media Source API to play the video.
Here is an article with a demo explaining the Media Source API.
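A minimal sketch of the Media Source API playback side, assuming fragmented MP4 segments arrive over a WebSocket; the codec string and URL are placeholders:

// Sketch: feed incoming fMP4 segments into a <video> tag via Media Source Extensions.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"'); // placeholder codec
  const ws = new WebSocket('ws://localhost:8080/segments'); // placeholder URL
  ws.binaryType = 'arraybuffer';
  const queue = [];

  ws.onmessage = (event) => {
    // A SourceBuffer accepts one append at a time; queue while it is busy.
    if (sb.updating || queue.length) queue.push(event.data);
    else sb.appendBuffer(event.data);
  };
  sb.addEventListener('updateend', () => {
    if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
  });
});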
Sure, you can use mediasoup for this; it provides APIs for receiving and sending RTP.
