WebRTC video streaming doesn't work through http - javascript

I'm trying to create a game with WebRTC (Peer.js), and I can't make video calls over "http"... Maybe it only works over https?
P.S. All the working examples for media calls that I have seen use https!
1) http://cdn.peerjs.com/demo/videochat/ (doesn't work)
2) https://simplewebrtc.com/demo.html (works)

It's not WebRTC itself: the getUserMedia API is only available on secure origins (https://www.chromium.org/Home/chromium-security/deprecating-powerful-features-on-insecure-origins).
So you can use localhost for testing on your machine, but for deployment you will need https.
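A page can check this up front before even attempting a call. The helper below is a sketch that mirrors the secure-origin rule described in the linked document (https, localhost, or file); `isSecureOrigin` is an illustrative name, not a browser API:

```javascript
// Sketch: approximate Chrome's "secure origin" rule. A real page would
// pass in location.protocol and location.hostname.
function isSecureOrigin(protocol, hostname) {
  return protocol === 'https:' ||
         protocol === 'wss:' ||
         protocol === 'file:' ||
         hostname === 'localhost' ||
         hostname === '127.0.0.1';
}

// In the browser, gate the call before asking for the camera:
// if (!isSecureOrigin(location.protocol, location.hostname)) {
//   console.warn('getUserMedia will be rejected on this origin');
// }
```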

Related

Streaming video in real-time using a gstreamer-rtsp server to a web page

I am trying to build an application that can consume a video source (either a webcam or an offline video) and stream it in real time to a web page. I have been successful in creating an RTSP stream using GStreamer, but I am unable to receive this stream on the web page without an intermediate step, i.e. converting the stream to a playlist.m3u8 using hlssink or ffmpeg.
I want the stream to be consumed directly by the web page. Also, is using the vlc-plugin my only option?
Any help would be much appreciated.
RTSP is not going to work in the browser because most browsers do not support direct RTP streaming. If for some reason HTTP adaptive streaming protocols like HLS do not satisfy your requirements (e.g. the latency is not low enough), you can try WebRTC, which is built, among other things, on top of secure RTP (SRTP). Its setup is probably more involved than an RTSP server's, but it is nowadays supported by all major browsers. You can check out the webrtcbin element for a GStreamer implementation.
I don't think it's possible, since RTSP is not supported by any browser directly, and plugin support has been removed from most modern browsers.
So the only solution is to convert the RTSP stream to some format supported by browsers.
Thanks for the comments! I was able to make this work using GStreamer's WebRTC example from https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc.
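For context, once signaling against webrtcbin has completed, the browser side is plain WebRTC: remote media arrives through the peer connection's ontrack event. A minimal sketch of the receiving end (the signaling itself is omitted, and `attachRemoteStream` is an illustrative name):

```javascript
// Wire an (already-signaled) RTCPeerConnection so that incoming remote
// tracks are rendered in a <video> element.
function attachRemoteStream(pc, videoEl) {
  pc.ontrack = (event) => {
    // event.streams[0] is the remote MediaStream carrying the RTP media.
    if (videoEl.srcObject !== event.streams[0]) {
      videoEl.srcObject = event.streams[0];
    }
  };
}

// Usage in a page (assuming pc comes from your signaling code):
// attachRemoteStream(pc, document.querySelector('video'));
```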

Problem streaming audio on mobile with Android 9?

I was making an app in react-native where I had to stream audio from an http (non-secure) URL. The app works well in debug mode, but when I tried the release version, the audio didn't play on phones running Android 9 and above. I am using this library.
I can't create an SSL certificate for the link. I have also tried some secure (https) links, and the library works fine with them, but due to certain restrictions I have to use the http link only.
Note: the audio streams fine on Android 8 and below. I have also tried using a WebView to play the audio; that didn't work.
Update:
1. Get yourself an https link for streaming your audio.
2. If you can't do that, host a proxy streaming server that relays your non-secure http link.
Keep coding 👍

Real-time streaming from Android camera to browser

We are working on an IP-camera Android app that should stream the video captured in real time by the Android camera to a web page served by the same app and accessed through WiFi only.
The app currently uses a pseudo-streaming method (an image sent over HTTP with no-store), but it is not robust enough, so we need to change it for a better streaming method. We also need to support multicast (or at least an optimized "multi-unicast"), and if possible use a UDP protocol (or at least a low-latency TCP protocol).
We cannot use any intermediary server (so no Wowza or the like, unless it is also served by the app) or any browser plugin (so no VLC or the like, unless it is served by the app too). The main browser it is used on is Chromium.
We searched for and tried a lot of methods, but none worked for us:
WebRTC sounds cool, but it requires an intermediary signaling server, it doesn't support multicast, and it is kind of heavy for what we want.
RTSP with libstreaming sounds cool too, but no browser seems to implement it, and we couldn't find a JavaScript library to do it.
RTMP works on most browsers, but we couldn't find a working Android library.
Which streaming method would be best for our needs, and do you know JavaScript and Android libraries implementing them?
There is no way to stream multicast to a browser.

How to play SoundCloud RTMP stream?

I'm using the SoundCloud public API for playing audio in a browser from the SC servers with the JavaScript SDK 3.0.0. After initialization, I managed to get a JSON with a specific track's stream URLs with the SC.Stream method.
{
  "http_mp3_128_url": "https://cf-media.sndcdn.com/a6QC6Zg3YpKz.128.mp3...",
  "hls_mp3_128_url": "htt...//ec-hls-media.soundcloud.com/playlist/a6QC6Zg3YpKz.128.mp3/...",
  "rtmp_mp3_128_url": "rtmp://ec-rtmp-media.soundcloud.com/mp3:a6QC6Zg3YpKz.128?...",
  "preview_mp3_128_url": "htt....../ec-preview-media.sndcdn.com/preview/0/90/a6QC6Zg3YpKz.128.mp3?..."
}
In it there is an HTTP, an HLS, and an RTMP URL. I can handle the HTTP one, but I can't get the RTMP one working. Does anyone know how it is decided which stream will be played? And how can I manipulate this? Or how can I access the RTMP stream?
A few weeks ago I checked with Wireshark that SoundCloud delivered via RTMP, but now I can't seem to capture any RTMP streams, and I don't know how to search for one.
Usually an RTMP stream is served from Flash Media Server, Wowza Media Server, or a Red5 server.
You can play that type of stream using a Flash object in your web page.
Or, for an application, you can play it with ffplay and convert it to another type of stream with ffmpeg.
I've been working on the same thing. It plays using the HTTP protocol in dev mode and then reverts to attempting the RTMP protocol in normal browsing mode (at least in Chrome anyway). Here's how I solved the issue.
When you use the SC.stream request, it returns the object to play. You can edit this object before it gets sent to the player.
For example:
SC.stream('/tracks/' + playr.currentTrack.id).then(function (x) {
  x.options.protocols = ["http"];
  x.play();
});
Setting the protocols parameter as above forces it to use the correct protocol. If you console-log the object first by trying to play the track in non-dev mode, you'll see it also contains the ["rtmp"] protocol, and then fails to play in Chrome.

WebRTC app not working on localhost?

I am using a WebRTC demo application for screen sharing. The demo works perfectly, but when I try to run the same code on localhost or my own remote server, it doesn't work.
Any ideas on how I can fix this issue?
Screen sharing in Chrome only works over an SSL connection. You can use a self-signed cert and simply accept it in your browser (this applies to Chrome < M36).
Also, for Chrome > M36 you must now use the chrome.desktopCapture API; the old way of modifying media constraints and enabling screen sharing via Chrome's internal flags will not work in newer versions of Chrome.
The API is fairly simple and MUCH more robust than the previous media-constraints option.
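A rough sketch of that flow in extension code of that era (error handling is minimal, and `requestScreenStream` is an illustrative name, not part of the API): chrome.desktopCapture.chooseDesktopMedia shows the source picker and hands back a stream id, which is then passed to getUserMedia through the chromeMediaSource constraints.

```javascript
// Ask Chrome for a desktop-capture source, then open it with getUserMedia.
function requestScreenStream(onStream, onError) {
  chrome.desktopCapture.chooseDesktopMedia(['screen', 'window'], (streamId) => {
    if (!streamId) return onError(new Error('user cancelled the picker'));
    navigator.webkitGetUserMedia({
      audio: false,
      video: {
        mandatory: {
          chromeMediaSource: 'desktop',
          chromeMediaSourceId: streamId,
        },
      },
    }, onStream, onError);
  });
}
```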
