I'm using the SoundCloud public API to play audio in a browser from the SC servers with JavaScript SDK 3.0.0. After initialization, I managed to get a JSON object with a specific track's stream URLs via the SC.stream method:
{
  "http_mp3_128_url": "https://cf-media.sndcdn.com/a6QC6Zg3YpKz.128.mp3...",
  "hls_mp3_128_url": "htt...//ec-hls-media.soundcloud.com/playlist/a6QC6Zg3YpKz.128.mp3/...",
  "rtmp_mp3_128_url": "rtmp://ec-rtmp-media.soundcloud.com/mp3:a6QC6Zg3YpKz.128?...",
  "preview_mp3_128_url": "htt....../ec-preview-media.sndcdn.com/preview/0/90/a6QC6Zg3YpKz.128.mp3?..."
}
In it, there is an HTTP, an HLS, and an RTMP URL. I can handle the HTTP one, but I can't get the RTMP one working. Does anyone know how it is decided which stream will be played? And how can I manipulate this? Or how can I access the RTMP stream?
A few weeks ago I confirmed with Wireshark that SoundCloud delivered via RTMP, but now I can't seem to capture any RTMP streams, and I don't know how to search for one.
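For reference, this is roughly how the stream info above is requested (a minimal sketch; the client ID and track ID are placeholders):

SC.initialize({ client_id: 'YOUR_CLIENT_ID' }); // placeholder client ID

// SC.stream resolves with a player object whose options carry the stream
// URLs and protocol list shown above.
SC.stream('/tracks/123456').then(function (player) {
  console.log(player.options); // resolved stream info
  player.play();
});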
RTMP streams are usually served by Flash Media Server, Wowza Media Server, or Red5.
You can play that type of stream using a Flash player object embedded in your web page.
Or, for a desktop application, you can play the stream with ffplay and convert it to another stream type with ffmpeg.
I've been working on the same thing. It plays using the HTTP protocol in dev mode and then reverts to attempting the RTMP protocol in normal browsing mode (at least in Chrome, anyway). Here's how I solved the issue:
When you use the SC.stream request, it returns the object to play. You can edit this object before it gets sent to the player.
For example:
SC.stream('/tracks/' + playr.currentTrack.id).then(function (x) {
  x.options.protocols = ['http'];
  x.play();
});
Setting the protocols option as above forces the player to use HTTP. If you log the object first, by trying to play the track in non-dev mode, you'll see it also contains the "rtmp" protocol, and then fails to play in Chrome.
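For instance, reusing the snippet above, a quick way to see what the SDK resolved:

// Log the resolved protocol list before forcing HTTP; in normal browsing
// mode it typically lists "rtmp" ahead of "http".
SC.stream('/tracks/' + playr.currentTrack.id).then(function (x) {
  console.log(x.options.protocols); // e.g. ["rtmp", "http"]
});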
Related
I am trying to build an application that can consume a video source (from a webcam or an offline video) and stream it in real time to a web page. I have been successful in creating an RTSP stream using GStreamer, but I am unable to receive this stream on the web page without an intermediate step, i.e. converting the stream to a playlist.m3u8 using hlssink or ffmpeg.
I want the stream to be consumed directly by the web page. Also, is using the vlc-plugin my only option?
Any help would be much appreciated.
RTSP is not going to work in the browser because most browsers do not support direct RTP streaming. If for some reason HTTP adaptive streaming protocols like HLS do not satisfy your requirements (e.g. the latency is not low enough), you can try WebRTC, which is among other things built on top of secure RTP (SRTP). Its setup is probably more involved than an RTSP server's, but it is nowadays supported by all major browsers. You can check out the webrtcbin element for a GStreamer implementation.
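On the browser side, a minimal receiving sketch might look like this (the signaling endpoint and message format are assumptions; a GStreamer webrtcbin pipeline would sit at the other end):

// Receive a remote video track and show it in a <video> element.
var pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });
pc.ontrack = function (event) {
  document.querySelector('video').srcObject = event.streams[0];
};

// Signaling (a WebSocket here) exchanges SDP and ICE candidates with the
// webrtcbin peer; the URL and JSON shape are illustrative.
var signaling = new WebSocket('wss://example.com/signaling');
signaling.onmessage = async function (msg) {
  var data = JSON.parse(msg.data);
  if (data.sdp) {
    await pc.setRemoteDescription(data);
    if (data.type === 'offer') {
      await pc.setLocalDescription(await pc.createAnswer());
      signaling.send(JSON.stringify(pc.localDescription));
    }
  } else if (data.candidate) {
    await pc.addIceCandidate(data.candidate);
  }
};
pc.onicecandidate = function (e) {
  if (e.candidate) signaling.send(JSON.stringify({ candidate: e.candidate }));
};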
I don't think it's possible, since RTSP is not supported by any browser directly, and plugin support has been removed from most modern browsers.
So the only solution is to convert from RTSP to a format supported by browsers.
Thanks for the comments! I was able to make this work using GStreamer's WebRTC example from: https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc.
I have implemented video streaming from a Java server to a website using WebSockets and Media Source Extensions (JavaScript). This works fine for nearly every browser on several operating systems except iOS. I am aware of the fact that MSE is not supported on iOS (yet).
Is there any way to easily enable video streaming for iOS clients using the same (already existing) technology via WebSockets?
I think of something similar to Media Source Extensions, so that I just have to reimplement the client side.
My workflow is (a rough client-side sketch follows the list):
Create an HTML5 video element and a MediaSource
Create a new WebSocket and request video data from the server
Transcode video using FFmpeg and stream the result to stdout
Send the binary video data in chunks to the client
Add the video binary data to the source buffer of the HTML5 <video> element which is linked to a MediaSource with a SourceBuffer.
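Roughly, the client side of steps 1, 2, and 5 looks like this (the WebSocket URL, message framing, and codec string are assumptions for illustration):

var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
  // The codec string must match what FFmpeg produces (fragmented MP4 here).
  var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  var queue = [];

  var ws = new WebSocket('wss://example.com/video'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';

  ws.onmessage = function (event) {
    // Chunks can arrive while the buffer is busy, so queue them.
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };

  sourceBuffer.addEventListener('updateend', function () {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
});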
Hoping for any advice.
If needed, you can use the <video> tag. Look under "Provide Alternate Sources"; you can use an HTTP Live Stream (HLS), which iOS plays natively.
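In other words, on iOS you can hand Safari an HLS playlist directly and let the native <video> pipeline handle it, with no MSE involved (a sketch; the playlist URL is illustrative):

// Safari on iOS can play an HLS playlist natively in a <video> element.
var video = document.createElement('video');
video.src = 'https://example.com/live/stream.m3u8'; // hypothetical HLS playlist
video.controls = true;
document.body.appendChild(video);
video.play();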
I need to send a live stream from PC to PC, both of them using just the web browser (IE, Firefox, or Chrome). Is there a library (JavaScript) that could help me push the stream from the sender to a media server (ffmpeg-ffserver, Wowza, etc.)?
I guess you want to stream a video signal from the webcam. Then the way to go is to use WebRTC, but it is still very new (Wowza just started to support it) and it is only supported in some modern browsers, so you will encounter many issues.
Most of the existing solutions still use Flash to capture from the webcam and encode to RTMP.
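If you do go the WebRTC route, capture starts with getUserMedia; a minimal sketch (browser support and the exact API surface vary by version):

// Capture the webcam and preview it locally; sending it onward requires
// an RTCPeerConnection (or, for Flash-era servers, an RTMP encoder).
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then(function (stream) {
    var video = document.querySelector('video');
    video.srcObject = stream;
    video.play();
  })
  .catch(function (err) {
    console.error('getUserMedia failed:', err);
  });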
I am new to WebRTC. As I understand it, WebRTC is used for real-time communication. In the spec it seems that a stream can be created only from device output (using getUserMedia for microphone or camera, or the Chrome tab capture API). But in my application I am getting real-time Uint8 video data (e.g. H.264). Can I convert this Uint8 data to a MediaStream?
I assume you don't use getUserMedia, but some arbitrary source.
Getting this video "buffer" displayed is tricky and not possible in every browser (only Chrome and soon Firefox). You don't need WebRTC for that, but something called the Media Source API, AKA MSE (the E is for Extensions).
The API is rather picky about its accepted byte streams and will not take just any "video data". For H.264, it will only accept fragmented MP4.
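You can probe up front whether a given container/codec combination will be accepted (the codec string below is one common H.264 baseline profile):

// isTypeSupported reports whether addSourceBuffer would accept this type.
if (window.MediaSource &&
    MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"')) {
  // Fragmented MP4 with H.264 baseline can be appended to a SourceBuffer.
}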
I'm trying to stream a (WebM or MP4) video from Node.js to HTML5 using websockets (the websocket library is Socket.IO on both server and client). The browser in use is the latest version of Chrome (version 26.0.1410.64 m).
I saw here that it's possible to push a video stream in the video tag from a file using the MediaSource object.
My idea is to read chunks of data from the websocket instead of a file.
Can someone please post an example using websockets to accomplish that, or explain how to do it?
Thanks in advance.
In addition to text (string) messages, the WebSocket API allows you to send binary data, which is especially useful for implementing binary protocols. Such binary protocols can be standard Internet protocols typically layered on top of TCP, where the payload can be either a Blob or an ArrayBuffer.
// Send a Blob (the Blob constructor takes an array of parts)
var blob = new Blob(["blob contents"]);
ws.send(blob);

// Send an ArrayBuffer
var a = new Uint8Array([8, 6, 7, 5, 3, 0, 9]);
ws.send(a.buffer);
Blob objects are particularly useful when combined with the JavaScript File API for sending and receiving files, mostly multimedia files such as images, video, and audio.
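On the Node.js side of the question, a minimal sketch that pushes a video file to the browser in chunks might look like this (using the 'ws' package rather than Socket.IO; the file name and port are illustrative):

var WebSocketServer = require('ws').Server;
var fs = require('fs');

var wss = new WebSocketServer({ port: 8080 });

wss.on('connection', function (socket) {
  // Stream the file in chunks; each chunk arrives in the browser as binary
  // data that can be appended to an MSE SourceBuffer.
  var stream = fs.createReadStream('video.webm'); // hypothetical file
  stream.on('data', function (chunk) {
    socket.send(chunk);
  });
  stream.on('end', function () {
    socket.close();
  });
});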
Also, I suggest looking at WebRTC, a technology associated with WebSockets. Web Real-Time Communication (WebRTC) is another effort to enhance the communication capabilities of modern web browsers. WebRTC is a peer-to-peer technology for the Web. The first applications for WebRTC are real-time voice and video chat. WebRTC is already a compelling new technology for media applications, and there are many sample applications available online that let you test video and audio over the Web.