I have implemented video streaming from a Java server to a website using WebSockets and Media Source Extensions (JavaScript). This works fine for nearly every browser on several operating systems, except iOS. I am aware that MSE is not (yet) supported on iOS.
Is there any way to easily enable video streaming for iOS clients using the same (already existing) technology via web sockets?
I think of something similar to Media Source Extensions, so that I just have to reimplement the client side.
My workflow is:
Create an HTML5 video element and a MediaSource
Create a new web socket and request video data from the server
Transcode video using FFmpeg and stream the result to stdout
Send the binary video data in chunks to the client
Append the binary video data to the SourceBuffer of the MediaSource that is attached to the HTML5 <video> element (a client-side sketch of this follows the list).
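For illustration, a rough sketch of what steps 1, 2 and 5 might look like on the client; the URL and codec string here are placeholders and would have to match the actual FFmpeg output:

// Client-side sketch: MediaSource fed from a WebSocket (placeholder URL and codec string)
var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
  var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  var queue = [];

  var ws = new WebSocket('ws://example.com/video');
  ws.binaryType = 'arraybuffer';

  ws.onmessage = function (event) {
    // A SourceBuffer accepts only one append at a time, so queue chunks while it is busy
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };

  sourceBuffer.addEventListener('updateend', function () {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
});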
Hoping for any advice.
If needed, you can use the <video> tag directly. Look under "Provide Alternate Sources"; you can use an HTTP Live Stream (HLS).
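A minimal sketch, assuming your server also exposes an HLS playlist (the URL below is a placeholder); Safari on iOS plays HLS natively when it is assigned to the video element:

// Sketch: serve iOS an HLS playlist instead of using MSE (the URL is a placeholder)
var video = document.querySelector('video');
if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari, including iOS Safari, plays HLS natively
  video.src = 'https://example.com/stream/playlist.m3u8';
  video.play();
}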
I am trying to build an application that can consume a video source (could be from a webcam or an offline video) and stream it in real time to a web page. I have been successful in creating an RTSP stream using GStreamer, but I am unable to receive this stream on the web page without an intermediate step, i.e. converting the stream to a playlist.m3u8 using hlssink or ffmpeg.
I want the stream to be consumed directly by the web page. Also, is using the VLC plugin my only option?
Any help would be much appreciated.
RTSP is not going to work in the browser because most browsers do not support direct RTP streaming. If for some reason HTTP adaptive streaming protocols like HLS do not satisfy your requirements (e.g. the latency is not low enough), you can try WebRTC, which is, among other things, built on top of secure RTP (SRTP). Its setup is probably more involved than an RTSP server's, but it is nowadays supported by all major browsers. You can check out the webrtcbin element for a GStreamer implementation.
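On the web-page side, consuming such a WebRTC stream only takes a few lines; a rough sketch (the offer/answer and ICE exchange with webrtcbin is application-specific and omitted here):

// Browser-side sketch: attach an incoming WebRTC stream to a <video> element
var pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

pc.ontrack = function (event) {
  document.querySelector('video').srcObject = event.streams[0];
};

// SDP offers/answers and ICE candidates still have to be exchanged with
// webrtcbin over your own signaling channel (e.g. a WebSocket).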
I don't think it's possible, since RTSP is not supported directly by any browser, and plugin support has been removed from most modern browsers.
So the only solution is to convert from RTSP to some browser-supported format.
Thanks for the comments! I was able to make this work using GStreamer's WebRTC example from: https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc.
I'm trying to develop a web application with WebRTC. I'm getting video from my webcam through WebRTC, and I want to live stream to Facebook and YouTube from my browser. I have searched for Python and Node.js libraries but haven't found any library for that. I want to build an application like streamyard.com.
I have also looked at ffmpeg.
You can do this using Pion WebRTC and ffmpeg!
I have created a demo here. If you have ffmpeg and the Go compiler installed, this should just work!
This takes audio/video from the browser and then constructs a WebM in memory. It then passes this WebM to ffmpeg via a stdin pipe, where it is transcoded and sent to Twitch!
There are a lot of optimizations we could make here (like taking H264 from the browser directly), but H264 isn't supported everywhere, so this just makes the sample easier to reason about.
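The linked demo is written in Go, but the ffmpeg part of the pattern is language-agnostic. A rough Node.js sketch of the same idea, assuming ffmpeg is on the PATH; the input file, ffmpeg flags and the Twitch stream key are placeholders:

// Sketch: pipe a WebM stream into ffmpeg's stdin and push the transcoded output to an RTMP ingest
const { spawn } = require('child_process');
const fs = require('fs');

const ffmpeg = spawn('ffmpeg', [
  '-i', 'pipe:0',            // read the WebM from stdin
  '-c:v', 'libx264',         // transcode video to H.264
  '-preset', 'veryfast',
  '-c:a', 'aac',             // transcode audio to AAC
  '-f', 'flv',               // RTMP ingests expect FLV
  'rtmp://live.twitch.tv/app/YOUR_STREAM_KEY'
]);

// Whatever produces the WebM (a file here, for illustration) is piped into ffmpeg
fs.createReadStream('input.webm').pipe(ffmpeg.stdin);
ffmpeg.stderr.pipe(process.stderr);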
I need to send a live stream from PC to PC, both of them using just the web browser (IE, Firefox or Chrome). Is there a (JavaScript) library that could help me push the stream from the sender to a media server (ffmpeg/ffserver, Wowza, etc.)?
I guess you want to stream a video signal from the webcam. Then the way to go is to use WebRTC, but it is still very new (Wowza server just started to support it) and it is only supported in some modern browsers. So you will encounter many issues.
Most of the existing solutions still use Flash to capture from the webcam and encode to RTMP.
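For the capture part, the WebRTC getUserMedia API is what replaces the Flash approach; a minimal sketch of grabbing the webcam in the browser (pushing it on to a media server additionally needs an RTCPeerConnection or a chunked upload, which is not shown):

// Sketch: capture the webcam with getUserMedia and preview it in a <video> element
navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then(function (stream) {
    document.querySelector('video').srcObject = stream;
    // From here the stream can be attached to an RTCPeerConnection,
    // or recorded with MediaRecorder and uploaded in chunks
  })
  .catch(function (err) {
    console.error('Could not access the webcam:', err);
  });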
I am trying to set up a live internet radio station using an Icecast server and want my stream to work in all modern browsers. My client only produces an .ogg stream, and not all browsers can play .ogg. For example, the .ogg stream I have set up works in Chrome but doesn't work in IE. How can I make it work in all browsers?
Is there a way I can convert the .ogg stream to .mp3 or any other format on the fly?
Or can I embed an audio player in the browser which can play the .ogg stream?
Any other advice would be helpful.
Regards,
Hitesh Bhutani
You have several options:
Change the encoding format from OGG to MP3 in your Virtual DJ software. Keep in mind that Firefox will not be able to play mp3 streams on some platforms using the HTML5 audio tag due to licensing limitations.
Install some kind of transcoding software on your server (where you have Icecast installed and running), for example Liquidsoap (https://www.liquidsoap.info/). Liquidsoap can (among other things) take your stream as an input and transcode it to several formats, for example mp3, aac and ogg. Your Icecast server will then have several mount points available, for example http://yourserver.com:8000/stream.mp3, http://yourserver.com:8000/stream.ogg and http://yourserver.com:8000/stream.aac, and you can create a small javascript that will detect what the browser supports and choose a suitable stream (see the sketch after this list).
Use an HTML5 media player like jPlayer (http://jplayer.org/) or SoundManager 2 (http://www.schillmania.com/projects/soundmanager2/). These players can automatically detect the browser version and select a suitable stream type; if they can't play the stream using the HTML5 <audio> tag, they fall back to an internal Flash-based player.
The most advanced way is to combine methods (2) and (3); that will give you the broadest browser support.
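The "small javascript" mentioned in option (2) can be as simple as probing canPlayType and picking the matching mount point; a sketch, assuming the three example mounts above exist:

// Sketch: pick an Icecast mount point based on what the browser's <audio> element can decode
var audio = document.createElement('audio');
var source;
if (audio.canPlayType('audio/mpeg')) {
  source = 'http://yourserver.com:8000/stream.mp3';
} else if (audio.canPlayType('audio/ogg; codecs="vorbis"')) {
  source = 'http://yourserver.com:8000/stream.ogg';
} else {
  source = 'http://yourserver.com:8000/stream.aac';
}
audio.src = source;
audio.play();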
See also: Supported audio coding formats.
I'm trying to stream a (WebM or MP4) video from Node.js to HTML5 using websockets (the websocket library is Socket.IO on both server and client). The browser in use is the latest version of Chrome (version 26.0.1410.64 m).
I saw here that it's possible to push a video stream in the video tag from a file using the MediaSource object.
My idea is to read chunks of data from the websocket instead of a file.
Can someone please post an example of using websockets to accomplish that, or explain to me how to do it?
Thanks in advance.
In addition to the text (string) messages, the WebSocket API allows you to send binary data, which is especially useful to implement binary protocols. Such binary protocols can be standard Internet protocols typically layered on top of TCP, where the payload can be either a Blob or an ArrayBuffer.
// Send a Blob
var blob = new Blob(["blob contents"]);
ws.send(blob);
// Send an ArrayBuffer
var a = new Uint8Array([8,6,7,5,3,0,9]);
ws.send(a.buffer);
Blob objects are particularly useful when combined with the JavaScript File API
for sending and receiving files, mostly multimedia files, images, video, and audio.
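Since the question uses Socket.IO, the server side could read the WebM in chunks and emit each chunk as a binary message, which the browser then appends to a SourceBuffer as in the MediaSource example linked in the question. A rough sketch; the port, file name and event names are placeholders:

// Sketch: stream a WebM file to the client in chunks over Socket.IO (port, file and event names are placeholders)
var io = require('socket.io')(3000);
var fs = require('fs');

io.on('connection', function (socket) {
  var stream = fs.createReadStream('movie.webm', { highWaterMark: 64 * 1024 });

  stream.on('data', function (chunk) {
    // Socket.IO sends Buffers as binary frames; the browser receives an ArrayBuffer
    socket.emit('video-chunk', chunk);
  });

  stream.on('end', function () {
    socket.emit('video-end');
  });
});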
Also, I suggest looking at WebRTC (a technology associated with WebSockets). Web Real-Time Communication (WebRTC) is another effort to enhance the communication capabilities of modern web browsers. WebRTC is a peer-to-peer technology for the Web. The first applications for WebRTC are real-time voice and video chat. WebRTC is already a compelling new technology for media applications, and there are many sample applications available online that enable you to test this out with video and audio over the Web. Please check this link.