I want to stream my webcam to my website, maybe inside a <video>, <embed>, or even <img> tag.
What would be the best way? I've tried the VLC player and WebRTC.
The problem with VLC is that I cannot manage to stream outside my LAN. Inside my LAN it works fine, but I cannot connect over the internet. I also don't know how to embed it in my website.
The problem with WebRTC is that it is not supported by iOS. It works fine on Windows and Android, but fails to run on iPhones and iPads (I haven't tried it on a Mac, but I think it will work there).
Maybe there is some code in JavaScript, HTML, or something else?
For iOS devices, WebRTC is not fully supported. However, you CAN build the native C/C++ API for them and create a locally running application that can access your stream (not from a webpage, but that is currently your only option on iOS).
The OS X, Windows, and Android versions of Chrome should all work well together (I have tested this myself and have gotten streams to work between all those devices without issue) and access your stream.
For grabbing your camera's stream, you could grab it through Chrome, stream continually, and dynamically handle peer connections as they come in from your signalling server (you can reuse the same local stream for numerous peer connections). You could also use a natively built application using the WebRTC API, but that will be more complex than manipulating your stream in Chrome.
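A minimal sketch of that idea in JavaScript — one local camera stream shared by many peer connections. The `signaling` object and every name below are illustrative assumptions, not a real API; wire them to your own signalling server:

```javascript
// Sketch: share one getUserMedia stream across many RTCPeerConnections.
// The signaling object (with a hypothetical send() method) is an assumption.

const peers = new Map(); // viewerId -> RTCPeerConnection

function addViewer(viewerId, localStream, signaling, config) {
  const pc = new RTCPeerConnection(config);
  // Reuse the single local stream's tracks for this new peer;
  // the camera itself is only opened once.
  localStream.getTracks().forEach(track => pc.addTrack(track, localStream));
  pc.onicecandidate = e => {
    if (e.candidate) signaling.send({ to: viewerId, candidate: e.candidate });
  };
  peers.set(viewerId, pc);
  return pc;
}

function removeViewer(viewerId) {
  const pc = peers.get(viewerId);
  if (pc) {
    pc.close();
    peers.delete(viewerId);
  }
}
```

Offer/answer negotiation is omitted; each new viewer would trigger a `pc.createOffer()` (or receive one) through the same signalling channel.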
I'm currently working on a web application whose main purpose is streaming/timeshifting TV channels. The application is written in JavaScript with the React framework, and for the web player we are using CookPete's ReactPlayer with hls.js integrated. We have managed to play live TV channels successfully, but unfortunately we are experiencing some issues with timeshifted channels.
Live streams are distributed by an XtreamUI server as m3u8 playlists, in this kind of format:
example.org/live/username/password/channel_1.m3u8
So when a user is watching live TV, this kind of URL goes into the player source, and the CookPete player + hls.js do their magic parsing/processing the m3u8 playlist, which results in the video playing flawlessly.
Here comes the problem: for timeshift, XtreamUI uses this kind of URL: example.org/streaming/timeshift.php?username=XXX&password=XXX&stream=2&start=2020-04-26:19-23&duration=7
As you can see, it's a PHP script which streams raw bytes into the player. Here are the response headers from /streaming/timeshift.php.
As you can notice, the Content-Type is video/mp2t, which for some reason cannot be played in the browser environment (Google Chrome, Mozilla Firefox, IE 11). This warning pops up.
On the other hand, in Safari on Mac the video plays completely normally, but the request from Safari is a little different. This is a screenshot from Safari's console network tab. As you can see, there are several requests with different byte ranges.
We are seeking a solution which will play timeshifted video (video/mp2t content) in Google Chrome, Mozilla Firefox, and IE 11. All suggestions/advice are welcome.
the Content-type is video/mp2t which for some reason cannot be played in the browser environment
This is because Chrome and Firefox do not support MPEG transport streams, while Safari does. hls.js works because it knows how to read a binary TS file and rewrite it as an MP4 fragment before sending it to the Media Source Extensions buffer. You will need to do the same. Take a look at mux.js.
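A rough sketch of that approach, with several assumptions: mux.js is loaded globally (e.g. via a script tag), the codec string matches the actual stream, and fetching the whole response at once is a simplification — a real player would fetch in chunks the way Safari does with byte ranges:

```javascript
// Sketch: transmux the MPEG-TS bytes from timeshift.php into fragmented MP4
// for Media Source Extensions, using mux.js. All wiring here is illustrative.

// Join an fMP4 init segment and a media segment into one buffer.
function concatSegment(initSegment, data) {
  const out = new Uint8Array(initSegment.byteLength + data.byteLength);
  out.set(initSegment, 0);
  out.set(data, initSegment.byteLength);
  return out;
}

async function playTimeshift(video, url) {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);
  await new Promise(r => mediaSource.addEventListener('sourceopen', r, { once: true }));

  // The codec string is an assumption; derive the real one from your stream.
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f, mp4a.40.2"');

  const transmuxer = new muxjs.mp4.Transmuxer();
  transmuxer.on('data', seg => sb.appendBuffer(concatSegment(seg.initSegment, seg.data)));

  const resp = await fetch(url);
  transmuxer.push(new Uint8Array(await resp.arrayBuffer()));
  transmuxer.flush();
}
```

Note this will not help IE 11, which lacks fetch and most of this API surface; Chrome and Firefox are the realistic targets.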
In Firefox (on Linux), install a codec by running this command:
sudo apt-get install libavcodec58
Install a similar codec for Chrome too. It should fix the problem.
I am attempting to record audio samples from the user in a PWA. The application runs great on my Mac, with the blobs containing full data, and there is no problem playing the audio on the Mac. When running the PWA on my phone, the blobs have 0 size, and nothing happens when playing the audio.
React.js is used for the PWA. Before updating my Chrome to the latest version (78.0.1304.108), recording had no problems. After the upgrade, my attempts to find a solution failed. Phones with older versions work.
The following packages and JavaScript libraries were used: React-Mic, @cleandersonlobo/react-mic, React-Audio-Recorder (with getUserMedia and MediaRecorder).
OK, so I solved this issue. It turns out I had used up all the audio input streams that Chrome allows, and because Chrome doesn't display an error about this, it was hard to find. All I did was handle the input streams better, releasing them when done.
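A sketch of the kind of fix described above — the key point is stopping every track when a recording ends, so Chrome releases the audio input instead of exhausting its limit on concurrent streams. All names here are illustrative, not from the question:

```javascript
// Stop every track of a stream so the browser releases the audio input.
function releaseStream(stream) {
  stream.getTracks().forEach(track => track.stop());
}

// Record one clip of the given duration, then free the input stream.
async function recordOnce(durationMs) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = e => chunks.push(e.data);
  const stopped = new Promise(r => (recorder.onstop = r));
  recorder.start();
  setTimeout(() => recorder.stop(), durationMs);
  await stopped;
  releaseStream(stream); // without this, later recordings may get dead tracks
  return new Blob(chunks, { type: recorder.mimeType });
}
```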
I'm making a platform where the server triggers audio playback via Socket.io. I'm trying to make a PWA for iOS using similar code, but it just doesn't work, and I've heard that Safari requires user interaction (I am using this with the full-screen PWA meta tag, not through Safari directly). It works fine in the latest version of Chrome on desktop.
Is there any way to make this work?
Web interface (uses the same audio playing and Socket.io code): https://github.com/archiebaer/bithop-web-interface
No, Safari restricts audio playback quite a lot. It seems that in your scenario there is unfortunately no workaround.
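That applies to playback that is purely server-triggered. If the app can capture at least one tap at launch, a common partial mitigation is to "unlock" an audio element inside that gesture handler and reuse it for later Socket.io-triggered playback. A minimal sketch (whether this fits depends entirely on getting that first tap):

```javascript
// Sketch: call play() inside a user-gesture handler once, so Safari
// treats the element as user-activated for later programmatic play() calls.
function unlockAudio(audioEl) {
  const unlock = () => {
    audioEl.muted = true;
    audioEl.play().catch(() => {}); // ignore rejection if still blocked
    audioEl.muted = false;
    document.removeEventListener('touchend', unlock);
  };
  document.addEventListener('touchend', unlock);
}
```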
I need to send a live stream from PC to PC, both of them using just the web browser (IE, Firefox, or Chrome). Is there a JavaScript library that could help me push the stream from the sender to a media server (ffmpeg/ffserver, Wowza, etc.)?
I guess you want to stream a video signal from the webcam. Then the way to go is to use WebRTC, but it is still very new (the Wowza server just started to support it) and it is only supported in some modern browsers, so you will encounter many issues.
Most of the existing solutions still use Flash to capture from the webcam and encode in RTMP.
We are working on an IP-camera Android app that should stream the video captured in real time by the Android camera to a web page served by the same app and accessed over WiFi only.
The app currently uses a pseudo-streaming method (an image sent over HTTP with no-store), but it is not robust enough, so we need to change it for a better streaming method. We also need to support multicast (or at least an optimized "multi-unicast"), and if possible use a UDP-based protocol (or at least a low-latency TCP one).
We cannot use any intermediary server (so no Wowza or the like, unless it is also served by the app) or any browser plugin (so no VLC or the like, unless it is served by the app too). The main browser it is used on is Chromium.
We searched for and tried a lot of methods, but none worked for us:
WebRTC sounds cool, but it uses an intermediary signaling server, it doesn't support multicast, and it is kind of heavy for what we want.
RTSP with libstreaming sounds cool too, but no browser seems to implement it, and we couldn't find a JavaScript library to do it.
RTMP works in most browsers, but we couldn't find a working Android library.
Which streaming method would best fit our needs, and do you know JavaScript and Android libraries implementing them?
There is no way to stream multicast to a browser.