How to detect when real-time web-streamed video is frozen in WebRTC using JavaScript?

While creating a video-call app using WebRTC, I ran into an issue: when the remote streamed
video disconnects due to a network problem, the video freezes.
How can I detect that a real-time web-streamed video has frozen in WebRTC?
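One way to detect this is to watch the inbound video statistics: if framesDecoded stops increasing, the remote video is effectively frozen. Below is a minimal sketch (not from the original question) that polls RTCPeerConnection.getStats(); `pc` is assumed to be your existing peer connection and the 2-second interval is arbitrary.

```javascript
// Poll getStats() and flag the remote video as frozen when no new frames
// have been decoded since the previous check.
let lastFramesDecoded = 0;

setInterval(async () => {
  const stats = await pc.getStats();
  stats.forEach((report) => {
    if (report.type === 'inbound-rtp' && report.kind === 'video') {
      if (report.framesDecoded === lastFramesDecoded) {
        console.warn('Remote video appears frozen (no new frames decoded)');
        // e.g. show a "reconnecting..." overlay or trigger a reconnect here
      }
      lastFramesDecoded = report.framesDecoded;
    }
  });
}, 2000);
```

Listening to pc.oniceconnectionstatechange for the 'disconnected'/'failed' states is a useful complementary signal for the network-drop case described above.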

Related

Interacting with audio from the entire device with a PWA

I am making a PWA and I want to interact with audio from sources other than my PWA, for visualization and equalization of the sound. Does anyone have any leads on how to interact with the system audio?
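There is no general web API for capturing arbitrary system audio, but a hedged sketch of the closest option is getDisplayMedia() with audio enabled (tab or system audio, depending on browser and OS), feeding the captured track into an AnalyserNode for visualization:

```javascript
// Sketch: capture tab/system audio via the screen-share prompt and analyse it.
async function visualizeCapturedAudio() {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true, // most browsers require a video track in the request
    audio: true, // tab/system audio where supported
  });

  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(stream);
  const analyser = audioCtx.createAnalyser();
  source.connect(analyser);

  const data = new Uint8Array(analyser.frequencyBinCount);
  (function draw() {
    analyser.getByteFrequencyData(data);
    // ...render `data` to a canvas for the visualization/equalizer UI...
    requestAnimationFrame(draw);
  })();
}
```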

Real time audio web to desktop application

I'm trying to send real-time audio messages from a website to a desktop application (C#). It's for a public address system, so the desktop application just receives the audio stream and plays it through the speakers.
Is it possible? What can I use for that purpose?
Can WebRTC be used for that?
Thanks.
WebRTC is used for browser-based peer-to-peer communication. For your requirement, you can use WebRTC for real-time audio messaging and Electron for the user's desktop app. Another approach would be to use Socket.IO for the real-time audio messaging.
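A rough browser-side sketch of the WebRTC approach (the signaling object below is an assumption; in practice it could be a small Socket.IO server relaying offers, answers, and ICE candidates to the desktop app):

```javascript
// Capture the microphone and push it to the desktop peer over WebRTC.
async function startBroadcast(signaling) {
  const pc = new RTCPeerConnection();
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
  mic.getTracks().forEach((track) => pc.addTrack(track, mic));

  pc.onicecandidate = ({ candidate }) => {
    if (candidate) signaling.send({ type: 'candidate', candidate });
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send({ type: 'offer', sdp: pc.localDescription });

  signaling.on('answer', (answer) => pc.setRemoteDescription(answer));
}
```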

Using Webrtc and Webspeech API simultaneously

Currently I'm using WebRTC to stream both video and audio in Chrome. At the same time I'm using Chrome's Web Speech API to get the audio into text. This works fine in desktop Chrome, but when I try this on Android Chrome, the Web Speech API's onresult function doesn't fire if WebRTC is streaming both video and audio.
Is there a way to have WebRTC stream both audio and video and have Web Speech recognize speech at the same time?
Thanks
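For reference, a minimal sketch of running the two APIs side by side looks like the following; it works on desktop Chrome, but as noted above, on Android Chrome onresult may never fire while getUserMedia holds the microphone:

```javascript
// Start a WebRTC media capture and speech recognition at the same time.
const SpeechRecognition = window.SpeechRecognition || window.webkitSpeechRecognition;

async function startBoth() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  // ...add `stream`'s tracks to the RTCPeerConnection as usual...

  const recognition = new SpeechRecognition();
  recognition.continuous = true;
  recognition.onresult = (event) => {
    const latest = event.results[event.results.length - 1];
    console.log('Recognized:', latest[0].transcript);
  };
  recognition.start();
}
```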

Setting the audio Source in webrtc to speaker instead of mic

I am writing a WebRTC app and I want the audio source while sharing the screen to be the speaker (the machine's internal audio) instead of the mic. I am using Firefox as the browser.
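A hedged sketch of the usual approach: request audio together with the screen via getDisplayMedia(). Whether that yields tab/system audio rather than the mic depends on the browser and OS, and Firefox's support for audio capture here is limited, which may well be the blocker:

```javascript
// Ask for the screen plus tab/system audio and add the tracks to the call.
async function shareScreenWithSystemAudio(pc) {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true, // tab/system audio where the browser supports it
  });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
}
```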

Keep alive webRTC audio stream on Android

I have built a webRTC application that streams audio.
The application works as intended on all devices and the client is able to hear the audio stream. At a very high level, the RTC stream is simply attached to an audio element which works great.
The problem: I am trying to utilize the Android Chrome background audio feature. At the moment the stream keeps playing in the background (even when Chrome is minimized), but about 5 seconds after screen timeout/lock, the peer connection is closed. This is not a memory issue (I have several test devices, including a Galaxy S7).
In contrast, if I simply point to the URL of an MP3 file, the audio context keeps playing indefinitely. Is there a way to achieve this indefinite background playback with a WebRTC stream?
Cheers in advance!
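For context, the "stream attached to an audio element" setup described above is roughly the following (a minimal sketch, assuming an existing <audio> element and peer connection):

```javascript
// Attach the incoming remote stream to an audio element and start playback.
pc.ontrack = (event) => {
  const audioEl = document.querySelector('audio');
  audioEl.srcObject = event.streams[0];
  audioEl.play();
};
```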
Looks like this old bug made its way back into Chromium:
https://bugs.chromium.org/p/chromium/issues/detail?id=951418
Verified resolved in issue 513633 with no background logic required: https://bugs.chromium.org/p/chromium/issues/detail?id=513633
