I am making a PWA and want to interact with audio from sources other than my PWA, for visualization and equalization of the sound. Does anyone have any leads on how to interact with the system audio?
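The closest a web app can currently get to system audio is screen capture with audio enabled. Here is a minimal sketch, assuming Chrome (other browsers may not expose tab/system audio through this API) and assuming the user checks "Share audio" in the picker:

```js
// A minimal sketch: capture tab/system audio via getDisplayMedia and feed it
// into an AnalyserNode for visualization. getDisplayMedia({ audio: true }) is
// the closest a web app can get to "system audio" today.
async function visualizeSystemAudio() {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true, // the API generally requires video even if we only want audio
    audio: true, // only delivered if the user checks "Share audio"
  });

  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(stream);
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser);

  const data = new Uint8Array(analyser.frequencyBinCount);
  (function draw() {
    analyser.getByteFrequencyData(data);
    // ...feed `data` into your visualization / EQ UI here...
    requestAnimationFrame(draw);
  })();
}
```

For equalization, you could insert a chain of BiquadFilterNode instances between the source and the analyser.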
While I was creating a video-call app using WebRTC, I ran into an issue: when a remote video stream disconnects due to a network problem, the video freezes.
How can I detect in real time that a web-streamed video has frozen in WebRTC?
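One common approach, sketched below under the assumption that `pc` is your RTCPeerConnection, is to poll getStats() and treat the video as frozen when the decoder's frame counter stops advancing:

```js
// A minimal sketch: poll getStats() and report the remote video as frozen
// when framesDecoded stops advancing between polls.
function watchForFrozenVideo(pc, onFrozen, intervalMs = 2000) {
  let lastFramesDecoded = -1;
  return setInterval(async () => {
    const stats = await pc.getStats();
    stats.forEach((report) => {
      if (report.type === 'inbound-rtp' && report.kind === 'video') {
        if (report.framesDecoded === lastFramesDecoded) {
          onFrozen(); // no new frames since the last poll
        }
        lastFramesDecoded = report.framesDecoded;
      }
    });
  }, intervalMs);
}
```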
Hey guys, I am trying to create a streaming system using JavaScript where the system can receive both sound playing from my PC and sound entering my PC, like a mic, and then stream both sounds live in the browser so that multiple users can listen to it. This is more like a radio broadcasting session where you can play sound and talk at the same time, and the listener gets both sounds without any latency, or at least none that is noticeable. Is this possible to achieve using JavaScript, Node.js, and React? If yes, any guide at all would help, thank you.
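Mixing the two sources in the browser is the straightforward part. Here is a minimal sketch, assuming Chrome, where "sound playing from my PC" is approximated by sharing a tab or screen with audio enabled; distributing the mixed stream to listeners (e.g. over WebRTC via a Node.js signaling server) is a separate concern:

```js
// A minimal sketch: mix the mic and a shared tab's audio into one
// MediaStream that can then be sent to listeners.
async function makeBroadcastStream() {
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
  // The closest web equivalent of "sound playing from my PC" is sharing a
  // tab/screen with "Share audio" checked; otherwise this has no audio track.
  const desktop = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true,
  });

  const ctx = new AudioContext();
  const mix = ctx.createMediaStreamDestination();
  ctx.createMediaStreamSource(mic).connect(mix);
  ctx.createMediaStreamSource(desktop).connect(mix);

  return mix.stream; // add this stream's audio track to your RTCPeerConnection
}
```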
I have built a WebRTC application that streams audio.
The application works as intended on all devices and the client is able to hear the audio stream. At a very high level, the RTC stream is simply attached to an audio element, which works great.
The problem: I am trying to utilize the Android Chrome background audio feature. At the moment the stream keeps playing in the background (even when Chrome is minimized); however, about 5 seconds after screen timeout/lock, the peer connection is closed. This is not a memory issue (I have several test devices, including a Galaxy S7).
In contrast, if I simply point to the URL of an MP3 file, the audio keeps playing indefinitely. Is there a way to achieve this indefinite background playback with a WebRTC stream?
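For context, the attach step mentioned above is roughly the following (a minimal sketch; `pc` is the RTCPeerConnection and the element id is a placeholder):

```js
// Attach the incoming remote track to an <audio> element.
pc.ontrack = (event) => {
  const audioEl = document.getElementById('remote-audio');
  audioEl.srcObject = event.streams[0];
  audioEl.play(); // on mobile this may require a prior user gesture
};
```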
Cheers in advance!
Looks like this old bug made its way back into Chromium:
https://bugs.chromium.org/p/chromium/issues/detail?id=951418
Verified as resolved in issue 513633, with no background logic required: https://bugs.chromium.org/p/chromium/issues/detail?id=513633
I have a big challenge.
I have been searching the internet for content that could help me with this question: I need to develop a multitrack player like those used in music recording studios.
The idea is to play 3 or more MP3 files at the same time, simultaneously and synchronized, like the example player on this page: http://www.multitracks.com/songs/Steffany-Gretzinger/The-Undoing/Constant-One/
OK. I have used the HTML5 <audio> element. There is zero latency for a user on a PC, but when testing on a cell phone I realized the audio is out of sync, because there is a minimum latency of 1 second or more before each <audio> tag starts playing.
I have also tested the Web Audio API and derivatives. This API is amazing, and there are several ways to develop what I want with it; however, the support is only for users with Chrome.
So the challenge is this: how do I develop a multitrack player like the sample page, where the audio tracks play without latency and the entire script is compatible with all browsers?
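For the browsers that do support it, the Web Audio API solves the sync problem by scheduling every track against the same clock. A minimal sketch (the file names are placeholders):

```js
// A minimal sketch: decode every MP3 up front, then schedule all buffers to
// start at the same AudioContext time, keeping them sample-accurately in
// sync (unlike multiple <audio> tags).
async function playInSync(urls) {
  const ctx = new AudioContext();
  const buffers = await Promise.all(
    urls.map(async (url) => {
      const res = await fetch(url);
      return ctx.decodeAudioData(await res.arrayBuffer());
    })
  );

  const startAt = ctx.currentTime + 0.1; // small offset so all sources are armed
  for (const buffer of buffers) {
    const src = ctx.createBufferSource();
    src.buffer = buffer;
    src.connect(ctx.destination);
    src.start(startAt); // every track starts on the same clock tick
  }
}

// playInSync(['drums.mp3', 'bass.mp3', 'vocals.mp3']);
```

The key design point is that `src.start(startAt)` schedules against the AudioContext's own clock, so all sources begin on the same tick regardless of how long each file took to decode.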
I want to make a very simple web app that takes the audio signal from the phone's microphone and varies some simple graphics on the page in as near to real-time as possible.
I'd prefer the app to be web-based (rather than an iOS/Android app) so that users can be directed to the page more quickly (i.e., go to sh.ort/url and the web app starts straight away).
Is this currently possible?
I'm not sure how well supported the Web Audio API or some of the more modern HTML5 features are.
Thanks
This gives you a nice tutorial on how to record audio through the HTML5 API: http://www.html5rocks.com/en/tutorials/getusermedia/intro/
Unfortunately, as seen in this post, Apple doesn't support any API for this yet: Mobile Safari Audio Recording from Microphone
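Where getUserMedia is available, the real-time part is straightforward. A minimal sketch using the modern navigator.mediaDevices API (the opacity effect is just a placeholder for your graphics):

```js
// A minimal sketch: read the mic level via an AnalyserNode and update the
// page on every animation frame.
async function startMicVisualizer() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext(); // may need a user gesture to start
  const analyser = ctx.createAnalyser();
  ctx.createMediaStreamSource(stream).connect(analyser);

  const data = new Uint8Array(analyser.frequencyBinCount);
  (function draw() {
    analyser.getByteTimeDomainData(data);
    // crude loudness estimate: max deviation from the 128 midpoint
    const level = Math.max(...data.map((v) => Math.abs(v - 128))) / 128;
    document.body.style.opacity = 0.5 + level / 2; // placeholder effect
    requestAnimationFrame(draw);
  })();
}
```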