Show Audio of Server Side Sound Card - javascript

Good Day,
I have a USB audio device connected to my web server (an RPi) running Linux, Apache2 and PHP.
I would like to display the sound level from the mic connected to the server on a webpage. I came across an HTML5 example which uses:
navigator.getUserMedia
However, that captures a microphone on the client side.
Can someone advise how I could build a similar page that instead displays the level of the server-side audio?
Thanks,
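
Since navigator.getUserMedia only ever sees the visitor's own microphone, the level has to be measured on the Pi itself and pushed to the page. Below is a minimal sketch of that idea, assuming Node.js with the ws package and ALSA's arecord on the server; the device name plughw:1,0 and port 8081 are placeholders you would have to adjust (see arecord -l for your USB card).

    // level-server.js -- hypothetical sketch, not a drop-in solution.
    const { spawn } = require('child_process');
    const { WebSocketServer } = require('ws');

    const wss = new WebSocketServer({ port: 8081 });

    // Capture raw 16-bit mono PCM from the USB mic via ALSA.
    const rec = spawn('arecord', [
      '-D', 'plughw:1,0', '-f', 'S16_LE', '-r', '8000', '-c', '1', '-t', 'raw'
    ]);

    rec.stdout.on('data', (buf) => {
      // Compute the RMS level of this chunk of 16-bit samples.
      let sum = 0;
      let n = 0;
      for (let i = 0; i + 1 < buf.length; i += 2) {
        const s = buf.readInt16LE(i) / 32768;
        sum += s * s;
        n++;
      }
      const rms = n ? Math.sqrt(sum / n) : 0;
      // Broadcast the level to every connected page.
      for (const client of wss.clients) {
        if (client.readyState === 1 /* OPEN */) client.send(rms.toFixed(3));
      }
    });

On the page, a plain WebSocket client (new WebSocket('ws://your-pi:8081')) can then feed each message into a meter element or a canvas bar.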

Related

How to stream both pc sound and incoming sound over a network

Hey guys, I'm trying to create a streaming system using JavaScript where the system can capture both the sound playing from my PC and the sound entering my PC (like a mic), and then stream both sounds live to the browser, allowing multiple users to listen. This is more like a radio broadcasting session where you can play sound and also talk at the same time, and the listener gets both sounds without any latency, or if there is any it shouldn't be noticeable. Is this possible to achieve using JavaScript (Node.js and React)? If yes, any guide at all would help. Thank you.
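
The browser cannot capture everything the sound card plays (see the answer to the next question), but audio the page itself plays can be mixed with the mic using the Web Audio API. A rough sketch of that part, in which the audio element id "music" is made up for the example:

    // Mix the mic with music the page itself is playing into one MediaStream.
    // The combined stream could then be attached to an RTCPeerConnection or a
    // MediaRecorder to reach the listeners.
    async function buildBroadcastMix() {
      const ctx = new AudioContext();
      const mix = ctx.createMediaStreamDestination();

      // The broadcaster's microphone.
      const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
      ctx.createMediaStreamSource(mic).connect(mix);

      // The music track played by the page's own <audio id="music"> element.
      const music = ctx.createMediaElementSource(document.getElementById('music'));
      music.connect(mix);             // into the outgoing mix
      music.connect(ctx.destination); // and still audible to the broadcaster

      return mix.stream; // a normal MediaStream carrying mic + music
    }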

How to allow multiple audio streams to multiple listeners

How can I allow multiple users to stream the audio coming from their machine over the network to multiple listeners? I mean taking all the sound from their sound card to the network.
I know this can be accomplished using Icecast, Edcast, etc., but that only works when the user installs those programs on their device and starts making configurations, which is a lot of work.
What I need is to do this without Icecast, with just JavaScript. If we use WebRTC it will be more like a voice call, I guess,
but I need the audio streamed from device A to device B as if it were already on device B. I am talking about playing music on the device and sound from the mic at the same time. Is this possible with JavaScript? And can multiple users do this stream at the same time?
taking all the sound from their soundcard to the network [...] without icecast just javascript
There is no Web API for capturing all sound the computer plays that I know of.
You could maybe make something like this with WebRTC or other web APIs if the user's sound drivers expose a recording device (like the "Stereo Mix" recording device of olden days) that the user selects for your web app to use.
(As an aside, Icecast itself doesn't care where the audio comes from, it just accepts and redistributes OGG or MP3 streams. It's up to the casting client to figure out where the audio comes from.)
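
A sketch of the recording-device idea above, assuming the user's drivers do expose such a loopback device: enumerate the audio inputs and open the one the user picks by its deviceId. The function name here is only for illustration.

    async function pickInputAndCapture(deviceId) {
      // Note: device labels are only filled in after mic permission is granted.
      const devices = await navigator.mediaDevices.enumerateDevices();
      const inputs = devices.filter((d) => d.kind === 'audioinput');
      console.log(inputs.map((d) => `${d.label} (${d.deviceId})`));

      // Open whichever input the user chose (e.g. a "Stereo Mix" style device).
      return navigator.mediaDevices.getUserMedia({
        audio: { deviceId: { exact: deviceId } },
      });
    }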

How can I enable fast file transfers across devices connected over the same network?

I am building a PDF-making solution in which a user captures photos with their camera and can make a PDF out of those photos. Now my client wants a feature where you can share photos captured on one device with another device (for example, the user captures photos on a mobile device and now wants those photos on another device, maybe a computer). The whole app is a progressive web app, so pretty much all functionality runs offline. I thought about building a web server, uploading photos on prompt from the mobile device, and maybe providing some way for the PC to connect and download those files. But it turns out that gets too slow.
But then I thought: there are some web apps which allow sharing large files after connecting to the same Wi-Fi network. How do they do that? For now that seems like the solution to my problem. Can anyone help?
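
Web apps of that kind commonly use WebRTC data channels, so the bytes travel device-to-device over the local network instead of through a server. A minimal sketch of the sending side only; signalling (exchanging the offer/answer and ICE candidates, e.g. via your existing web server) is left out, and the channel name, chunk size and end-of-file marker are made up:

    const pc = new RTCPeerConnection();
    const channel = pc.createDataChannel('photos'); // name is arbitrary

    async function sendFile(file) {
      const CHUNK = 16 * 1024; // small chunks; for big files watch channel.bufferedAmount
      const bytes = new Uint8Array(await file.arrayBuffer());
      for (let offset = 0; offset < bytes.length; offset += CHUNK) {
        channel.send(bytes.slice(offset, offset + CHUNK));
      }
      channel.send('done'); // hypothetical end-of-file marker for the receiver
    }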

Is it possible to save video call via WebRTC or web socket

I am learning JavaScript by developing some cool stuff like a video conferencing app. I don't have much understanding of WebRTC technology, so I am wondering if it is also possible to save a video call that has taken place in a WebRTC-based video conferencing app to the server?
Yes, it is possible. You can record each stream to blobs (e.g. with MediaRecorder) and push them to your server over a WebSocket. You can then convert the blobs to a webm file.
Demo: https://webrtc.github.io/samples/src/content/getusermedia/record/
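
A sketch of that approach on the browser side, assuming MediaRecorder support; the server URL is a placeholder, and the server would simply append the chunks to a file that ends up as a playable .webm:

    function recordToServer(stream) {
      const socket = new WebSocket('wss://example.com/record'); // placeholder URL
      const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

      recorder.ondataavailable = (event) => {
        if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
          socket.send(event.data); // each chunk is a Blob of webm data
        }
      };

      recorder.start(1000); // emit a chunk roughly every second
      return recorder;      // call .stop() when the call ends
    }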

Real time audio web to desktop application

I'm trying to send real-time audio messages from a website to a desktop application (C#). It's for a public address system, so the desktop application just receives the audio stream and plays it through the speakers.
Is it possible? What can I use for that purpose?
Can WebRTC be used for that?
Thanks.
WebRTC is used for browser-based peer-to-peer communication. For your requirement, you can use WebRTC for real-time audio messaging and Electron for the desktop app. Another approach would be to use Socket.IO to carry the audio messages in real time.
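
A sketch of the Socket.IO approach from the answer, browser side only; the server URL and the 'pa-audio' event name are placeholders, and the desktop app would subscribe to the same event and play the received chunks. It assumes the socket.io client library is loaded on the page.

    const socket = io('https://example.com'); // placeholder server URL

    async function startAnnouncement() {
      const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
      const recorder = new MediaRecorder(mic, { mimeType: 'audio/webm' });

      recorder.ondataavailable = (event) => {
        if (event.data.size > 0) socket.emit('pa-audio', event.data); // Blob chunk
      };

      recorder.start(250); // small chunks keep the delay low
    }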
