Live stream a single / static audio file using Azure Media Services - javascript

I'm struggling to figure out how to implement live streaming of an audio file through Azure Media Services. What I'm trying to do is have a single/static audio file that live streams and repeats when it reaches the end of the file.
The idea is a radio station type experience: when the user starts listening, the audio begins playing at the file's current position in the live stream.
I have very limited knowledge of codecs, streaming types, and encoding. That said, I was able to upload my mp3 file to Azure Media Services, encode it using "AAC Good Quality Audio", and I am able to play the audio clip. However, I want to enable streaming to ensure the experience I described above.
The last piece of this will be delivered through a responsive website, so I would like to enable the stream using HTML5 so it's playable on all devices that support it (desktop, mobile, tablet, etc.). Is there an HTML5/JavaScript player that is able to do this? Flash/Silverlight is not an option since it won't render on mobile or tablets.
If I can provide any further information please let me know. Most/all of the articles I see about live streaming are about video, and I'm struggling to find how to do this with audio. Any help would be greatly appreciated.
Thanks!

There is no such thing as live streaming a single audio file. Live streaming implies a live event: something that is happening and, while it happens, is being streamed. It doesn't matter whether it is audio, audio + video, or video only.
Using only Azure Media Services you cannot achieve this goal. You would need a process that plays the media on repeat and streams it to a live streaming channel of Azure Media Services.
But that would be a rather expensive exercise! For your needs, a more cost-effective approach would be to run a streaming server on a Linux VM, such as Icecast (http://icecast.org/).
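On the client side, an Icecast mount point can be played with a plain HTML5 audio element, which covers the cross-device requirement in the question. A minimal sketch, assuming a hypothetical server host and mount name:

```html
<!-- Plays the live Icecast mount; the URL is a placeholder for your server and mount point -->
<audio controls preload="none">
  <source src="http://your-vm.example.com:8000/radio.mp3" type="audio/mpeg">
</audio>
```

Because the server decides what is playing, every listener who connects hears the loop at its current position, which is the radio-station behavior described in the question.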

You can also send one file, and Azure can transcode it to various formats for you. I have a full list of tutorials I wrote on Azure Media Services:
Intro to HTML5 video
Intro to Azure Media Services, AES, and PlayReady DRM
Live streaming HTML5 video using Azure Media Services
Using Azure Blob Storage to store & serve your audio and video files
Use this Azure Media Player for streaming Media Services video to all devices
Uploading video to Azure Media Services
In terms of the encoding, Azure can do that for you. Give it one file, and it can create multiple copies in various formats for you. I have a short post and a 10-minute video on how to do that.
There is also an Azure Media Player, and its selling point is that it adapts the video stream based on the device it detects the player running on, which pairs nicely with the format conversions listed above. It saves you the trouble of having to write the fallback conditions yourself (e.g., running on an iOS device, so use HLS).
You can use any of this for audio as well. Your best bet is to set loop = true on the player, as both the HTML5 audio and video elements support this attribute.
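With a plain HTML5 element that is a one-attribute change. A minimal sketch, where the URL is a placeholder for your Media Services streaming locator:

```html
<!-- The loop attribute restarts playback automatically when the file ends -->
<audio src="https://yourmediaservice.streaming.mediaservices.windows.net/locator/audio.mp3"
       loop controls></audio>
```

Note that loop restarts the file per listener; it does not synchronize listeners the way a true live channel does.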
Let me know if you need more info.

Related

Is it possible to save a video call via WebRTC or WebSocket

I am learning JavaScript by developing some cool stuff, like a video conferencing app. I don't have much understanding of this WebRTC technology, so I am wondering whether it is also possible to save to the server a video call that has taken place in a WebRTC-based video conferencing app?
Yes, it is possible. You can record each stream to blobs and push them to your server over a WebSocket. You can then convert the blobs to a WebM file.
Demo: https://webrtc.github.io/samples/src/content/getusermedia/record/
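A minimal sketch of that approach, assuming a hypothetical WebSocket endpoint on your server:

```javascript
// Capture camera + microphone, record to WebM blobs, and push each chunk over a WebSocket
const socket = new WebSocket('wss://example.com/record'); // hypothetical endpoint

navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then((stream) => {
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // each event.data is a Blob of WebM data
    }
  };

  recorder.start(1000); // emit a chunk roughly every second
});
```

On the server, appending the received chunks in arrival order yields a playable WebM file.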

I want to use Azure Media Services for video recording and streaming

I want to record video using the webcam in JavaScript and stream this video using Azure Media Services. I have gone through the Media Services documentation but have not been able to solve my issue. I don't want to upload the video directly to the Azure portal. Can anyone help me record video in JavaScript and stream it using Azure Media Services?
Streaming directly from a browser would require a protocol like WebRTC for ingesting. We do not currently support WebRTC for live ingest streaming. Today we only support RTMP or Smooth Streaming (Fragmented MP4 Ingest).
For details on how to do live streaming see https://learn.microsoft.com/azure/media-services/latest/live-streaming-overview
For a list of supported encoders see https://learn.microsoft.com/en-us/azure/media-services/latest/recommended-on-premises-live-encoders

Video streaming to browser on iOS

I have implemented video streaming from a Java server to a website using WebSockets and Media Source Extensions (JavaScript). This works fine for nearly every browser on several operating systems except iOS. I am aware of the fact that MSE is not supported on iOS (yet).
Is there any way to easily enable video streaming for iOS clients using the same (already existing) technology via web sockets?
I think of something similar to Media Source Extensions, so that I just have to reimplement the client side.
My workflow is:
Create a HTML5 video element and Media Source
Create a new web socket and request video data from the server
Transcode video using FFmpeg and stream the result to stdout
Send the binary video data in chunks to the client
Add the binary video data to the SourceBuffer of the MediaSource linked to the HTML5 <video> element (a rough sketch of this client side follows).
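For reference, the client side of steps 1, 2, and 5 looks roughly like this; the WebSocket URL and codec string are assumptions:

```javascript
// Client side: feed WebSocket chunks into a SourceBuffer via Media Source Extensions
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"');
  const queue = []; // chunks waiting while the SourceBuffer is busy

  const socket = new WebSocket('wss://example.com/video'); // hypothetical server
  socket.binaryType = 'arraybuffer';

  socket.onmessage = (event) => {
    queue.push(event.data);
    if (!sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  };

  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });
});
```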
Hoping for any advice.
If needed, you can use the <video> tag. Look under "Provide Alternate Sources": you can use an HTTP Live Stream (HLS), which iOS Safari plays natively.
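A minimal sketch of that fallback, with a placeholder playlist URL:

```html
<!-- Safari on iOS plays HLS natively from a <video> source -->
<video controls>
  <source src="https://example.com/stream/playlist.m3u8" type="application/vnd.apple.mpegurl">
</video>
```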

Obtaining audio data from an Android microphone using Cordova and JavaScript (without getUserMedia)

I need to obtain frequency/pitch data from the microphone of an Android device on the fly using JavaScript.
I have done this for desktop/laptop browsers with getUserMedia and Web Audio API, but these are not supported on the vast majority of Android devices.
I have tried using cordova-plugin-media-capture; however, this opens an audio recorder which the user can then save or discard, and after saving you can use cordova-plugin-file to obtain the data, as shown here: https://stackoverflow.com/a/32097634/5674976. But I need it not to open the audio recorder; instead there would perhaps be just a record button, and once recording starts, the audio data would be available immediately (so that the frequency data can be detected in real time).
I have seen recording functionality in place in, e.g., WhatsApp and Facebook Messenger, so as a last resort (since I do not know Java) would it be possible to create a Cordova plugin using Java?
edit: I have also looked at cordova-plugin-media (https://github.com/apache/cordova-plugin-media), which seems to provide amplitude data and current position data. I'm thinking I could figure out frequency by looking at the amplitude over time, or am I being naive?
I managed to record audio and also analyze the frequency without either getUserMedia or the Web Audio API on Android.
First I installed the cordova-plugin-audioinput plugin, which outputs a stream of audio samples from the microphone, with custom configuration such as buffer size and sample rate. You can then use this data to detect specific frequencies.
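A minimal sketch of that setup; the configuration values are illustrative and the pitch detection itself is left as a stub:

```javascript
// Raw microphone samples arrive as 'audioinput' events from cordova-plugin-audioinput
window.addEventListener('audioinput', (event) => {
  const samples = event.data; // array of audio samples for this buffer
  // Run pitch detection here, e.g. autocorrelation over the sample buffer
});

audioinput.start({
  sampleRate: 22050, // Hz; pick a rate high enough for the pitch range you need
  bufferSize: 4096   // samples delivered per event; smaller means lower latency
});
```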

JWPlayer: autodetect best quality for each user without using the Flash player (HLS, .m3u8) in other browsers?

My videos are hosted on Amazon S3, and I am using the JW Player 7 JavaScript version.
Amazon has a tool named Elastic Transcoder, and with it I transcoded videos at different qualities (1080p, 720p, etc.) to .m3u8 format for HLS streaming. It now autodetects the best quality for each user well.
But it uses the Flash player to render the video, and Flash isn't supported in Mozilla, IE, or Opera unless the Adobe Flash Player extension is installed in the browser.
I want to know whether it is possible to autodetect the right quality with an HTML5 player rather than via Flash.
JWPlayer runs in the browser and is one of many video players that can be used to play video files in a browser, such as those video files created by Elastic Transcoder. JWPlayer, not Elastic Transcoder, is autodetecting the best video format for the user, based on things like the browser version and the presence of a Flash plugin. JWPlayer supports HTML5.
If you wish to support HTML5 video, you need to configure Elastic Transcoder to generate HTML5-compatible video files (MP4 and WebM) and then add their URLs to the list of video sources in the JWPlayer configuration.
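A minimal sketch of such a configuration; the file URLs are placeholders for your Elastic Transcoder outputs:

```javascript
// JW Player falls back through the listed sources based on what the browser can play
jwplayer('player').setup({
  playlist: [{
    sources: [
      { file: 'https://your-bucket.s3.amazonaws.com/video/index.m3u8' },    // HLS (adaptive)
      { file: 'https://your-bucket.s3.amazonaws.com/video/video-720.mp4' }, // HTML5 fallback
      { file: 'https://your-bucket.s3.amazonaws.com/video/video.webm' }     // HTML5 fallback
    ]
  }]
});
```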
