I am trying to set up a live internet radio station using an Icecast server and I want my stream to work in all modern browsers. My source client only produces an .ogg stream, and not every browser plays .ogg. For example, the .ogg stream I have set up works in Chrome but doesn't work in IE. How can I make it work in all browsers?
Is there a way I can convert the .ogg stream to .mp3 or any other format on the fly?
Or can I embed an audio player in the page that is able to play the .ogg stream?
Any other advice would also be helpful.
Regards,
Hitesh Bhutani
You have several options:
1. Change the encoding format from OGG to MP3 in your Virtual DJ software. Keep in mind that on some platforms Firefox cannot play MP3 streams through the HTML5 audio tag due to licensing limitations.
2. Install transcoding software on your server (where you have Icecast installed and running), for example Liquidsoap (https://www.liquidsoap.info/). Liquidsoap can (among other things) take your stream as input and transcode it to several formats, for example MP3, AAC and OGG. Your Icecast server will then have several mount points available, for example http://yourserver.com:8000/stream.mp3, http://yourserver.com:8000/stream.ogg and http://yourserver.com:8000/stream.aac, and you can write a small piece of JavaScript that detects what the browser can play and chooses a suitable stream.
3. Use an HTML5 media player like jPlayer (http://jplayer.org/) or SoundManager 2 (http://www.schillmania.com/projects/soundmanager2/). These players detect what the browser can play and select a suitable stream type; if they can't play the stream using the HTML5 <audio> tag, they fall back to an internal Flash-based player.
The most robust approach is to combine methods (2) and (3); that gives you the widest browser support. A small format-detection sketch follows below.
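For illustration, here is a minimal format-detection sketch in JavaScript. It assumes the three mount points from option (2) above (the URLs are placeholders) and uses the standard audio.canPlayType() test instead of sniffing the browser version:

// Pick the first stream format this browser reports it can play.
// The mount point URLs are placeholders for whatever Icecast exposes.
var streams = [
  { url: 'http://yourserver.com:8000/stream.aac', type: 'audio/aac' },
  { url: 'http://yourserver.com:8000/stream.ogg', type: 'audio/ogg; codecs="vorbis"' },
  { url: 'http://yourserver.com:8000/stream.mp3', type: 'audio/mpeg' }
];
var audio = new Audio();
for (var i = 0; i < streams.length; i++) {
  // canPlayType() returns "", "maybe" or "probably"
  if (audio.canPlayType(streams[i].type) !== '') {
    audio.src = streams[i].url;
    audio.play();   // trigger this from a click handler if the browser blocks autoplay
    break;
  }
}

If none of the types match, you can hand the stream URLs to a Flash-based fallback player as in option (3).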
I'm trying to understand the limitations of Safari on iOS (12.3.1) for recording audio in the browser. Any speech I record, regardless of the codec or container, is much lower quality than equivalent audio recorded via a native iOS app.
Opus, WAV and MP3 files recorded within Safari on iOS (iPhone 8) seem capped at a certain quality, regardless of the encoder settings (e.g. bitrate, complexity, sample rate, channels). But when recording speech within a native app, I can generate consistently excellent results.
The audio quality within Safari is unaffected by: recording with different JavaScript recorders (WAV only) and JavaScript encoders (Opus and MP3); clearing the browser cache; reloading the JS from a private browsing window; and switching on the MediaRecorder API experimental feature.
I've spent the last few days experimenting with getUserMedia() on various devices. After analyzing the WAV files produced (waveform and spectrum analysis), here is what I've been able to learn from them.
Safari on iOS produces a decent WAV file (the default is 48 kHz, 16-bit, stereo), but no matter which constraints are passed to getUserMedia(), the audio spectrum contains no frequencies above 14 kHz.
So the WAV container is fine, but the quality of the audio written to it is about the same as the "medium" audio preset found in native iOS applications.
[Image: iOS WAV file spectrum]
The consequence of this is a very pronounced "telephone" effect on the audio, and a file that is unusable for professional audio purposes.
Android devices produce a similar result in the default configuration (a strong limitation of the audio spectrum), but by passing a set of constraints that disable processing features like these (a complete getUserMedia() call is sketched at the end of this answer):
autoGainControl: false,
echoCancellation: false,
noiseSuppression: false
we can achieve very acceptable audio quality.
[Image: Android 8 WAV file spectrum]
Unfortunately, these same settings do not allow iOS to achieve acceptable audio quality and as of now there seems to be no workaround available.
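For reference, a minimal sketch of the getUserMedia() call with those constraints might look like this (support for each constraint, and the resulting quality, varies by browser and device as described above):

// Request raw microphone input with the processing features disabled.
// Browsers that don't support a given constraint will simply ignore it.
navigator.mediaDevices.getUserMedia({
  audio: {
    autoGainControl: false,
    echoCancellation: false,
    noiseSuppression: false
  }
}).then(function (stream) {
  // Hand the stream to your recorder/encoder of choice here.
  console.log('Active settings:', stream.getAudioTracks()[0].getSettings());
}).catch(function (err) {
  console.error('getUserMedia failed:', err);
});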
I have implemented video streaming from a Java server to a website using WebSockets and Media Source Extensions (JavaScript). This works fine in nearly every browser on several operating systems, except on iOS. I am aware that MSE is not supported on iOS (yet).
Is there any way to easily enable video streaming for iOS clients using the same (already existing) technology via WebSockets?
I'm thinking of something similar to Media Source Extensions, so that I only have to reimplement the client side.
My workflow is as follows (a sketch of the client side follows the list):
Create an HTML5 video element and a MediaSource
Create a new WebSocket and request video data from the server
Transcode the video using FFmpeg and stream the result to stdout
Send the binary video data in chunks to the client
Append the binary video data to the SourceBuffer of the MediaSource linked to the HTML5 <video> element
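Roughly, the client side looks like this (a simplified sketch; the WebSocket URL and the codec string are placeholders for my actual setup):

// Minimal MSE client: append video chunks received over a WebSocket.
var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
  // The codec string must match what FFmpeg produces on the server.
  var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  var queue = [];

  sourceBuffer.addEventListener('updateend', function () {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  var ws = new WebSocket('wss://example.com/video');  // placeholder URL
  ws.binaryType = 'arraybuffer';
  ws.onmessage = function (event) {
    // appendBuffer() must not be called while the buffer is updating.
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});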
Hoping for any advice.
If needed, you can use the <video> tag. Look under "Provide Alternate Sources"; you can use an HTTP Live Streaming (HLS) source.
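As a sketch of how that could fit alongside the existing WebSocket path (the .m3u8 URL is a placeholder, and startWebSocketStreaming() stands in for the existing MSE client code; the server would need to publish an HLS playlist in addition to the WebSocket stream):

// Use native HLS playback where MSE is unavailable (e.g. Safari on iOS).
var video = document.querySelector('video');
if (window.MediaSource && MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"')) {
  startWebSocketStreaming(video);  // hypothetical helper wrapping the MSE + WebSocket path
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = 'https://example.com/live/stream.m3u8';  // placeholder HLS playlist URL
  video.play();
}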
I'm struggling to figure out how to implement live streaming of an audio file through Azure Media Services. What I'm trying to do is have a single, static audio file that live-streams and repeats when it reaches the end of the file.
The idea is a radio-station-type experience: when the user starts listening, the audio begins playing at wherever the file currently is in the live stream.
I have very limited knowledge of codecs, streaming types, and encoding. That said, I was able to upload my MP3 file to Azure Media Services, encode it using "AAC Good Quality Audio", and play the audio clip. However, I want to enable streaming to get the experience I described above.
The last piece of this will be delivered through a responsive website, so I would like to enable the stream using HTML5 so it's playable on all devices that support it (desktop, mobile, tablet, etc.). Is there an HTML5/JavaScript player that can do this? Flash/Silverlight is not an option since it won't render on mobile or tablets.
If I can provide any further information please let me know. Most of the articles I see about live streaming are about video, and I'm struggling to find how to do this with audio. Any help would be greatly appreciated.
Thanks!
There is no such thing as live-streaming a single audio file. Live streaming implies a live event: something that is happening and is being streamed while it happens. It doesn't matter whether it is audio, audio + video, or video only.
Using only Azure Media Services you cannot achieve this goal. You need a process that plays the media on repeat and streams it to a live streaming channel of Azure Media Services.
But that would be a rather expensive exercise! For your needs, a more cost-effective approach would be to use a streaming server such as Icecast (http://icecast.org/) on a Linux VM.
You can also send one file, and Azure can transcode it to various formats for you. I have a full list of tutorials I wrote on Azure Media Services:
Intro to HTML5 video
Intro to Azure Media Services, AES, and PlayReady DRM
Live streaming HTML5 video using Azure Media Services
Using Azure Blob Storage to store & serve your audio and video files
Use this Azure Media Player for streaming Media Services video to all devices
Uploading video to Azure Media Services
In terms of the encoding, Azure can do that for you. Give it one file type, and it can create multiple copies in various formats for you. I have a short post and a 10-minute video on how to do that.
There is also an Azure Media Player, and its selling point is that it adapts the video stream to the device it detects it is running on, which pairs nicely with the format conversions listed above. It saves you the trouble of writing the fallback conditions yourself (e.g. running on an iOS device, so use HLS).
You can use any of this for audio as well. Your best bet is to set the loop attribute to true on the player; both the HTML5 <audio> and <video> elements support it.
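For example, a minimal sketch with the plain HTML5 audio element (the stream URL is a placeholder for your published Azure endpoint):

// Loop a single audio file so playback restarts when it reaches the end.
var audio = new Audio('https://example.streaming.mediaservices.windows.net/audio.mp3');  // placeholder URL
audio.loop = true;
audio.play();  // call from a user gesture if the browser blocks autoplay

Note that this loops the file per listener from the beginning; it does not by itself give the shared "tune in mid-stream" behaviour described in the question.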
Let me know if you need more info.
How can I capture video and store it in H.264 format using a web browser, without Flash?
From my analysis, HTML5 can be used to access the camera and microphone without any additional plug-ins, as shown in this link: http://www.html5rocks.com/en/tutorials/getusermedia/intro/
However, I don't know whether it is possible to store the video in H.264 format.
I would prefer not to install any additional plugins, but if it's not possible without them, I would accept that. I can also use Java applets if needed.
Is HTML5 the only option, or is it possible with JScript itself? I also hope that FFmpeg can help me.
Three years later, it is now possible to record H.264 video in the browser with Chrome 52 (July 20, 2016) and the MediaRecorder API, without plugins, Java or Flash.
Chrome 52 is the first version to support both the MediaRecorder API and H.264 for video encoding. Chrome 49, 50 and 51 only supported VP8 and VP9 as video codecs, and Chrome 48 and earlier did not support the MediaRecorder API at all.
The audio codec used is still Opus at 48 kHz and the container is .webm, so you'll still have to pass the file through FFmpeg if you want wide browser/device support.
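For illustration, a minimal recording sketch that asks for H.264-in-WebM when the browser supports it (the 5-second duration and the handling of the result are just placeholders):

// Ask for camera + microphone, then record to H.264-in-WebM if supported.
navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(function (stream) {
  var mimeType = 'video/webm;codecs=h264';
  var options = MediaRecorder.isTypeSupported(mimeType) ? { mimeType: mimeType } : {};
  var recorder = new MediaRecorder(stream, options);
  var chunks = [];
  recorder.ondataavailable = function (e) { chunks.push(e.data); };
  recorder.onstop = function () {
    var blob = new Blob(chunks, { type: recorder.mimeType });
    // Upload the blob or offer it as a download here.
  };
  recorder.start();
  setTimeout(function () { recorder.stop(); }, 5000);  // record 5 seconds for the demo
});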
This article (co-written by me) covers the MediaRecorder API in detail across Firefox and Chrome, and there is a MediaRecorder API demo that supports H.264 encoding on Chrome 52+, with the associated code available as a GitHub project.
Disclaimer: I work at Pipe where we handle video recording.
Not yet, but it looks like it's coming.
There is a draft specification for MediaStream Recording, but it has not been implemented in any browser yet. There is a ticket to build it into Chrome that you can track.
Someone has built a JavaScript library to record video into WebM, though it doesn't handle audio. It uses the browser's built-in ability to save a WebP image to do the encoding. In theory, you could write an H.264 encoder in JavaScript, but it would be very slow and quite difficult to write.
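For context, the building block such a library relies on is the canvas WebP export, roughly like this (a sketch; 'image/webp' output from toDataURL() is Chrome-specific, and muxing the captured frames into a WebM file is the part the library does for you):

// Grab frames from a playing <video> element as WebP images.
var video = document.querySelector('video');
var canvas = document.createElement('canvas');
canvas.width = video.videoWidth;
canvas.height = video.videoHeight;
var ctx = canvas.getContext('2d');
var frames = [];
setInterval(function () {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  frames.push(canvas.toDataURL('image/webp'));  // Chrome can encode WebP here
}, 1000 / 15);  // roughly 15 frames per second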
I can't find a way to play back audio in any recent browser using JavaScript, especially if I want to be able to distribute an audio encoder to create the audio files (which rules out the MP3 format). The audio files could be either on the server or stored locally. The issues are:
Using the HTML5 audio tag with a Flash fallback library (such as jPlayer, SoundManager 2 or Projekktor) is easy, but the only format that would work everywhere is MP3.
I could use Speex/FLV, but it would only work with Flash, which rules out the iPhone/iPad.
A plain WAV file would not work in IE 8, 7 or 6.
A reasonable list of recent browsers to support is: IE 8-9, Firefox 4, Chrome 11 and Safari 4, including on the iPhone/iPad.
The only solutions I can see are to implement a WAV file player in Flash, as suggested here (but I would have to learn ActionScript), or to encode the files in a free format, upload them, and re-encode them to MP3 on the server (but that means a lot of server CPU load).
Do you know of any solution without these drawbacks?