I'm making a Chrome app and I'd like to have custom controls for video playback, but I'm having some difficulties with the mute button. Most of the videos that will be played in the app are silent, so I'd like to be able to disable the button when there is no audio track, just like Chrome does with its default controls.
I tried using the volume value, but it returns 1 even though there's no audio track. Checking whether the video is muted didn't work either.
Here's a snippet.
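A hypothetical sketch of the kind of checks that fail (neither property says anything about track presence):
var video = document.querySelector('video');
console.log(video.volume); // 1, even when there's no audio track
console.log(video.muted);  // false unless muted explicitly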
Any suggestions?
A shorter function, based on upuoth's answer and extended to support IE10+:
function hasAudio(video) {
  return video.mozHasAudio ||                               // Firefox
    Boolean(video.webkitAudioDecodedByteCount) ||           // Chrome/Safari (WebKit)
    Boolean(video.audioTracks && video.audioTracks.length); // IE10+ / the spec'd API
}
Usage:
var video = document.querySelector('video');
if (hasAudio(video)) {
  console.log("video has audio");
} else {
  console.log("video doesn't have audio");
}
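Note that these properties are only populated once the video has loaded data, so run the check from a loadeddata handler before wiring up the control. A minimal sketch (muteButton is a hypothetical reference to your custom control):
video.addEventListener('loadeddata', function () {
  // disable the mute button for silent videos, like Chrome's default controls do
  muteButton.disabled = !hasAudio(video);
});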
At some point, browsers might start implementing the audioTracks property. For now, you can use webkitAudioDecodedByteCount for WebKit and mozHasAudio for Firefox.
document.getElementById("video").addEventListener("loadeddata", function() {
if (typeof this.webkitAudioDecodedByteCount !== "undefined") {
// non-zero if video has audio track
if (this.webkitAudioDecodedByteCount > 0)
console.log("video has audio");
else
console.log("video doesn't have audio");
}
else if (typeof this.mozHasAudio !== "undefined") {
// true if video has audio track
if (this.mozHasAudio)
console.log("video has audio");
else
console.log("video doesn't have audio");
}
else
console.log("can't tell if video has audio");
});
For some reason, fregante's hasAudio function stopped working in Chrome at some point, even after waiting for the "loadeddata" and "loadedmetadata" events, and even the "canplaythrough" event. It may have something to do with the video format I'm using (WebM). In any case, I solved it by playing the video for a short amount of time:
// after waiting for the "canplaythrough" event:
hasAudio(video); // false
video.play();
await new Promise(r => setTimeout(r, 1000));
video.pause();
hasAudio(video); // true
There are different ways to check whether a video file has audio or not; one of them is to use the mozHasAudio, webkitAudioDecodedByteCount, and audioTracks?.length properties of the video element. Clean and simple:
const video = component.node.querySelector("video");
video.onloadeddata = function() {
if ((typeof video.mozHasAudio !== "undefined" && video.mozHasAudio) ||
(typeof video.webkitAudioDecodedByteCount !== "undefined" && video.webkitAudioDecodedByteCount > 0) ||
Boolean(video.audioTracks?.length)) {
console.log("This video has audio tracks.");
} else {
console.log("This video has no audio tracks.");
}
};
I am trying to implement a video on/off toggle for a WebRTC application in React. So far I am able to stop the video using
userStream.getVideoTracks()[0].stop()
but I can't seem to find any function to restart the video track.
I have tried the .enabled property:
userStream.getVideoTracks()[0].enabled = !userStream.getVideoTracks()[0].enabled
but using this still leaves the webcam light on, which is undesirable, even though it gets the functionality working;
on the other hand, userStream.getVideoTracks()[0].stop() turns off the light, but I am not able to start it back.
Is there any way to achieve this without creating a new stream?
When you use track.stop() you can't reuse the track; you'll have to create a new one.
The track.enabled property should normally give you the functionality that you're looking for, disabling the camera indicator when the track is disabled, because as the official docs state:
If the MediaStreamTrack represents the video input from a camera, disabling the track by setting enabled to false also updates device activity indicators to show that the camera is not currently recording or streaming. For example, the green "in use" light next to the camera in iMac and MacBook computers turns off while the track is muted in this way.
https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamTrack/enabled
If your camera indicator is still on, it could be that another track is still using your camera, or it could be something browser-version related.
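A minimal sketch of both options (userStream is the stream from the question; treat this as an outline, not a drop-in):
// toggle without releasing the camera (indicator behavior is browser-dependent):
var track = userStream.getVideoTracks()[0];
track.enabled = !track.enabled;

// if you did call track.stop(), request a fresh track and swap it in:
async function restartVideo(userStream) {
  const fresh = await navigator.mediaDevices.getUserMedia({ video: true });
  const newTrack = fresh.getVideoTracks()[0];
  userStream.removeTrack(userStream.getVideoTracks()[0]);
  userStream.addTrack(newTrack);
  return newTrack; // hand this to RTCRtpSender.replaceTrack() for active peer connections
}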
The best way to do this is to replace the tracks:
function replaceTracks(elementId, newStream, localStream, peerConnection) {
  detachMediaStream(elementId);
  newStream.getTracks().forEach(function (track) {
    localStream.addTrack(track);
  });
  attachMediaStream(elementId, newStream);
  // optionally, if you have active peer connections:
  _replaceTracksForPeer(peerConnection);

  function _replaceTracksForPeer(peer) {
    console.log(peer)
    peer.getSenders().map(function (sender) {
      sender.replaceTrack(newStream.getTracks().find(function (track) {
        return track.kind === sender.track.kind;
      }));
    });
  }

  function attachMediaStream(id, stream) {
    var elem = document.getElementById(id);
    if (elem) {
      if (typeof elem.srcObject === 'object') {
        elem.srcObject = stream;
      } else {
        elem.src = window.URL.createObjectURL(stream); // legacy fallback for old browsers
      }
      elem.onloadedmetadata = function (e) {
        elem.play();
      };
    } else {
      throw new Error('Unable to attach media stream');
    }
  }

  function detachMediaStream(id) {
    var elem = document.getElementById(id);
    if (elem) {
      elem.pause();
      if (typeof elem.srcObject === 'object') {
        elem.srcObject = null;
      } else {
        elem.src = '';
      }
    }
  }
}
I opened a webcam by using the following JavaScript code:
const stream = await navigator.mediaDevices.getUserMedia({ /* ... */ });
Is there any JavaScript code to stop or close the webcam?
Since this answer was originally posted, the browser API has changed.
.stop() is no longer available on the stream that gets passed to the callback.
The developer now has to access the tracks that make up the stream (audio or video) and stop each of them individually.
More info here: https://developers.google.com/web/updates/2015/07/mediastream-deprecations?hl=en#stop-ended-and-active
Example (from the link above):
stream.getTracks().forEach(function(track) {
track.stop();
});
Browser support may differ.
Previously, navigator.getUserMedia provided you with a stream in the success callback, and you could call .stop() on that stream to stop the recording (at least in Chrome; Firefox didn't seem to like it).
Use any of these functions:
// stop both mic and camera
function stopBothVideoAndAudio(stream) {
  stream.getTracks().forEach(function (track) {
    if (track.readyState == 'live') {
      track.stop();
    }
  });
}

// stop only camera
function stopVideoOnly(stream) {
  stream.getTracks().forEach(function (track) {
    if (track.readyState == 'live' && track.kind === 'video') {
      track.stop();
    }
  });
}

// stop only mic
function stopAudioOnly(stream) {
  stream.getTracks().forEach(function (track) {
    if (track.readyState == 'live' && track.kind === 'audio') {
      track.stop();
    }
  });
}
Don't use stream.stop(); it's deprecated. See:
MediaStream Deprecations
Use stream.getTracks().forEach(track => track.stop()) instead.
FF, Chrome and Opera have started exposing getUserMedia via navigator.mediaDevices as standard now (might change :)
online demo
navigator.mediaDevices.getUserMedia({ audio: true, video: true })
  .then(stream => {
    window.localStream = stream;
  })
  .catch((err) => {
    console.log(err);
  });

// later you can do below
// stop both video and audio
localStream.getTracks().forEach((track) => {
  track.stop();
});

// stop only audio
localStream.getAudioTracks()[0].stop();

// stop only video
localStream.getVideoTracks()[0].stop();
Suppose we have a stream playing in a video tag with id "video" (<video id="video"></video>). Then we should have the following code:
var videoEl = document.getElementById('video');
// now get the stream
var stream = videoEl.srcObject;
// now get all tracks
var tracks = stream.getTracks();
// now close each track with a forEach loop
tracks.forEach(function (track) {
  // stopping every track
  track.stop();
});
// assign null to srcObject of video
videoEl.srcObject = null;
Starting Webcam Video with different browsers
For Opera 12
window.navigator.getUserMedia(param, function (stream) {
  video.src = window.URL.createObjectURL(stream);
}, videoError);
For Firefox Nightly 18.0
window.navigator.mozGetUserMedia(param, function (stream) {
  video.mozSrcObject = stream;
}, videoError);
For Chrome 22
window.navigator.webkitGetUserMedia(param, function (stream) {
  video.src = window.webkitURL.createObjectURL(stream);
}, videoError);
Stopping Webcam Video with different browsers
For Opera 12
video.pause();
video.src = null;
For Firefox Nightly 18.0
video.pause();
video.mozSrcObject = null;
For Chrome 22
video.pause();
video.src = "";
With this, the webcam light goes off every time...
Try the method below:
var mediaStream = null;
navigator.getUserMedia(
  {
    audio: true,
    video: true
  },
  function (stream) {
    mediaStream = stream;
    mediaStream.stop = function () {
      this.getAudioTracks().forEach(function (track) {
        track.stop();
      });
      this.getVideoTracks().forEach(function (track) { // in case... :)
        track.stop();
      });
    };
    /*
     * Rest of your code.....
     * */
  });

/*
 * somewhere inside your code you call
 * */
mediaStream.stop();
You can end the stream directly using the stream object returned in the success handler to getUserMedia, e.g.
localMediaStream.stop()
video.src = "" or null would just remove the source from the video tag; it won't release the hardware.
Since you need the tracks to close the stream, and you need the stream object to get to the tracks, the code I have used (with the help of Muaz Khan's answer above) is as follows:
if (navigator.getUserMedia) {
  navigator.getUserMedia(constraints, function (stream) {
    videoEl.src = stream; // note: modern browsers use videoEl.srcObject = stream
    videoEl.play();
    document.getElementById('close').addEventListener('click', function () {
      stopStream(stream);
    });
  }, errBack);

  function stopStream(stream) {
    console.log('stop called');
    stream.getVideoTracks().forEach(function (track) {
      track.stop();
    });
  }
}
Of course this will close all the active video tracks. If you have multiple, you should select accordingly.
If .stop() is deprecated, then I don't think we should re-add it like #MuazKhan does. There's a reason why things get deprecated and should not be used anymore. Just create a helper function instead. Here is a more ES6 version:
function stopStream(stream) {
  for (let track of stream.getTracks()) {
    track.stop()
  }
}
You need to stop all tracks (from the webcam and the microphone):
localStream.getTracks().forEach(track => track.stop());
Start and Stop Web Camera (Update 2020, React ES6)
Start Web Camera
startWebCamera = () => {
  // start web camera
  if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
    // use webcam
    navigator.mediaDevices.getUserMedia({ video: true }).then(stream => {
      this.localStream = stream;
      this.video.srcObject = stream;
      this.video.play();
    });
  }
}
Stop Web Camera or Video playback in general
stopVideo = () => {
  this.video.pause();
  this.video.src = "";
  this.video.srcObject = null;
  // as per the new API, stop all streams
  if (this.localStream)
    this.localStream.getTracks().forEach(track => track.stop());
}
The stop function works even with regular video playback:
this.video.src = this.state.videoToTest;
this.video.play();
Using .stop() on the stream works on Chrome when connected via HTTP. It does not work when using SSL (HTTPS).
Please check this: https://jsfiddle.net/wazb1jks/3/
navigator.getUserMedia(mediaConstraints, function (stream) {
  window.streamReference = stream;
}, onMediaError);
Stop Recording
function stopStream() {
  if (!window.streamReference) return;

  window.streamReference.getAudioTracks().forEach(function (track) {
    track.stop();
  });
  window.streamReference.getVideoTracks().forEach(function (track) {
    track.stop();
  });

  window.streamReference = null;
}
The following code worked for me:
public vidOff() {
  let stream = this.video.nativeElement.srcObject;
  let tracks = stream.getTracks();
  tracks.forEach(function (track) {
    track.stop();
  });
  this.video.nativeElement.srcObject = null;
  this.video.nativeElement.pause(); // video elements have no stop() method
}
Keep a reference to the stream from the success handler:
var streamRef;
var handleVideo = function (stream) {
  streamRef = stream;
}
// this will stop both the video and audio tracks
streamRef.getTracks().forEach(function (track) {
  track.stop();
});
I am trying to upload video from users using the media capture interface recently introduced into JavaScript. Setting aside all the difficulties with browser compatibility, I can't even begin to understand the process of saving the video captured by the users.
I was thinking that I could somehow use Ajax to push the streamed video up to the server, but be that as it may, I'm not even sure whether I am approaching the problem appropriately.
I've included my code, which currently only streams under Chrome and Opera.
function hasUserMedia() {
  return !!(navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia ||
    navigator.msGetUserMedia);
}

if (hasUserMedia()) {
  var onFail = function (error) {
    alert("ERROR >> " + error);
  };

  var onPass = function (stream) {
    var video = document.querySelector('video');
    video.src = window.URL.createObjectURL(stream);
    video.onloadedmetadata = function (e) {
      //..what do I put here..?
    };
  }

  navigator.webkitGetUserMedia({ video: true, audio: true }, onPass, onFail);
} else {
  alert("ERROR >> USERMEDIA NOT SUPPORTED");
}

function saveVideo() {
  var connection = new XMLHttpRequest();
  connection.onreadystatechange = function () {
    if (connection.readyState == 4 && connection.status == 200) {
      alert("Your streamed video has been saved..!");
    }
  }
  //..what do I type here..?
  connection.open("POST", "savevideo.php", true);
  connection.send();
}
Currently you can't open a WebRTC connection to a server (though someone may be working on it...). AFAIK the only method is to take a screenshot of each frame by capturing the video to a canvas, then sending each frame to the server and compiling the video there.
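A rough sketch of that canvas approach (the endpoint name is borrowed from the question; the frame rate and image format are arbitrary choices):
var video = document.querySelector('video');
var canvas = document.createElement('canvas');

function sendFrame() {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0);
  canvas.toBlob(function (blob) {
    var connection = new XMLHttpRequest();
    connection.open("POST", "savevideo.php", true);
    connection.send(blob); // the server has to reassemble the frames into a video
  }, 'image/jpeg');
}

setInterval(sendFrame, 100); // roughly 10 fps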
Check out:
https://webrtc-experiment.appspot.com/RecordRTC/
Also may be helpful:
http://www.html5rocks.com/en/tutorials/getusermedia/intro/
I'm trying to accomplish a simple doodle-like behaviour, where an mp3/ogg sound rings on click, using the HTML <audio> tag. It is supposed to work under Firefox and Safari, and Safari on iPad is very desirable.
I've tried many approaches and have come down to this:
HTML
<span id="play-blue-note" class="play blue" ></span>
<span id="play-green-note" class="play green" ></span>
<audio id="blue-note" style="display:none" controls preload="auto" autobuffer>
<source src="blue.mp3" />
<source src="blue.ogg" />
<!-- now include flash fall back -->
</audio>
<audio id="green-note" style="display:none" controls preload="auto" autobuffer>
<source src="green.mp3" />
<source src="green.ogg" />
</audio>
JS
function addSource(elem, path) {
  $('<source>').attr('src', path).appendTo(elem);
}

$(document).ready(function () {
  $('body').delegate('.play', 'click touchstart', function () {
    var clicked = $(this).attr('id').split('-')[1];
    $('#' + clicked + '-note').get(0).play();
  });
});
This seems to work great under Firefox, but Safari seems to have a delay whenever you click, even when you click several times and the audio file has loaded. On Safari on iPad it behaves almost unpredictably.
Also, Safari's performance seems to improve when I test locally; I'm guessing Safari is downloading the file each time. Is this possible? How can I avoid this?
Thanks!
On desktop Safari, adding AudioContext fixes the issue:
const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();
I found out by accident, so I have no idea why it works, but this removed the delay on my app.
I just answered another iOS/<audio> question a few minutes ago. Seems to apply here as well:
Preloading <audio> and <video> on iOS devices is disabled to save bandwidth.
In Safari on iOS (for all devices, including iPad), where the user may
be on a cellular network and be charged per data unit, preload and
autoplay are disabled. No data is loaded until the user initiates it.
Source: Safari Developer Library
The problem with Safari is that it makes a request every time the audio file is played. You can try creating an HTML5 cache manifest. Unfortunately, my experience has been that you can only add one audio file to the cache at a time. A workaround might be to merge all your audio files sequentially into a single audio file and start playing at a specific position depending on the sound needed. You can create an interval to track the current play position and pause it once it has reached a certain timestamp, as sketched below.
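A sketch of that audio-sprite idea (the element id and the offsets here are made up; you'd measure the real offsets from your merged file):
var audio = document.getElementById('notes'); // one merged audio file
var sprite = {
  blue:  { start: 0.0, length: 1.2 },
  green: { start: 1.5, length: 0.9 }
};

function playSprite(name) {
  var clip = sprite[name];
  audio.currentTime = clip.start;
  audio.play();
  // poll the play position and pause once the clip's window has elapsed
  var timer = setInterval(function () {
    if (audio.currentTime >= clip.start + clip.length) {
      audio.pause();
      clearInterval(timer);
    }
  }, 10);
}

playSprite('blue');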
Read more about creating an HTML5 cache manifest here:
http://www.html5rocks.com/en/tutorials/appcache/beginner/
http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html
Hope it helps!
HTML5 Audio Delay on Safari iOS (<audio> Element vs AudioContext)
Yes, Safari on iOS has an audio delay when using the native <audio> element; however, this can be overcome by using AudioContext.
My code snippet is based on what I learnt from https://lowlag.alienbill.com/
Please test the functionality on your own iOS device (I tested in iOS 12)
https://fiddle.jshell.net/eLya8fxb/51/show/
Snippet from JS Fiddle
https://jsfiddle.net/eLya8fxb/51/
// Requires jQuery
// Adding:
// Strip down lowLag.js so it only supports audioContext (So no IE11 support (only Edge))
// Add "loop" monkey patch needed for looping audio (my primary usage)
// Add single audio channel - to avoid overlapping audio playback
// Original source: https://lowlag.alienbill.com/lowLag.js
if (!window.console) console = {
log: function() {}
};
var lowLag = new function() {
this.someVariable = undefined;
this.showNeedInit = function() {
lowLag.msg("lowLag: you must call lowLag.init() first!");
}
this.load = this.showNeedInit;
this.play = this.showNeedInit;
this.pause = this.showNeedInit;
this.stop = this.showNeedInit;
this.switch = this.showNeedInit;
this.change = this.showNeedInit;
this.audioContext = undefined;
this.audioContextPendingRequest = {};
this.audioBuffers = {};
this.audioBufferSources = {};
this.currentTag = undefined;
this.currentPlayingTag = undefined;
this.init = function() {
this.msg("init audioContext");
this.load = this.loadSoundAudioContext;
this.play = this.playSoundAudioContext;
this.pause = this.pauseSoundAudioContext;
this.stop = this.stopSoundAudioContext;
this.switch = this.switchSoundAudioContext;
this.change = this.changeSoundAudioContext;
if (!this.audioContext) {
this.audioContext = new(window.AudioContext || window.webkitAudioContext)();
}
}
//we'll use the tag they hand us, or else the url as the tag if it's a single tag,
//or the first url
this.getTagFromURL = function(url, tag) {
if (tag != undefined) return tag;
return lowLag.getSingleURL(url);
}
this.getSingleURL = function(urls) {
if (typeof(urls) == "string") return urls;
return urls[0];
}
//coerce to be an array
this.getURLArray = function(urls) {
if (typeof(urls) == "string") return [urls];
return urls;
}
this.loadSoundAudioContext = function(urls, tag) {
var url = lowLag.getSingleURL(urls);
tag = lowLag.getTagFromURL(urls, tag);
lowLag.msg('webkit/chrome audio loading ' + url + ' as tag ' + tag);
var request = new XMLHttpRequest();
request.open('GET', url, true);
request.responseType = 'arraybuffer';
// Decode asynchronously
request.onload = function() {
// if you want "successLoadAudioFile" to only be called one time, you could try just using Promises (the newer return value for decodeAudioData)
// Ref: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData
//Older callback syntax:
//baseAudioContext.decodeAudioData(ArrayBuffer, successCallback, errorCallback);
//Newer promise-based syntax:
//Promise<decodedData> baseAudioContext.decodeAudioData(ArrayBuffer);
// ... however you might want to use a pollfil for browsers that support Promises, but does not yet support decodeAudioData returning a Promise.
// Ref: https://github.com/mohayonao/promise-decode-audio-data
// Ref: https://caniuse.com/#search=Promise
// var retVal = lowLag.audioContext.decodeAudioData(request.response);
// Note: "successLoadAudioFile" is called twice. Once for legacy syntax (success callback), and once for newer syntax (Promise)
var retVal = lowLag.audioContext.decodeAudioData(request.response, successLoadAudioFile, errorLoadAudioFile);
//Newer versions of audioContext return a promise, which could throw a DOMException
if (retVal && typeof retVal.then == 'function') {
retVal.then(successLoadAudioFile).catch(function(e) {
errorLoadAudioFile(e);
urls.shift(); //remove the first url from the array
if (urls.length > 0) {
lowLag.loadSoundAudioContext(urls, tag); //try the next url
}
});
}
};
request.send();
function successLoadAudioFile(buffer) {
lowLag.audioBuffers[tag] = buffer;
if (lowLag.audioContextPendingRequest[tag]) { //a request might have come in, try playing it now
lowLag.playSoundAudioContext(tag);
}
}
function errorLoadAudioFile(e) {
lowLag.msg("Error loading webkit/chrome audio: " + e);
}
}
this.playSoundAudioContext = function(tag) {
var context = lowLag.audioContext;
// if some audio is currently active and hasn't been switched, or you are explicitly asking to play audio that is already active... then see if it needs to be unpaused
// ... if you've switch audio, or are explicitly asking to play new audio (that is not the currently active audio) then skip trying to unpause the audio
if ((lowLag.currentPlayingTag && lowLag.currentTag && lowLag.currentPlayingTag === lowLag.currentTag) || (tag && lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag)) {
// find currently paused audio (suspended) and unpause it (resume)
if (context !== undefined) {
// ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
if (context.state === 'suspended') {
context.resume().then(function() {
lowLag.msg("playSoundAudioContext resume " + lowLag.currentPlayingTag);
return;
}).catch(function(e) {
lowLag.msg("playSoundAudioContext resume error for " + lowLag.currentPlayingTag + ". Error: " + e);
});
return;
}
}
}
if (tag === undefined) {
tag = lowLag.currentTag;
}
if (lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag) {
// ignore request to play same sound a second time - it's already playing
lowLag.msg("playSoundAudioContext already playing " + tag);
return;
} else {
lowLag.msg("playSoundAudioContext " + tag);
}
var buffer = lowLag.audioBuffers[tag];
if (buffer === undefined) { //possibly not loaded; put in a request to play onload
lowLag.audioContextPendingRequest[tag] = true;
lowLag.msg("playSoundAudioContext pending request " + tag);
return;
}
// need to create a new AudioBufferSourceNode every time...
// you can't call start() on an AudioBufferSourceNode more than once. They're one-time-use only.
var source;
source = context.createBufferSource(); // creates a sound source
source.buffer = buffer; // tell the source which sound to play
source.connect(context.destination); // connect the source to the context's destination (the speakers)
source.loop = true;
lowLag.audioBufferSources[tag] = source;
// find current playing audio and stop it
var sourceOld = lowLag.currentPlayingTag ? lowLag.audioBufferSources[lowLag.currentPlayingTag] : undefined;
if (sourceOld !== undefined) {
if (typeof(sourceOld.noteOff) == "function") {
sourceOld.noteOff(0);
} else {
sourceOld.stop();
}
lowLag.msg("playSoundAudioContext stopped " + lowLag.currentPlayingTag);
lowLag.audioBufferSources[lowLag.currentPlayingTag] = undefined;
lowLag.currentPlayingTag = undefined;
}
// play the new source audio
if (typeof(source.noteOn) == "function") {
source.noteOn(0);
} else {
source.start();
}
lowLag.currentTag = tag;
lowLag.currentPlayingTag = tag;
if (context.state === 'running') {
lowLag.msg("playSoundAudioContext started " + tag);
} else if (context.state === 'suspended') {
/// if the audio context is in a suspended state then unpause (resume)
context.resume().then(function() {
lowLag.msg("playSoundAudioContext started and then resumed " + tag);
}).catch(function(e) {
lowLag.msg("playSoundAudioContext started and then had a resuming error for " + tag + ". Error: " + e);
});
} else if (context.state === 'closed') {
// ignore request to pause sound - it's already closed
lowLag.msg("playSoundAudioContext failed to start, context closed for " + tag);
} else {
lowLag.msg("playSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
}
}
this.pauseSoundAudioContext = function() {
// not passing in a "tag" parameter because we are playing all audio in one channel
var tag = lowLag.currentPlayingTag;
var context = lowLag.audioContext;
if (tag === undefined) {
// ignore request to pause sound as nothing is currently playing
lowLag.msg("pauseSoundAudioContext nothing to pause");
return;
}
// find currently playing (running) audio and pause it (suspend)
if (context !== undefined) {
// ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
if (context.state === 'running') {
lowLag.msg("pauseSoundAudioContext " + tag);
context.suspend().then(function() {
lowLag.msg("pauseSoundAudioContext suspended " + tag);
}).catch(function(e) {
lowLag.msg("pauseSoundAudioContext suspend error for " + tag + ". Error: " + e);
});
} else if (context.state === 'suspended') {
// ignore request to pause sound - it's already suspended
lowLag.msg("pauseSoundAudioContext already suspended " + tag);
} else if (context.state === 'closed') {
// ignore request to pause sound - it's already closed
lowLag.msg("pauseSoundAudioContext already closed " + tag);
} else {
lowLag.msg("pauseSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
}
}
}
this.stopSoundAudioContext = function() {
// not passing in a "tag" parameter because we are playing all audio in one channel
var tag = lowLag.currentPlayingTag;
if (tag === undefined) {
// ignore request to stop sound as nothing is currently playing
lowLag.msg("stopSoundAudioContext nothing to stop");
return;
} else {
lowLag.msg("stopSoundAudioContext " + tag);
}
// find current playing audio and stop it
var source = lowLag.audioBufferSources[tag];
if (source !== undefined) {
if (typeof(source.noteOff) == "function") {
source.noteOff(0);
} else {
source.stop();
}
lowLag.msg("stopSoundAudioContext stopped " + tag);
lowLag.audioBufferSources[tag] = undefined;
lowLag.currentPlayingTag = undefined;
}
}
this.switchSoundAudioContext = function(autoplay) {
lowLag.msg("switchSoundAudioContext " + (autoplay ? 'and autoplay' : 'and do not autoplay'));
if (lowLag.currentTag && lowLag.currentTag == 'audio1') {
lowLag.currentTag = 'audio2';
} else {
lowLag.currentTag = 'audio1';
}
if (autoplay) {
lowLag.playSoundAudioContext();
}
}
this.changeSoundAudioContext = function(tag, autoplay) {
lowLag.msg("changeSoundAudioContext to tag " + tag + " " + (autoplay ? 'and autoplay' : 'and do not autoplay'));
if(tag === undefined) {
lowLag.msg("changeSoundAudioContext tag is undefined");
return;
}
lowLag.currentTag = tag;
if (autoplay) {
lowLag.playSoundAudioContext();
}
}
this.msg = function(m) {
m = "-- lowLag " + m;
console.log(m);
}
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script>
<script>
// AudioContext
$(document).ready(function() {
lowLag.init();
lowLag.load(['https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3'], 'audio1');
lowLag.load(['https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3'], 'audio2');
// starts with audio1
lowLag.changeSoundAudioContext('audio1', false);
});
// ----------------
// Audio Element
$(document).ready(function() {
var $audioElement = $('#audioElement');
var audioEl = $audioElement[0];
var audioSources = {
"audio1": "https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3",
"audio2": "https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3"
};
playAudioElement = function() {
audioEl.play();
}
pauseAudioElement = function() {
audioEl.pause();
}
stopAudioElement = function() {
audioEl.pause();
audioEl.currentTime = 0;
}
switchAudioElement = function(autoplay) {
var source = $audioElement.attr('data-source');
if (source && source == 'audio1') {
$audioElement.attr('src', audioSources.audio2);
$audioElement.attr('data-source', 'audio2');
} else {
$audioElement.attr('src', audioSources.audio1);
$audioElement.attr('data-source', 'audio1');
}
if (autoplay) {
audioEl.play();
}
}
changeAudioElement = function(tag, autoplay) {
var source = $audioElement.attr('data-source');
if(tag === undefined || audioSources[tag] === undefined) {
return;
}
$audioElement.attr('src', audioSources[tag]);
$audioElement.attr('data-source', tag);
if (autoplay) {
audioEl.play();
}
}
changeAudioElement('audio1', false); // starts with audio1
});
</script>
<h1>
AudioContext (api)
</h1>
<button onClick="lowLag.play();">Play</button>
<button onClick="lowLag.pause();">Pause</button>
<button onClick="lowLag.stop();">Stop</button>
<button onClick="lowLag.switch(true);">Swtich</button>
<button onClick="lowLag.change('audio1', true);">Play 1</button>
<button onClick="lowLag.change('audio2', true);">Play 2</button>
<hr>
<h1>
Audio Element (api)
</h1>
<audio id="audioElement" controls loop preload="auto" src="">
</audio>
<br>
<button onClick="playAudioElement();">Play</button>
<button onClick="pauseAudioElement();">Pause</button>
<button onClick="stopAudioElement();">Stop</button>
<button onClick="switchAudioElement(true);">Switch</button>
<button onClick="changeAudioElement('audio1', true);">Play 1</button>
<button onClick="changeAudioElement('audio2', true);">Play 2</button>
Apple decided (to save cellular bandwidth) not to preload <audio> and <video> HTML elements.
From the Safari Developer Library:
In Safari on iOS (for all devices, including iPad), where the user may
be on a cellular network and be charged per data unit, preload and
autoplay are disabled. No data is loaded until the user initiates it.
This means the JavaScript play() and load() methods are also inactive
until the user initiates playback, unless the play() or load() method
is triggered by user action. In other words, a user-initiated Play
button works, but an onLoad="play()" event does not.
This plays the movie: <input type="button" value="Play" onClick="document.myMovie.play()">
This does nothing on iOS: <body onLoad="document.myMovie.play()">
I don't think you can bypass this restriction, but you might be able to.
Remember: Google is your best friend.
Update: After some experimenting, I found a way to play the <audio> with JavaScript:
var vid = document.createElement("iframe");
vid.setAttribute('src', "http://yoursite.com/yourvideooraudio.mp4"); // replace with actual source
vid.setAttribute('width', '1px');
vid.setAttribute('height', '1px');
vid.setAttribute('scrolling', 'no');
vid.style.border = "0px";
document.body.appendChild(vid);
Note: I only tried with <audio>.
Update 2: jsFiddle here. Seems to work.
Unfortunately, the only way to make it work properly in Safari is to use the Web Audio API, or third-party libs that handle this. Check the source code here (it's not minified):
https://drums-set-js.herokuapp.com/index.html
https://drums-set-js.herokuapp.com/app.js
Same issue. I tried to preload the file in different ways. Finally I wrapped the animation logic in the "playing" callback, so that logic should only run once the file has loaded and playback has started; but as a result I can see that the animation logic has already started while the audio plays with around a 2-second delay.
It breaks my mind: how can there be a delay if the audio has already fired the "playing" callback?
AudioContext resolved my issue.
The simplest example I found is here:
https://developer.mozilla.org/en-US/docs/Web/API/Body/arrayBuffer
getData prepares your audio file; then you can play it with source.start(0).
That link misses how to get audioCtx; you can copy it from here:
let audioCtx = new (window.AudioContext || window.webkitAudioContext)();
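Put together, a minimal version of that MDN approach looks something like this (the file name is a placeholder):
async function playSound(url) {
  const response = await fetch(url);                   // load the file
  const data = await response.arrayBuffer();           // raw bytes
  const buffer = await audioCtx.decodeAudioData(data); // decode to PCM
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start(0); // plays immediately, without the <audio> element latency
}

playSound('note.mp3'); // placeholder file name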
Your audio files are loaded once, then cached. Playing the sounds repeatedly, even after a page refresh, did not cause further HTTP requests in Safari.
I just had a look at one of your sounds in an audio editor: there was a small amount of silence at the beginning of the file. This will manifest as latency.
Is the Web Audio API a viable option for you?
I am having this same issue. What is odd is that I am preloading the file. But with WiFi it plays fine, while on phone data there is a long delay before starting. I thought that had something to do with load speeds, but I do not start playing my scene until all images and the audio file are loaded. Any suggestions would be great. (I know this isn't an answer, but I thought it better than making a dup post.)
I would simply create an <audio autoplay /> DOM element on click; this works in all major browsers. There is no need to handle events and trigger play manually.
If you want to respond to audio status changes manually, I would suggest listening for the play event instead of loadeddata; its behavior is more consistent across browsers.
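For example (the element id and file name are taken from the question's markup):
document.getElementById('play-blue-note').addEventListener('click', function () {
  var el = document.createElement('audio');
  el.autoplay = true;   // starts as soon as it's in the DOM
  el.src = 'blue.mp3';  // source from the question
  document.body.appendChild(el);
});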
If you have a small/short audio file that doesn't require a lot of audio clarity, you can convert the audio file to base64 encoding.
This way the audio file is text based and avoids the latency of downloading it, since iOS downloads the audio pretty much at the moment it's played.
On one hand, it's nice what iOS does to prevent abuse. On the other hand, it's annoying when it gets in the way of legitimate usage.
Here's a base64 encoder for audio files.
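Once encoded, the clip can be embedded directly, for example (the base64 payload is truncated here; a real one is much longer):
var clip = new Audio('data:audio/mpeg;base64,//uQxAAA...'); // truncated placeholder payload
document.getElementById('play-blue-note').addEventListener('click', function () {
  clip.currentTime = 0; // rewind so repeated clicks replay from the start
  clip.play();
});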