How to play multiple audio files sequentially with the Ionic Media plugin - javascript

I am trying to play multiple audio files with the Ionic Media plugin (https://ionicframework.com/docs/native/media), but I am having a hard time making it work as a playlist without using a timeout function.
Here is what I have tried:
playOne(track: AudioFile): Promise<any> {
  return new Promise(async resolve => {
    const AudFile = await this.media.create(this.file.externalDataDirectory + track.trackUrl);
    await resolve(AudFile.play());
  });
}
Then to play all of them, I have this:
async playAll(tracks: AudioFile[]): Promise<any> {
  let player = (acc, track: AudioFile) => acc.then(() =>
    this.playOne(track)
  );
  tracks.reduce(player, Promise.resolve());
}
This way they all play at the same time. But if the playOne method is wrapped in a timeout function, the interval set on the timeout does space out the playlist, yet one track does not necessarily finish before the next starts, and sometimes it waits a long time before the subsequent file is played.
The timeout implementation looks like this:
playOne(track: AudioFile): Promise<any> {
  return new Promise(async resolve => {
    setTimeout(async () => {
      const AudFile = await this.media.create(this.file.externalDataDirectory + track.trackUrl);
      await resolve(AudFile.play());
    }, 3000);
  });
}
Digging into the Ionic wrapper of the plugin, the create method looks like this:
/**
 * Open a media file
 * @param src {string} A URI containing the audio content.
 * @return {MediaObject}
 */
Media.prototype.create = function (src) {
    var instance;
    if (checkAvailability(Media.getPluginRef(), null, Media.getPluginName()) === true) {
        // Creates a new media object
        instance = new (Media.getPlugin())(src);
    }
    return new MediaObject(instance);
};
Media.pluginName = "Media";
Media.repo = "https://github.com/apache/cordova-plugin-media";
Media.plugin = "cordova-plugin-media";
Media.pluginRef = "Media";
Media.platforms = ["Android", "Browser", "iOS", "Windows"];
Media = __decorate([
    Injectable()
], Media);
return Media;
}(IonicNativePlugin));
Any suggestion would be appreciated.

You may get it working by looping over your tracks and awaiting playOne on each track.
async playAll(tracks: AudioFile[]): Promise<any> {
  for (const track of tracks) {
    await this.playOne(track);
  }
}
If I'm not mistaken, the play function doesn't block until playing the audio file is finished, and it doesn't return a promise either. A workaround would be to use a setTimeout for the duration of the track:
playOne(track: AudioFile): Promise<any> {
  return new Promise(async (resolve, reject) => {
    const audFile = await this.media.create(this.file.externalDataDirectory + track.trackUrl);
    const duration = audFile.getDuration(); // duration in seconds
    audFile.play();
    setTimeout(
      () => {
        resolve();
      },
      duration * 1000 // setTimeout expects milliseconds
    );
  });
}
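One caveat, as far as I know: cordova-plugin-media reports -1 from getDuration() while the duration is still unknown, which is often the case before playback has started. A sketch of polling for a usable value (waitForDuration is a made-up helper, not part of the plugin):
// Poll until the plugin reports a real duration (assumes the -1 sentinel
// documented by cordova-plugin-media for an unknown duration).
function waitForDuration(audFile: MediaObject): Promise<number> {
  return new Promise(resolve => {
    const timer = setInterval(() => {
      const d = audFile.getDuration();
      if (d > 0) {
        clearInterval(timer); // stop polling once a real duration shows up
        resolve(d);
      }
    }, 100);
  });
}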

I eventually got it to work with a recursive function. This works as expected.
PlayAllList(i, tracks: AudioFile[]) {
  var self = this;
  this.Audiofile = this.media.create(this.file.externalDataDirectory + tracks[i].trackUrl);
  this.Audiofile.play();
  this.Audiofile.onSuccess.subscribe(() => {
    if ((i + 1) === tracks.length) {
      // last track: do nothing
    } else {
      self.PlayAllList(i + 1, tracks);
    }
  });
}
Then
this.PlayAllList(0,tracks)
If there is any improvement on this, I will appreciate it; one possible refinement is sketched below.
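For instance, a minimal sketch (same ionic-native Media calls as above) that wraps onSuccess in a Promise, so the plain for...of loop from the first answer also works:
// Resolve once the plugin reports that playback finished successfully,
// so tracks can simply be awaited one after another.
playOne(track: AudioFile): Promise<any> {
  return new Promise(resolve => {
    const audFile = this.media.create(this.file.externalDataDirectory + track.trackUrl);
    const sub = audFile.onSuccess.subscribe(() => {
      sub.unsubscribe(); // clean up this track's listener
      resolve();
    });
    audFile.play();
  });
}
async playAll(tracks: AudioFile[]): Promise<any> {
  for (const track of tracks) {
    await this.playOne(track); // waits until the current track finishes
  }
}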

I think you will be better off with the Web Audio API. I have used it before, and the possibilities are endless.
Apparently it can be used in Ionic without issues:
https://www.airpair.com/ionic-framework/posts/using-web-audio-api-for-precision-audio-in-ionic
I have used it on http://chordoracle.com to play multiple audio samples at the same time (up to 6 simultaneous samples for each string of the guitar). In this case I also alter their pitch to get different notes.
In order to play multiple samples, you just need to create multiple bufferSources:
https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBufferSource
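To make that concrete, here is a minimal sketch with plain Web Audio (no Ionic-specific APIs; the URLs are whatever your track list contains): decode each file into an AudioBuffer, create one bufferSource per play, and chain on the onended event.
const ctx = new AudioContext();
// Fetch and decode one file into an AudioBuffer.
async function loadBuffer(url) {
  const response = await fetch(url);
  return ctx.decodeAudioData(await response.arrayBuffer());
}
// Play the clips back to back: a bufferSource is single-use, so a new one
// is created per clip, and "onended" signals when to start the next.
async function playSequentially(urls) {
  for (const url of urls) {
    const buffer = await loadBuffer(url);
    await new Promise(resolve => {
      const source = ctx.createBufferSource();
      source.buffer = buffer;
      source.connect(ctx.destination);
      source.onended = resolve;
      source.start();
    });
  }
}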
Some links to get you started:
https://www.w3.org/TR/webaudio/
https://medium.com/better-programming/all-you-need-to-know-about-the-web-audio-api-3df170559378

Related

How to get total audio duration from multiple audio files

I am trying to get the total duration of audio from an array of audio paths.
Here is what the array would look like:
var sound_paths = ["1.mp3","2.mp3",...]
I have looked at this post, which helped:
how to get audio.duration value by a function
However, I do not know how to implement this over an array. The idea is that I want to loop over each audio file, get its duration, and add it to a "sum_duration" variable.
I cannot seem to figure out a way to do this with a for loop. I have tried Promises, which I am admittedly new at. (Note: the functions come from a class.)
getDuration(src, cb) {
  // takes a source audio file and a callback function
  var audio = new Audio();
  audio.addEventListener("loadedmetadata", () => {
    cb(audio.duration);
  });
  audio.src = src;
}
getAudioArrayDuration(aud_path_arr) {
  // takes in an array of audio paths, finds the total audio duration
  // in seconds
  return new Promise((resolve) => {
    var duration = 0;
    for (const aud_path in aud_path_arr) {
      var audio = new Audio();
      audio.src = aud_path;
      audio.onloadedmetadata = () => {
        console.log(audio.duration);
        duration += audio.duration;
      };
      resolve(duration);
    }
  });
}
However, this obviously does not work, and will just return the duration value of 0.
How can I loop over audio files and return the total audio duration of each file summed?
I think, in this case, working with promises is the right approach; time to get used to them ;) Try to remember: a promise will fulfill your wish in the near future. What makes your problem harder is that you have an array of files to check, and each will need to be covered by its own Promise, just so that your program can know when all of them have been fulfilled.
I always call my 'promised' getters 'fetch...'; that way I know they return a promise instead of a direct value.
function fetchDuration(path) {
  return new Promise((resolve) => {
    const audio = new Audio();
    audio.src = path;
    audio.addEventListener(
      'loadedmetadata',
      () => {
        // To keep a promise maintainable, only do 1
        // asynchronous activity for each promise you make
        resolve(audio.duration);
      },
    );
  });
}
function fetchTotalDuration(paths) {
  // Create an array of promises and wait until all have completed
  return Promise.all(paths.map((path) => fetchDuration(path)))
    // Reduce the results back to a single value
    .then((durations) => durations.reduce(
      (acc, duration) => acc + duration,
      0,
    ));
}
At some point, your code is going to have to deal with this asynchronous stuff; I honestly believe that Promises are the easiest way to do that. It takes a little getting used to, but it'll be worth it in the end. The above could be used in your code along the lines of:
window.addEventListener('DOMContentLoaded', () => {
  fetchTotalDuration(["1.mp3", "2.mp3", ...])
    .then((totalDuration) => {
      document.querySelector('.player__total-duration').innerHTML = totalDuration;
    });
});
I hacked this together real quick, so you'll have to adapt it to your function structure, but it's a working code snippet that should send you in the right direction.
Simply keep track of which audio files have been loaded, and when that matches the number of audio files queried, call the callback with the total duration.
You should also take failing requests into account, so that if the loadedmetadata event never fires you can react accordingly (by falling back to a 0 duration for that file, throwing an Exception, etc.); a sketch of that follows the snippet.
const cb = function(duration) {
  console.log(`Total duration: ${duration}`);
};
let sound_paths = [
  "https://rawcdn.githack.com/anars/blank-audio/92f06aaa1f1f4cae365af4a256b04cf9014de564/5-seconds-of-silence.mp3",
  "https://rawcdn.githack.com/anars/blank-audio/92f06aaa1f1f4cae365af4a256b04cf9014de564/2-seconds-of-silence.mp3"
];
let totalDuration = 0;
let loadedSounds = [];
sound_paths.map(src => {
  const audio = new Audio();
  audio.addEventListener("loadedmetadata", () => {
    totalDuration += audio.duration;
    loadedSounds.push(audio);
    if (loadedSounds.length === sound_paths.length) {
      cb(totalDuration);
    }
  });
  audio.src = src;
});
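The failure handling mentioned above could look something like this sketch, falling back to a 0 duration when a file's metadata never loads:
// The "error" event fires when metadata can't load, so count the
// failed file as loaded with a duration of 0 seconds.
sound_paths.map(src => {
  const audio = new Audio();
  audio.addEventListener("loadedmetadata", () => {
    totalDuration += audio.duration;
    loadedSounds.push(audio);
    if (loadedSounds.length === sound_paths.length) cb(totalDuration);
  });
  audio.addEventListener("error", () => {
    loadedSounds.push(audio); // treat the failed file as 0 seconds
    if (loadedSounds.length === sound_paths.length) cb(totalDuration);
  });
  audio.src = src;
});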

Tone JS - Transport.stop(); does not work with scheduled events

I am using Tone JS for a project, and I am using Transport.scheduleOnce to schedule events with the Sampler. Here is what I have so far; there is also a fiddle of it (you may need to click Run a couple of times to hear the audio come through when the fiddle initially loads).
My code:
const sound = 'https://archive.org/download/testmp3testfile/mpthreetest.mp3';
let samplerBuffer;
const sampler = new Promise((resolve, reject) => {
  samplerBuffer = new Tone.Sampler(
    {
      A1: sound
    },
    {
      onload: () => {
        resolve();
      }
    }
  ).toMaster();
});
sampler.then(() => {
  Tone.Transport.scheduleOnce(() => {
    samplerBuffer.triggerAttack(`A1`, `0:0`);
  });
  Tone.Transport.start();
  setTimeout(() => {
    console.log('Now should be stopping');
    Tone.Transport.stop();
  }, 1000);
});
I am trying to stop the audio from playing after 1 second using the Transport.stop() method; however, it does not seem to work. I think I have followed the docs as I should, so where am I going wrong?
Tone.Transport is only triggering your sample.
If you just want to play sounds like a "jukebox", use Tone.Player instead.
If you really need a sampler, then you should look into Envelopes, because the Sampler uses one.
In short: Tone.Transport is like the maestro in a concert. Transport only sets the time (the BPM, not the playback speed). Tone.Transport.start() triggers all registered instruments (in your case the Sampler) to start doing whatever you programmed them to do. If you want to stop the Sampler from playing in this mode, you can call samplerBuffer.releaseAll().
const sound = 'https://archive.org/download/testmp3testfile/mpthreetest.mp3';
let samplerBuffer;
const sampler = new Promise((resolve, reject) => {
  samplerBuffer = new Tone.Sampler(
    {
      A1: sound
    },
    {
      onload: () => {
        resolve();
      }
    }
  ).toMaster();
});
sampler.then(() => {
  Tone.Transport.scheduleOnce(() => {
    samplerBuffer.triggerAttack(`A1`, `0:0`);
  });
  Tone.Transport.start();
  setTimeout(function() {
    console.log('Now should be stopping');
    samplerBuffer.releaseAll();
    // samplerBuffer.disconnect();
  }, 1000);
});
https://jsfiddle.net/9zns7jym/6/
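For the "jukebox" route mentioned above, a minimal sketch (same sound URL as the question; Tone.Player's stop() halts playback directly, no Transport involved):
// Tone.Player plays and stops on its own, without the Transport.
const player = new Tone.Player(sound, () => {
  player.start(); // start once the file has loaded
  setTimeout(() => {
    console.log('Now should be stopping');
    player.stop(); // stops the audio itself
  }, 1000);
}).toMaster();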

Function returned undefined, expected Promise or value and unable to delete old data from firebase database using cloud functions

I'm trying to delete multiple nodes in my database that are older than 12 hours. I'm using a pub/sub function to trigger this event. I don't know if my code is actually looping through all the nodes, as I'm not using the onWrite or onCreate database triggers specifically. Here is an image sample of the database.
This is the pub/sub code:
exports.deletejob = functions.pubsub.topic('Oldtask').onPublish(() => {
  deleteOldItem();
})
and here is the deleteOldItem function:
function deleteOldItem() {
  const CUT_OFF_TIME = 12 * 60 * 1000; // 12 Hours in milliseconds.
  //var ref = admin.database().ref(`/articles/${id}`);
  const ref = admin.database().ref(`/articles`);
  const updates = {};
  ref.orderByChild('id').limitToLast(100).on('value', function (response) {
    var index = 0;
    response.forEach(function (child) {
      var element = child.val();
      const datetime = element.timestamp;
      const now = Date.now();
      const cutoff = now - datetime;
      if (CUT_OFF_TIME < cutoff) {
        updates[element.key] = null;
      }
    });
    // This is supposed to be the returned promise
    return ref.child(response.key).update(updates);
  });
}
If there's something I'm doing wrong, I'd like to know. The pub/sub is triggered by a job already set up in Google Cloud Scheduler.
You had several problems in your code that were giving you trouble:
The handling of promises wasn't correct. In particular, your top-level function never actually returned a promise; it just called deleteOldItems().
You should use the promise form of once() instead of calling on() with a callback, since you don't want to install a listener in this case; you just need the result a single time, and you want to handle it as part of a promise chain.
To delete nodes, you should call remove() on a reference to each node. It also generates a promise for you to use here.
You didn't calculate 12 hours in milliseconds properly; you calculated 12 minutes in milliseconds :)
Here's what I came up with. It uses an http function instead of a pubsub function as well as adding a log statement for my testing, but the modification you need should be trivial/obvious (just change the prototype and remove the response after deleteOldItems, but do make sure you keep returning the result of deleteOldItems()):
const functions = require('firebase-functions');
const admin = require('firebase-admin');

function deleteOldItems() {
  const CUT_OFF_TIME = 12 * 60 * 60 * 1000; // 12 Hours in milliseconds.
  const ref = admin.database().ref('/articles');
  return ref.orderByChild('id').limitToLast(100).once('value')
    .then((response) => {
      const updatePromises = [];
      const now = Date.now();
      response.forEach((child) => {
        const datetime = child.val().timestamp;
        const cutoff = now - datetime;
        console.log(`processing ${datetime} my cutoff is ${CUT_OFF_TIME} and ${cutoff}`);
        if (CUT_OFF_TIME < cutoff) {
          updatePromises.push(child.ref.remove());
        }
      });
      return Promise.all(updatePromises);
    });
}

exports.doIt = functions.https.onRequest((request, response) => {
  return deleteOldItems().then(() => { return response.send('ok'); });
});
While I have not tested it, I'm pretty sure this will work inside your original pub/sub function triggered by Cloud Scheduler:
exports.deletejob = functions.pubsub.topic('Oldtask').onPublish(() => {
  return deleteOldItems();
});
Of course, this is still more complicated than you need, since ordering by id doesn't really gain you anything here. Instead, why not just use the query to return the earliest items before the cut-off time (i.e. exactly the ones you want to remove)? I've also switched to limitToFirst to ensure the earliest entries get thrown out, which seems more natural and ensures fairness:
function deleteOldItems() {
  const cutOffTime = Date.now() - (12 * 60 * 60 * 1000); // 12 hours earlier, in milliseconds.
  const ref = admin.database().ref('/articles');
  return ref.orderByChild('timestamp').endAt(cutOffTime).limitToFirst(100).once('value')
    .then((response) => {
      const updatePromises = [];
      response.forEach((child) => {
        updatePromises.push(child.ref.remove());
      });
      return Promise.all(updatePromises);
    });
}
If you do this on more than a few items, of course, you probably want to add an index on the timestamp field so the range query is more efficient.
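That index belongs in the database security rules; a sketch, assuming the articles live under /articles as above:
{
  "rules": {
    "articles": {
      ".indexOn": ["timestamp"]
    }
  }
}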

Change playout delay in WebRTC stream

I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately, it isn't possible to simply pause the stream and resume with play(), since it jumps forward to the live moment.
So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later.
This works great on the local device (stream). But when I try to use MediaRecorder on the receiver's MediaStream (from pc.onaddstream), it looks like it gets some data and is able to append the buffer to the sourceBuffer. However, it does not replay; sometimes I get just one frame.
const [pc1, pc2] = localPeerConnectionLoop()
const canvasStream = canvas.captureStream(200)

videoA.srcObject = canvasStream
videoA.play()

// Note: using two MediaRecorders at the same time seems problematic
// But this one works
// stream2mediaSorce(canvasStream, videoB)
// setTimeout(videoB.play.bind(videoB), 5000)

pc1.addTransceiver(canvasStream.getTracks()[0], {
  streams: [ canvasStream ]
})

pc2.onaddstream = (evt) => {
  videoC.srcObject = evt.stream
  videoC.play()

  // Note: using two MediaRecorders at the same time seems problematic
  // THIS DOES NOT WORK
  stream2mediaSorce(evt.stream, videoD)
  setTimeout(() => videoD.play(), 2000)
}
/**
 * Turn a MediaStream into a SourceBuffer
 *
 * @param {MediaStream} stream Live stream to record
 * @param {HTMLVideoElement} videoElm Video element to play the recorded video in
 * @return {undefined}
 */
function stream2mediaSorce (stream, videoElm) {
  const RECORDER_MIME_TYPE = 'video/webm;codecs=vp9'
  const recorder = new MediaRecorder(stream, { mimeType: RECORDER_MIME_TYPE })

  const mediaSource = new MediaSource()
  videoElm.src = URL.createObjectURL(mediaSource)
  mediaSource.onsourceopen = (e) => {
    sourceBuffer = mediaSource.addSourceBuffer(RECORDER_MIME_TYPE);

    const fr = new FileReader()
    fr.onerror = console.log
    fr.onload = ({ target }) => {
      console.log(target.result)
      sourceBuffer.appendBuffer(target.result)
    }
    recorder.ondataavailable = ({ data }) => {
      console.log(data)
      fr.readAsArrayBuffer(data)
    }
    setInterval(recorder.requestData.bind(recorder), 1000)
  }

  console.log('Recorder created')
  recorder.start()
}
Do you know why it won't play the video?
I have created a fiddle with all the necessary code to try it out; the JavaScript tab is the same code as above (the HTML is mostly irrelevant and does not need to be changed).
Some people try to reduce the latency, but I actually want to increase it to ~10 seconds, to rewatch something you did wrong in a golf swing or something, and to avoid MediaRecorder altogether if possible.
EDIT:
I found something called "playout-delay" in some RTC extension that allows the sender to control the minimum and maximum latency from capture to render time:
https://webrtc.org/experiments/rtp-hdrext/playout-delay/
How can I use it? Will it be of any help to me?
Update: there is a new feature that will enable this, called playoutDelayHint.
We want to provide means for javascript applications to set their preferences on how fast they want to render audio or video data. As fast as possible might be beneficial for applications which concentrate on real-time experience. For others, additional data buffering may provide a smoother experience in case of network issues.
Refs:
https://discourse.wicg.io/t/hint-attribute-in-webrtc-to-influence-underlying-audio-video-buffering/4038
https://bugs.chromium.org/p/webrtc/issues/detail?id=10287
Demo: https://jsfiddle.net/rvekxns5/
Though I was only able to set a max of 10s in my browser, it's up to the UA vendor to do the best it can with the resources available.
import('https://jimmy.warting.se/packages/dummycontent/canvas-clock.js')
.then(({AnalogClock}) => {
  const {canvas} = new AnalogClock(100)
  document.querySelector('canvas').replaceWith(canvas)

  const [pc1, pc2] = localPeerConnectionLoop()
  const canvasStream = canvas.captureStream(200)

  videoA.srcObject = canvasStream
  videoA.play()

  pc1.addTransceiver(canvasStream.getTracks()[0], {
    streams: [ canvasStream ]
  })

  pc2.onaddstream = (evt) => {
    videoC.srcObject = evt.stream
    videoC.play()
  }

  $dur.onchange = () => {
    pc2.getReceivers()[0].playoutDelayHint = $dur.valueAsNumber
  }
})
<!-- all the irrelevant part, that you don't need to know anything about -->
<h3 style="border-bottom: 1px solid">Original canvas</h3>
<canvas id="canvas" width="100" height="100"></canvas>
<script>
function localPeerConnectionLoop(cfg = {sdpSemantics: 'unified-plan'}) {
  const setD = (d, a, b) => Promise.all([a.setLocalDescription(d), b.setRemoteDescription(d)]);
  return [0, 1].map(() => new RTCPeerConnection(cfg)).map((pc, i, pcs) => Object.assign(pc, {
    onicecandidate: e => e.candidate && pcs[i ^ 1].addIceCandidate(e.candidate),
    onnegotiationneeded: async e => {
      try {
        await setD(await pc.createOffer(), pc, pcs[i ^ 1]);
        await setD(await pcs[i ^ 1].createAnswer(), pcs[i ^ 1], pc);
      } catch (e) {
        console.log(e);
      }
    }
  }));
}
</script>
<h3 style="border-bottom: 1px solid">Local peer (PC1)</h3>
<video id="videoA" muted width="100" height="100"></video>
<h3 style="border-bottom: 1px solid">Remote peer (PC2)</h3>
<video id="videoC" muted width="100" height="100"></video>
<label> Change playoutDelayHint
  <input type="number" value="1" id="$dur">
</label>
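Since playoutDelayHint is a non-standard hint (see the Chromium issue linked above), it may be worth guarding for support before setting it; a small sketch, with pc2 as in the demo:
const [receiver] = pc2.getReceivers()
if (receiver && 'playoutDelayHint' in receiver) {
  receiver.playoutDelayHint = 10 // ask the UA to buffer roughly 10 seconds
} else {
  console.warn('playoutDelayHint is not supported in this browser')
}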

Tone.js Tone.BufferSource: buffer is either not set or not loaded

"Tone.BufferSource: buffer is either not set or not loaded." This error occurs in a try/catch block. It only occurs when I trigger the update function constantly, or sometimes randomly.
When this error occurs, my audio just turns off for a brief moment.
The logic behind my code: when the program starts, the create function is invoked in the constructor, creating a Tone.Sequence. Later on, when I change/update track parameters, I call the update function, which calls loopProcessor with the new/updated tracks. But when I trigger update, which triggers the loopProcessor function, it runs into "Tone.BufferSource: buffer is either not set or not loaded". How can I work around this problem?
My code:
import Tone from "tone";

export function create(tracks, beatNotifier) {
  const loop = new Tone.Sequence(
    loopProcessor(tracks, beatNotifier),
    [...new Array(16)].map((_, i) => i),
    "16n"
  );
  Tone.Transport.bpm.value = 120;
  Tone.Transport.start();
  return loop;
}

export function update(loop, tracks, beatNotifier) {
  loop.callback = loopProcessor(tracks, beatNotifier);
  return loop;
}

function loopProcessor(tracks, beatNotifier) {
  const urls = tracks.reduce((acc, {name}) => {
    return {...acc, [name]: `http://localhost:3000/src/sounds/${name}.[wav|wav]`};
  }, {});

  const keys = new Tone.Players(urls, {
    fadeOut: "64n"
  }).toMaster();

  return (time, index) => {
    beatNotifier(index);
    tracks.forEach(({name, vol, muted, note, beats}) => {
      if (beats[index]) {
        try {
          var vel = Math.random() * 0.5 + 0.5;
          keys
            .get(name)
            .start(time, 0, note, 0, vel);
          keys
            .get(name).volume.value = muted
              ? -Infinity
              : vol;
        } catch (e) {
          console.log("error", e);
        }
      }
    });
  };
}
I had this problem recently and found a solution that worked for my case.
Tone.js doesn't like it when you initialise an audio buffer inside a function (which is what you're doing when you call new Tone.Players inside loopProcessor).
To get around this, at the top of your code declare a new global variable buffer1 = new Tone.Buffer(url1) for each url that you need: https://tonejs.github.io/docs/r13/Buffer
Then inside loopProcessor just replace the urls with each buffer and a name tag, and you shouldn't have any problems: new Tone.Players({"name1": buffer1, "name2": buffer2, ...})
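A minimal sketch of that suggestion (the two file names here are made-up placeholders; the URL pattern comes from the question):
// Create the buffers once at module load, outside loopProcessor,
// so Tone.Players is never handed a URL it still has to load mid-sequence.
const kickBuffer = new Tone.Buffer("http://localhost:3000/src/sounds/kick.wav");
const snareBuffer = new Tone.Buffer("http://localhost:3000/src/sounds/snare.wav");

function loopProcessor(tracks, beatNotifier) {
  const keys = new Tone.Players(
    { kick: kickBuffer, snare: snareBuffer }, // pre-loaded buffers instead of URLs
    { fadeOut: "64n" }
  ).toMaster();
  // ...the returned (time, index) callback stays the same as above
}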
