I have my JavaScript audio player working with .mp3s, but I'm not sure how to add a second audio format (.ogg) so the files will also play in Firefox. Can anyone help with this? Here is the array code:
var urls = new Array();
urls[0] = 'audio/song1.mp3';
urls[1] = 'audio/song2.mp3';
urls[2] = 'audio/song3.mp3';
urls[3] = 'audio/song4.mp3';
var next = 0;
The easiest way to play sounds is with SoundManager 2, which uses Flash and HTML5 when available.
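If you would rather stay with plain Audio elements, one common approach is to feature-detect with canPlayType and pick whichever extension the browser can handle. A minimal sketch, assuming matching .ogg files exist next to the .mp3s:
var probe = new Audio();
// canPlayType returns "", "maybe" or "probably"
var ext = probe.canPlayType('audio/mpeg') ? '.mp3' : '.ogg';

var urls = [
  'audio/song1' + ext,
  'audio/song2' + ext,
  'audio/song3' + ext,
  'audio/song4' + ext
];
var next = 0;

probe.src = urls[next];
probe.play();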
I installed the following JS library and I'm using the "player" samples.
https://github.com/goldfire/howler.js
Do you have any idea how to start the audio at a given time? I saw in the readme that you can use the URL method with the seek parameter, but I don't want to pass a parameter in the URL. Instead, I want the time (in minutes or seconds) to be calculated when the site is accessed and the audio to start at that exact point.
I previously used this JS and it worked correctly:
var audio = new Audio();
audio.src = "MI24H.mp3";
var ORA_ATTUALE_IN_SECONDI = (new Date().getHours() * 60 * 60) + (new Date().getMinutes() * 60) + (new Date().getSeconds()); // current time of day, in seconds since midnight
audio.currentTime = ORA_ATTUALE_IN_SECONDI;
audio.play();
function playAudio() {
audio.play();
}
I solved it by setting the seek variable like this:
var TIME_IN_SECONDS = (new Date().getHours() * 60 * 60) + (new Date().getMinutes() * 60) + (new Date().getSeconds());
var seek = TIME_IN_SECONDS || 0;
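With howler.js itself, a minimal sketch of the same idea (assuming the same MI24H.mp3 file) is to compute the offset on page load and seek once playback actually starts:
var TIME_IN_SECONDS = (new Date().getHours() * 60 * 60) + (new Date().getMinutes() * 60) + (new Date().getSeconds());

var sound = new Howl({
  src: ['MI24H.mp3']
});

// jump to the computed offset as soon as the sound starts playing
sound.once('play', function() {
  sound.seek(TIME_IN_SECONDS);
});

sound.play();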
I used this JS library to export everything in my canvas as an mp4 video. I succeeded in exporting a video, but its duration is always 0.
Here's the js I used
https://github.com/antimatter15/whammy
Here's the code I have so far; it can download the canvas and the elements inside it, but not the animation.
var canvas_video = document.querySelector('canvas').getContext('2d');
canvas_video.save();
console.log(canvas_video);
var encoder = new Whammy.Video(15);
var progress = document.getElementById('progress');
encoder.add(canvas_video);
console.log("1",encoder);
encoder.compile(false, function(output){
  //var url = (window.URL || window.URL).createObjectURL(output);
  var url = URL.createObjectURL(output);
  console.log(url);
  document.getElementById('download_link').href = url;
});
When I checked the console to debug it, it shows encodeFrame 0.
Can anyone advise on what I should do and whether I'm missing something?
For anyone who's still looking for the answer:
That library only outputs .webm, not .mp4.
As far as I know, no browser other than Chrome supports webm playback, so use Chrome to view the video. Other browsers will stay stuck at 0 time.
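On the duration itself: Whammy only encodes the frames you add, so calling encoder.add() once produces a clip with essentially no length. A rough sketch, assuming the animation is drawn in a requestAnimationFrame loop and that drawAnimationFrame() stands in for whatever redraws your canvas:
var ctx = document.querySelector('canvas').getContext('2d');
var encoder = new Whammy.Video(15);  // 15 fps
var framesWanted = 15 * 5;           // capture roughly 5 seconds
var framesAdded = 0;

function captureFrame() {
  drawAnimationFrame();              // hypothetical: redraw the next frame of your animation
  encoder.add(ctx);                  // grab the current canvas contents as one video frame
  framesAdded++;

  if (framesAdded < framesWanted) {
    requestAnimationFrame(captureFrame);
  } else {
    encoder.compile(false, function(output) {
      document.getElementById('download_link').href = URL.createObjectURL(output);
    });
  }
}

requestAnimationFrame(captureFrame);
Note that Whammy builds its frames from canvas.toDataURL('image/webp'), which is another reason this workflow is effectively Chrome-only.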
One of the sound files won't play. The two following pieces of code are identical except for the file name.
This doesn't work:
var rewardSound = new Audio("audio/WrongAnswerSound.wav");
function rightAnswer(){
rewardSound.play();
}
However this works fine:
var rewardSound = new Audio("audio/CorrectAnswerSound.wav");
function rightAnswer(){
rewardSound.play();
}
I can see both files in the File Manager in cPanel, and I can play both sounds from the File Manager itself. But I can't play WrongAnswerSound.wav from the JS code. What am I doing wrong?
You kind of have the right idea.
Set a variable for the correct sound by creating a new Audio object:
var correctSound = new Audio("audio/CorrectAnswerSound.wav");
Set a variable for the wrong sound by creating another new Audio object:
var wrongSound = new Audio("audio/WrongAnswerSound.wav");
Both of these new objects already inherit a play method from Audio, so all you have to do to play the sounds is this:
correctSound.play();
wrongSound.play();
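Putting it together with the handlers from the question (the wrongAnswer name is just for illustration):
var correctSound = new Audio("audio/CorrectAnswerSound.wav");
var wrongSound = new Audio("audio/WrongAnswerSound.wav");

function rightAnswer() {
  correctSound.play();
}

function wrongAnswer() {
  wrongSound.play();
}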
So I've used the Web Audio API to create music from code. I used an OfflineAudioContext to render the music, and its oncomplete handler looks something like this:
function(e) {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var song = audioCtx.createBufferSource();
  song.buffer = e.renderedBuffer;
  song.connect(audioCtx.destination);
  song.start();
}
This plays the sound, and it works. But I would like to store the result in an <audio> element instead, because that makes it easier to play, loop, pause and stop, and I need to reuse the song.
Is it possible? I've been googling for days, but I can't find how!
The idea was to use var song = new Audio() and somehow copy e.renderedBuffer into it.
OK, so I found this code floating around: http://codedbot.com/questions-/911767/web-audio-api-output . I've created a copy here too: http://pastebin.com/rE9a1PaX .
I've managed to use it to create and store an audio element on the fly, using the functions provided in that link.
offaudioctx.oncomplete = function(e) {
  var buffer = e.renderedBuffer;
  var UintWave = createWaveFileData(buffer);    // helper from the linked code: AudioBuffer -> WAV bytes
  var base64 = btoa(uint8ToString(UintWave));   // helper from the linked code: Uint8Array -> binary string
  songsarr.push(document.createElement('audio'));
  songsarr[songsarr.length - 1].src = "data:audio/wav;base64," + base64;
  console.log("completed!");
};
It's not pretty, but it works. I'm leaving everything here in case someone finds an easier way.
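A slightly cleaner variant of the same idea, assuming the same createWaveFileData helper from the linked code, is to wrap the WAV bytes in a Blob and point the element at an object URL instead of building a base64 data URI:
offaudioctx.oncomplete = function(e) {
  var wavBytes = createWaveFileData(e.renderedBuffer);     // helper from the linked code
  var blob = new Blob([wavBytes], { type: 'audio/wav' });  // wrap the raw WAV bytes

  var song = document.createElement('audio');
  song.src = URL.createObjectURL(blob);                    // no base64 round-trip needed
  songsarr.push(song);

  console.log("completed!");
};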
I am aware of how to clone an object, but I'm wondering: how do I clone an audio object? Should I clone it differently than I would a regular object?
To "illustrate" what I mean:
var audio = new Audio("file.mp3");
var audio2 = $.extend({}, audio); // Clones `audio`
Is this the correct way to do this?
Reason why I'm asking this is that I want to be able to play the same sound multiple times simultaneously.
I had exactly the same predicament as originally raised. The following worked perfectly well for me:
var audio2 = audio.cloneNode();
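Since the goal is playing the same sound several times at once, you can also clone on demand; cloneNode copies the src attribute, so each clone is an independent element:
var audio = new Audio("file.mp3");

function playOverlapping() {
  audio.cloneNode().play(); // each call plays a fresh copy, so sounds can overlap
}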
This question is ancient in JavaScript years. I think your code was downloading the audio again each time, and that was the cause of your delay. If you grab the audio file once and store it in a blob, you can then use that blob as the source for new Audio objects.
let fileBlob;
fetch("file.mp3")
  .then(function(response) { return response.blob(); })
  .then(function(blob) {
    fileBlob = URL.createObjectURL(blob);
    new Audio(fileBlob); // forces a request for the blob
  });
...
new Audio(fileBlob).play(); // fetches the audio file from the blob.
You can also do a lot of this with the Web Audio API, but it has a slightly steeper learning curve.
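If you do go the Web Audio route, a minimal sketch (again assuming file.mp3) is to decode the file once and create a cheap one-shot buffer source per play, which also allows overlapping playback:
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var songBuffer;

fetch("file.mp3")
  .then(function(response) { return response.arrayBuffer(); })
  .then(function(data) { return ctx.decodeAudioData(data); })
  .then(function(decoded) { songBuffer = decoded; });

function playSound() {
  var source = ctx.createBufferSource(); // buffer sources are single-use and cheap to create
  source.buffer = songBuffer;
  source.connect(ctx.destination);
  source.start();
}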
If what you have is a Blob (for example, from a fetch response), you can copy it either way:
#1
let audio_2 = audio_1.slice();
or
#2
let audio_2 = new Blob([audio_1]);
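Either copy can then be turned back into something playable with an object URL:
let audio_2_element = new Audio(URL.createObjectURL(audio_2));
audio_2_element.play();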