In my sample ASP.NET/VB.NET project for HTML5 audio, using Matt Diamond's recorder.js, when I grant any browser access to my microphone, I can hear every noise around me in the headset speakers (e.g., someone typing next to me, people speaking, etc.). I'm using a Logitech USB headset, but I'm not sure of the specific model as they're provided through work.
Does anyone know if this is a browser-related issue with the microphone (or possibly resulting from the headset I'm using)? I'm trying to find a different headset to test with, but work typically buys the same model for everyone.
I've tried a few other sites that have demos for recording and playback, and I can still hear everything around me once I grant microphone access (although some demos seem to do better at not picking up the background noise).
My knowledge of audio is very limited as I've never worked with audio at the level of normalizing, setting gain, etc. Any resources or suggestions would be greatly appreciated!
I can include a copy of the .aspx and VB.NET pages from my sample project if that would be helpful. Thanks!
Well, the issue turned out to be my fault: I had uncommented a line of JavaScript that feeds the audio input straight back to the output. Guess that's part of learning something new. :)
Here's the function I used from Matt Diamond's example, with the offending line (the one I must have incorrectly uncommented at some point) commented back out...
function startUserMedia(stream) {
    var input = audio_context.createMediaStreamSource(stream);
    console.log('Media stream created.');

    // Uncomment if you want the audio to feedback directly
    // input.connect(audio_context.destination);

    try {
        recorder = new Recorder(input);
    } catch (e) {
        alert('Error: Unable to record. Please check your sound settings, and allow the browser access to your microphone when prompted.');
        console.log('Error: Recorder object could not be created. Msg: ' + e.toString());
    }
}
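For context, startUserMedia is the success callback handed to getUserMedia, so the stream above is the live microphone input. A rough sketch of how it gets invoked (shown with the promise-based mediaDevices API; Matt Diamond's original demo uses the older, vendor-prefixed navigator.getUserMedia):

// Sketch only: create the shared AudioContext, then ask for microphone access
// and hand the resulting stream to startUserMedia above.
var audio_context = new (window.AudioContext || window.webkitAudioContext)();
var recorder; // populated inside startUserMedia

navigator.mediaDevices.getUserMedia({ audio: true })
    .then(startUserMedia)
    .catch(function (e) {
        console.log('No live audio input: ' + e);
    });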
Thank you for the responses that were provided! I'm sure I'll have more audio questions to come, lol.
I'm building a video player in node.js with nw.js. It will run offline.
One of the features I'd like to add to this application is the ability to play a video by dragging and dropping it into a "box".
The restriction I'm facing with this implementation is the need to encode the video in order to play it, using, for example, the readAsDataURL() function. As discussed in this post, it is not possible to get the full path of a file.
"Uploading" the entire video makes little sense to me, since it is already stored on the user's hard drive.
If someone tries to play an episode of The Big Bang Theory (about 20 minutes), waiting 2 or 3 minutes isn't a problem; it's a different story if they try to watch The Lord of the Rings.
Is there a good workaround for this problem?
I appreciate any help.
UPDATE:
I was thinking about having the user copy and paste the file into a field, since that action makes it possible to get its URL. But that's not the best thing in terms of user experience...
I've managed it: I swapped readAsDataURL() for createObjectURL().
For the sake of reference, here's my code:
var video = document.createElement("video");
video.controls = true;
document.body.appendChild(video);
video.src = (window.URL||window.webkitURL).createObjectURL(file);
video.play();
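For completeness, the file variable above comes from the drop event. A minimal sketch of the drag-and-drop wiring (the "box" element id is just an assumption for illustration):

// Sketch: wire up a drop target and play the dropped file without reading it into memory.
var box = document.getElementById("box"); // hypothetical drop target

box.addEventListener("dragover", function (e) {
    e.preventDefault(); // required so the drop event will fire
});

box.addEventListener("drop", function (e) {
    e.preventDefault();
    var file = e.dataTransfer.files[0]; // the local video file the user dropped

    var video = document.createElement("video");
    video.controls = true;
    document.body.appendChild(video);
    video.src = (window.URL || window.webkitURL).createObjectURL(file);
    video.play();
});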
Now my users' The Lord of the Rings marathon is saved.
Ok, my new website has just gone live, delivered through Google App Engine. On a lark, I decided to include a JavaScript/HTML5 Lunar Lander clone (Martian Lander) which I wrote as an exercise a while back. The game works fine when I open it locally, but when it's delivered through GAE, the sounds don't seem to load on every system. In Mobile Safari, none of them load. In Safari on the desktop, they all load reliably on my computer, but not on some other computers. In Chrome on the desktop it seems to work, but in Chrome on iOS, only one sound loads. On the desktop, it always seems to be the same sound which fails to load (explode1.mp3), which is the smallest of the sounds I'm loading. As you can see, if you click that link, the sound downloads fine from the server...
At first the problem seemed to be related to case sensitivity, so I switched the case in the filename, but that fix didn't keep working. This is a problem, as my loading bar is directly tied to how many resources have loaded, so it just sits there waiting for a GET request with no reply... Has anyone experienced anything like this, where a GET receives no reply on a specific resource, but loading the resource directly works fine?
I should say that I'm very new to most of these technologies, so it seems quite likely to me that I just made some novice mistake. Unfortunately, I'm not sure what those novice mistakes would be, seeing as I'm a novice!
Here's the code I use to load the sounds:
function loadSound(soundName) {
    var newElement = document.createElement("audio");
    newElement.addEventListener("canplaythrough", assetLoaded, false);
    document.body.appendChild(newElement);

    var audioType = supportedAudioFormat(newElement);
    if (audioType == "") {
        alert("no audio support");
        return;
    }

    newElement.setAttribute("src", "lander/sounds/" + soundName + "." + audioType);
    console.log("loading sound " + newElement.src + "...");
    return newElement;
}
and...
function assetLoaded() {
    var assetName = this.src;
    numAssetsLoaded++;
    console.log("loaded asset " + numAssetsLoaded + " (" + assetName + ")");

    if (numAssetsLoaded >= numAssetsToLoad) {
        shipSpriteSheet.removeEventListener("load", assetLoaded, false);
        pointImage.removeEventListener("load", assetLoaded, false);
        thrustAudioElement.removeEventListener("canplaythrough", assetLoaded, false);
        explosionAudioElement.removeEventListener("canplaythrough", assetLoaded, false);
        victoryAudioElement.removeEventListener("canplaythrough", assetLoaded, false);
        musicTrackElement.removeEventListener("canplaythrough", assetLoaded, false);

        gameState = GAME_STATE_INIT;
    }
}
If you take a look at the console output, you'll see that all of the sounds begin loading (particularly explode1.mp3) but don't necessarily finish and call assetLoaded...
UPDATE:
The consensus seems to be that I should not be using MP3 (incidentally, I'm already using MP3, AAC, AND Ogg, but defaulting to MP3), and also that I should use the Web Audio API. These are both welcome pieces of input, and I will make the necessary changes. However, I still don't have an answer to the original question, which is, "Why does one particular sound not load reliably on desktop while the others load with no problem?" Anybody wanna take a crack at that one? Or is the answer going to be something like, "These things are highly unpredictable, and there's no way to fix it except by switching to a more dependable methodology, like the Web Audio API"?
UPDATE:
Here's an excerpt from my app.yaml file, which, I gather, helps GAE set up the server.
- url: /(.*\.(mp3|ogg|wav))
  static_files: \1
  upload: (.*\.(mp3|ogg|wav))
Some things to be aware of:
You shouldn't use MP3 for HTML5 games.
You will need to dual-encode all your sounds to both AAC (.m4a) and Ogg Vorbis (.ogg), since there is no single format that plays everywhere (see the canPlayType sketch after this list).
You must ensure your server has the correct MIME types for the audio files. Some browsers will happily play audio if the server says it has the wrong MIME type; others will fail silently. For AAC and Ogg Vorbis the types are audio/mp4 and audio/ogg respectively.
Most mobile devices can only play one sound at a time, and iOS generally doesn't let you play audio unless it's in a user-initiated input event (such as touchstart).
You'll probably want to use the Web Audio API where supported (Chrome and iOS 6+) since playback is more reliable and polyphonic even on iOS - but note iOS still mutes the Web Audio API until a user input event.
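As a sketch of how you might pick between the two encodings at runtime (this mirrors what a helper like the question's supportedAudioFormat presumably does; the codec strings and extensions here are assumptions, not taken from that code):

// Sketch: choose an audio file extension based on what the browser says it can play.
function chooseAudioExtension(audioElement) {
    // canPlayType returns "", "maybe", or "probably"
    if (audioElement.canPlayType('audio/mp4; codecs="mp4a.40.2"') !== "") {
        return "m4a"; // AAC
    }
    if (audioElement.canPlayType('audio/ogg; codecs="vorbis"') !== "") {
        return "ogg"; // Ogg Vorbis
    }
    return ""; // no supported format found
}

var probe = document.createElement("audio");
var extension = chooseAudioExtension(probe); // e.g. "m4a" or "ogg"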
This is not a direct answer to your question about why the sound is not being played, but more a suggestion for what you should do with your game sound effects.
For game sound effects I suggest you use the HTML5 Web Audio API, which gives you more control over how sounds are played (pitch of the sound effect, less delay in playback, etc.):
http://www.html5rocks.com/en/tutorials/webaudio/intro/
iOS 6+ supports Web Audio: https://developer.apple.com/technologies/ios6/
Web Audio is not supported in FF yet, but support is coming.
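A minimal sketch of loading and playing one effect with the Web Audio API (the file path is illustrative; older WebKit builds expose webkitAudioContext and noteOn() instead of start()):

// Sketch: fetch the sound as an ArrayBuffer, decode it once, then play it.
var context = new (window.AudioContext || window.webkitAudioContext)();

var request = new XMLHttpRequest();
request.open("GET", "lander/sounds/explode1.m4a", true); // illustrative path
request.responseType = "arraybuffer";
request.onload = function () {
    context.decodeAudioData(request.response, function (buffer) {
        var source = context.createBufferSource(); // one source node per playback
        source.buffer = buffer;
        source.connect(context.destination);
        source.start(0); // noteOn(0) on older WebKit implementations
    });
};
request.send();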
I'm building a simple JavaScript jukebox using the latest SoundManager2 for audio playback, with local MP3 files as the source. I've got file loading and playing sorted, and at the moment I'm trying to get access to the ID3 info of these MP3 files, but the onid3() callback is not firing. I'm using Flash and have verified that ID3 info is present in the files. Below is my implementation of onid3():
function playNextSongInQueue()
{
    // Get the first element of the songQueue array
    var nextSongInQueue = songQueue.shift();

    // Start playback from the queue
    var jukeboxTune = soundManager.createSound({
        id: 'currentTune',
        url: 'audio/' + nextSongInQueue.name,
        onload: function() {
            this.play();
        },
        onid3: function() {
            alert('ID3 present!');
        },
        onfinish: function() {
            this.destruct(); // Destroy this sound on finish
            songFinish();    // Run the songFinish() function to decide what to do next
        }
    });

    jukeboxTune.load();
    //jukeboxTune.play(); // The jukebox running!

    songPlaying = true;       // Set songPlaying flag
    updateSongQueueDisplay(); // Refresh the song queue display (for debug)

    return nextSongInQueue.name;
}
The other callbacks work fine, but the onid3() alert never comes up. I even separated the load and play portions of audio playback to see if that helped. SoundManager spots that onid3() is there because it switches usePolicyFile to true - seeing as the MP3s are local I am assuming I don't need to worry about the cross-domain XML file.
Can anybody shed light on why this isn't working? I've scoured Google looking for implementations that work but have come up with nothing helpful. I've seen Jacob Seidelin's pure JavaScript workaround but would rather stick with SoundManager if possible, and would rather not use a PHP solution.
Thanks,
Adam
This problem is probably too esoteric for any solid answers, so I decided to investigate possible Javascript solutions outside the SM2 library.
I started with Nihilogic's library for reading ID3v1 tags (at http://blog.nihilogic.dk/2008/08/reading-id3-tags-with-javascript.html), but moved to antimatter15's js-id3v2 library (https://github.com/antimatter15/js-id3v2) as it can read ID3v2 tags. Adapting code from the provided example, I have managed to successfully parse the main tags I need when the MP3s are loaded via the <input> control.
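Roughly, the wiring looks like this (the parse call at the end is a placeholder for whichever entry point the ID3 library exposes, not its documented API, and the input element id is also an assumption):

// Sketch: read a user-selected MP3 into an ArrayBuffer, then hand the bytes to an ID3 parser.
var picker = document.getElementById("file-picker"); // hypothetical <input type="file">

picker.addEventListener("change", function () {
    var file = picker.files[0];
    var reader = new FileReader();

    reader.onload = function () {
        var bytes = new Uint8Array(reader.result); // raw MP3 bytes, ID3 header included
        var tags = parseId3Tags(bytes);            // placeholder for the library's parse function
        console.log(tags.title, tags.artist);      // placeholder result shape, shown in the jukebox UI
    };

    reader.readAsArrayBuffer(file);
});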
For local files (I mean "user local" files, not "server local" files), I've had some success with id3v2.js.
To get ID3 data, SM2 needs a crossdomain.xml file on the MP3 host if it's on another domain.
I've also run into difficulties with SoundCloud, as they redirect MP3s to dynamic Amazon S3 storage... so I had to write a PHP script to guess the final URL, and then SM2 could fetch the proper crossdomain.xml (see https://getsatisfaction.com/schillmania/topics/displaying_waveformdata_of_soundcloud_hosted_track_prompts_securityerror_error_2122).
The problem is that both S3 links and local user file (blob) URLs expire after a short time.
Good luck!
Users run my HTML files locally, straight from a CD.
I want to allow them to choose a bunch of videos and create a playlist on the fly.
This works very well if I run a web server, but when I open the HTML files directly it fails.
The player is created (using SWFObject) and all my other code runs, but playerReady never fires, so I can never get the current playlist to add to it.
Any ideas on how I can fix this or, more likely, work around it?
If the player is created, but you're not getting a playerReady, one of two things could be happening.
1. There's another playerReady on the page that's catching your playerReady. Make sure that there's just one playerReady on the page.
2. You haven't enabled JavaScript access for Flash. The code for that would look like this:
SWFObject:
var so = new SWFObject('player.swf','ply','470','320','9','#000000');
so.addParam('allowfullscreen','true');
so.addParam('allowscriptaccess','always');
so.addParam('wmode','opaque');
so.addVariable('file','video.flv');
so.write('mediaspace');
I should also note that there are some additional Flash security restrictions because you're accessing the player from disk. Namely, you can't access both a disk source and a network source (the Internet) simultaneously.
Best,
Zach
Developer, LongTail Video
I have a dashboard web-app that I want to play an alert sound when it's having problems connecting. The site's Ajax code will poll for data and throttle down its refresh rate if it can't connect. Once the server comes back up, the site will continue working.
In the meantime I would like a sound to play each time it can't connect (so I know to check the server). Here is that code, which works:
var error_audio = new Audio("audio/" + settings.refresh.error_audio);
error_audio.load();

// this gets called when there is a connection error
function onConnectionError() {
    error_audio.play();
}
However, the 2nd time through the function the audio doesn't play. Digging around in Chrome's debugger, I see the 'played' attribute on the audio element gets set to true. Setting it to false has no effect. Any ideas?
I encountered this just today; after more searching, I found that you must set the source property on the audio element again to get it to restart. Don't worry, no network activity occurs, and the operation is heavily optimized.
var error_audio = new Audio("audio/" + settings.refresh.error_audio);
error_audio.load();

// this gets called when there is a connection error
function onConnectionError() {
    error_audio.src = "audio/" + settings.refresh.error_audio;
    error_audio.play();
}
This behavior shows up in Chrome 21. FF doesn't seem to mind setting the src twice either!
Try setting error_audio.currentTime to 0 before playing it. Maybe it doesn't automatically go back to the beginning.
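Something along these lines (a sketch against the question's onConnectionError; whether it's enough depends on the Chrome issue discussed above):

// Sketch: rewind the existing element to the start before replaying it.
function onConnectionError() {
    error_audio.currentTime = 0; // jump back to the beginning
    error_audio.play();
}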
You need to implement the Content-Range response headers, since Chrome requests the file in multiple parts via the Range HTTP header.
See here: HTML5 <audio> Safari live broadcast vs not
Once that has been implemented, both the play() function and setting the currentTime property should work.
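If you're serving the file yourself, a rough sketch of honoring Range requests in Node.js looks something like this (the file path, port, and MIME type are assumptions; a real handler would add validation and error handling):

// Sketch: serve an audio file with partial-content (Range) support.
var http = require("http");
var fs = require("fs");

http.createServer(function (req, res) {
    var filePath = "./audio/alert.mp3"; // hypothetical file
    var stat = fs.statSync(filePath);
    var range = req.headers.range;

    if (range) {
        // e.g. "bytes=0-" or "bytes=100-200"
        var parts = range.replace(/bytes=/, "").split("-");
        var start = parseInt(parts[0], 10);
        var end = parts[1] ? parseInt(parts[1], 10) : stat.size - 1;

        res.writeHead(206, {
            "Content-Range": "bytes " + start + "-" + end + "/" + stat.size,
            "Accept-Ranges": "bytes",
            "Content-Length": end - start + 1,
            "Content-Type": "audio/mpeg"
        });
        fs.createReadStream(filePath, { start: start, end: end }).pipe(res);
    } else {
        res.writeHead(200, {
            "Content-Length": stat.size,
            "Accept-Ranges": "bytes",
            "Content-Type": "audio/mpeg"
        });
        fs.createReadStream(filePath).pipe(res);
    }
}).listen(8080);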
Q: I've got an AudioBufferSourceNode that I just played back with noteOn(), and I want to play it again, but noteOn() doesn't do anything! Help!
A: Once a source node has finished playing back, it can’t play back more. To play back the underlying buffer again, you should create a new AudioBufferSourceNode and call noteOn().
Though re-creating the source node may feel inefficient, source nodes are heavily optimized for this pattern. Plus, if you keep a handle to the AudioBuffer, you don't need to make another request to the asset to play the same sound again. If you find yourself needing to repeat this pattern, encapsulate playback with a simple helper function like playSound(buffer).
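For instance, a minimal sketch of that helper (assuming an existing AudioContext named context and an already-decoded AudioBuffer named explosionBuffer; start() is the current name for what older builds call noteOn()):

// Sketch: the decoded buffer is reusable, but each playback needs its own source node.
function playSound(buffer) {
    var source = context.createBufferSource(); // AudioBufferSourceNodes are one-shot
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(0); // noteOn(0) in the older API
}

playSound(explosionBuffer); // play it once...
playSound(explosionBuffer); // ...and again: a fresh source node each time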
Q: When playing back a sound, why do you need to make a new source node every time?
A: The idea of this architecture is to decouple the audio asset from its playback state. Taking a record player analogy, buffers are analogous to records and sources to play-heads. Because many applications involve multiple versions of the same buffer playing simultaneously, this pattern is essential.
source:
http://updates.html5rocks.com/2012/01/Web-Audio-FAQ
You need to pause the audio just before it ends, reset the current playback time to zero, and then play it again.
JavaScript/jQuery to control HTML5 audio elements - check this link - it explains how to handle/control HTML5 audio elements. It may help you!
Chrome/Safari have fixed this issue in newer versions of the browser, and the above code now works as expected. I'm not sure of the precise version it was fixed in.