DOM Exception 12 when trying to stream MP3 through websocket

I am currently working on a small project where I want to split an mp3 into frames, send them to a client (browser) through a websocket and then play them back using WebAudio (webkitAudioContext). My server is running nodejs and to transfer the data as binary, I use binaryJS. The browser I am testing with is Chrome 25.0.1354.0 dev, running on Ubuntu 12.04.
I have gotten as far as successfully splitting the mp3 into frames, or, at least, based on my tests, it seems to work. If I write the frames back into a file, mplayer has no problem playing back the file and also parses the header correctly. Each frame is stored in a nodejs Buffer of the correct size and the last byte of the buffer is always the first byte before the next sync word.
As an initial test, I am only sending the first MP3 frame. The client receives the frame successfully (stored in an ArrayBuffer), and the buffer contains the correct data. However, when I call decodeAudioData, I get the following message:
Uncaught Error: SyntaxError: DOM Exception 12
My function, where I call decodeAudioData, looks like this:
streamDone = ->
  bArray = new Uint8Array(arr[0].byteLength)
  console.log "Stream is done, bytes", bArray.length
  context.decodeAudioData bArray, playAudio, err
The initial frame that I am trying to decode can be found here.
I have been banging my head against the wall for a couple of days now trying to solve this. Has anyone managed to successfully decode mp3 frames and can see what I am doing wrong? I have found two related questions on StackOverflow, but the answers did not help me solve my problem. However, according to the accepted answer here, my frame should qualify as a valid mp3 chunk and, thus, be decoded.
Thanks in advance for any help!

Turns out that a break and some fresh eyes can work wonders: a general code cleanup solved the issue. If anyone is interested in the code, I published it here.
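For reference, a minimal sketch of what the cleanup likely amounted to (my assumption from the snippet above, not the author's published code): decodeAudioData() expects an ArrayBuffer, and the original snippet allocated an empty Uint8Array without ever copying the received bytes into it.

// Plain-JS equivalent; assumes arr[0] already holds the received ArrayBuffer.
function streamDone() {
    var frame = arr[0];
    console.log("Stream is done, bytes", frame.byteLength);
    // Pass the ArrayBuffer itself, not a (still empty) Uint8Array of its length.
    context.decodeAudioData(frame, playAudio, err);
}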

Related

WebAudioAPI decodeAudioData() giving null error on iOS 14 Safari

I have an mp3 audio stream player that works well in every desktop browser, using MediaSourceExtensions with a fallback to WebAudioAPI for those browsers that do not support MSE. iOS Safari is one such browser, and should theoretically support mp3 decoding via the Web Audio API without issues.
I've been struggling to get iOS Safari to properly play the mp3 audio chunks that are being returned from the stream. So far, it's the only browser that seems to have issues and I can't for the life of me figure out what's going on. Sadly, there isn't a whole lot of information on corner cases like this and the other questions here on StackOverflow haven't been any help.
Here's the relevant part of my js code where things are getting hung up. It's a callback function for an async fetch() process that's grabbing the mp3 data from the stream.
async function pushStream(value) {
    // Web Audio streaming for browsers that don't support MSE
    if (usingWebAudio) {
        // convert the stream Uint8Array to an ArrayBuffer
        var dataBuffer = value.stream.buffer;
        // decode the raw mp3 chunks
        audCtx.decodeAudioData(dataBuffer, function(newData) {
            // add the decoded data to the buffer
            console.log("pushing new audio data to buffer");
            webAudioBuffer.push(newData);
            // if we have audio in the buffer, play it
            if (webAudioBuffer.length) {
                scheduleWebAudio();
            }
        }, function(e) {
            console.error(e);
        });
    }
}
What I'm seeing is the error callback being fired and printing null: null as its error message (very helpful). Every so often, I will see the console print pushing new audio data to buffer, but this seems to only happen about once every few minutes while the stream is playing. Almost all the stream data is erroring out during the decode and the lack of useful error messages is preventing me from figuring out why.
As far as I can tell, iOS Safari should support mp3 streams without any issues. It also should support the decodeAudioData() function. Most of the other answers I was able to find related to trying to play audio before the user interacts with the screen. In my case, I start the audio using a play button on the page, so I don't believe that's the problem either.
One final thing: I'm developing on Windows and using the remotedebug iOS adapter. This could possibly be the reason why I'm not getting useful debug messages; however, all other debug and error prints seem to work fine, so I don't believe that's the case.
Thanks in advance for any help!
Unfortunately there is a bug in Safari which causes it to reject the decodeAudioData() promise with null. From my experience this happens in cases where it should actually reject the promise with an EncodingError instead.
The bug can be reproduced by asking Safari to decode an image. https://github.com/chrisguttandin/standardized-audio-context/blob/9c705bd2e5d8a774b93b07c3b203c8f343737988/test/expectation/safari/any/offline-audio-context-constructor.js#L648-L663
In general decodeAudioData() can only handle full files. It isn't capable of decoding a file in chunks. The WebCodecs API is meant to solve that but I guess it won't be available on iOS anytime soon.
However there is one trick that works with MP3s because of their internal structure. MP3s are built out of chunks themselves, and any number of those chunks forms a technically valid MP3. That means you can pre-process your data by making sure that each of the buffers that you pass on to decodeAudioData() begins and ends exactly at those internal chunk boundaries. The phonograph library, for example, follows that principle.
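A minimal sketch of that pre-processing idea (my own illustration, not phonograph's code; it assumes frame starts can be located by the MP3 sync word, a 0xFF byte followed by a byte whose top three bits are set, and it ignores the possibility of false syncs inside frame data):

// Buffers incoming MP3 bytes and returns only whole frames for decodeAudioData().
let leftover = new Uint8Array(0);

function extractWholeFrames(chunk) {
    // Prepend whatever was left over from the previous chunk (chunk is a Uint8Array).
    const data = new Uint8Array(leftover.length + chunk.length);
    data.set(leftover, 0);
    data.set(chunk, leftover.length);

    // Scan backwards for the start of the last (possibly incomplete) frame.
    let lastSync = -1;
    for (let i = data.length - 2; i >= 0; i--) {
        if (data[i] === 0xff && (data[i + 1] & 0xe0) === 0xe0) {
            lastSync = i;
            break;
        }
    }
    if (lastSync <= 0) {
        leftover = data; // no boundary found yet, keep buffering
        return null;
    }

    // Everything before the last sync word is a run of complete frames.
    leftover = data.slice(lastSync);
    return data.slice(0, lastSync).buffer; // an ArrayBuffer for decodeAudioData()
}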

Websocket saturation in Chrome, blob points to data which does not exist

I've got one of those very difficult-to-debug problems in my app.
I'm using websockets to receive images from my server. I'm getting around 50 images per second in binary and showing them in a canvas as images.
Everything works ok, but sometimes I'm getting an error:
FileReader error. Name: NotFoundError Message: A requested file or directory could not be found at the time an operation was processed.
My code looks like this:
wsVisualizerManager.onMessage = function(e)
{
if (e.data instanceof Blob)
{
var blob = e.data;
this.reader.readAsArrayBuffer(blob);
}
}
Basically I get a blob and I read it with a FileReader object (this.reader) to be able to visualize it as an image.
As I said, everything works, but sometimes, seemingly because more images arrive than it can handle, I start getting this error.
When I start getting this error I don't get it once, I get it many times, so my app is blocked for a long time. If I open the Chrome developer tools the error stops immediately and my app works again (so maybe the developer tools are cleaning some buffers?).
So I guess it is some kind of buffer that gets completely full, but the thing is: if I have to lose old messages I'm OK with that, but why is onMessage called with a badly formed or nonexistent blob?
I don't know exactly how to debug this error, because the WebSocket API has just a few methods and none of them allows me to access a buffer or clean it.
A way to generate this error manually (otherwise I get it just twice a day) is to put a breakpoint right at the beginning of onMessage and wait some time. What happens then is that when I press play I start getting this error, and after some seconds I start receiving delayed images. So it seems that it's not removing the received images, or at least not all of them.
I'm developing the server part too, using C++ with the websocket++ library. My Chrome version is 43.0.2357.125 (64-bit).
Any clue about how to proceed with debugging this error or how to solve the problem?
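For what it's worth, the answer to the blob/404 question further down reports that setting ws.binaryType = 'arraybuffer' avoided a very similar Blob problem in Chrome. A minimal sketch of applying the same trick here (only an assumption that it transfers; this.visualize is a hypothetical stand-in for whatever the FileReader result was used for):

// Ask the socket for ArrayBuffers so no Blob or FileReader is involved at all.
ws.binaryType = "arraybuffer";

wsVisualizerManager.onMessage = function(e)
{
    if (e.data instanceof ArrayBuffer)
    {
        // e.data is already an ArrayBuffer; hand it straight to the visualizer.
        this.visualize(new Uint8Array(e.data));
    }
}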

Send video stream from iOS7 camera to JavaScript Canvas in web view

First a word of caution: this question is not suitable for the faint of heart. It is an interesting challenge that I have encountered recently. See if you can solve it or help in any way to get closer to an answer, at your own risk.
Here is the problem: Create an iOS application with a standard UIWebView inside. Obtain camera stream from either camera. Send each frame in a format that can be rendered into an HTML5 canvas. Make this happen efficiently so that video stream can be displayed at 720p 30fps or higher in an iOS7 device.
So far I have not found any solution that looks promising. In fact I started with the solution that looked most ridiculous, which is encoding each frame as a base64 image string and passing it to the web view via stringByEvaluatingJavaScriptFromString. Here is the method that does the JPEG encoding:
- (NSString *)encodeToBase64StringJPEG:(UIImage *)image {
    return [UIImageJPEGRepresentation(image, 0.7) base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
Inside the viewDidLoad I create and configure the capture session
_output = [[AVCaptureVideoDataOutput alloc] init];
// create a queue to run the capture on
dispatch_queue_t captureQueue=dispatch_queue_create("captureQueue", NULL);
// setup output delegate
[_output setSampleBufferDelegate:self queue:captureQueue];
// configure the pixel format (could this be the problem? Is this suitable for JPEG?)
_output.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey, nil];
[_session addOutput:_output];
[_session startRunning];
The frames are captured and converted to UIImage first.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    _image = imageFromSampleBuffer(sampleBuffer);
    // here comes the ridiculous part. Attempt to encode the whole image and send it to JS land:
    _base64imgCmd = [NSString stringWithFormat:@"draw('data:image/png;base64,%@');", [self encodeToBase64StringJPEG:_image]];
    [self.webView stringByEvaluatingJavaScriptFromString:_base64imgCmd];
}
Guess what, it did not work. Xcode is showing me this error:
DemoNativeToJS[3983:1803] bool _WebTryThreadLock(bool), 0x15e9d460: Tried to obtain the web lock from a thread other than the main thread or the web thread. This may be a result of calling to UIKit from a secondary thread. Crashing now...
1 0x355435d3 WebThreadLock
2 0x3013e2f9 <redacted>
3 0x7c3a1 -[igViewController captureOutput:didOutputSampleBuffer:fromConnection:]
4 0x2c5bbe79 <redacted>
Is this error because the WebView is running out of memory quota? I should note that there is a big spike in the app memory usage just before the crash. It crashes anywhere between 14MB and above 20MB, depending on the quality level set for JPEG encoding.
I do not want to render the camera stream natively -- that won't be an interesting problem at all. I want to pass the video feed to JavaScript land and draw it inside the canvas.
For your convenience I have a minimal demo project (Xcode) on github that you can use to get a quick head start:
git clone https://github.com/arasbm/DemoNativeToJS.git
Please let me know if you have other more sane ideas for passing the data through instead of using stringByEvaluatingJavaScriptFromString. If you have other ideas or suggestions feel free to let me know in comments, but I would expect the answer to demonstrate with some code what path will work.
The crash you are experiencing is due to UIWebKit attempting to call UIKit from a background thread. The easiest way to prevent this is to force stringByEvaluatingJavaScriptFromString, which is what makes the call into UIKit, to run on the main thread. You can do this by changing
[self.webView stringByEvaluatingJavaScriptFromString:_base64imgCmd];
To this
[self.webView performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:) withObject:_base64imgCmd waitUntilDone:NO];
This will now make the call to UIKit from the main execution thread, which is safe.

Browser throws error on creating an ObjectURL of an Image Blob after consuming lot of memory

Well, I'm running into a strange error while programming a web application that receives images from a server via WebSockets. The server sends about 8 images per second (.bmp) to the browser. Each image has a size of about 300KB. So that's around 2.4MB/s.
The browser receives the images as binary blob:
//WebSocket
var ws = new WebSocket("ws://192.168.0.10:1337");
//Image
var camImg = new Image();

ws.onmessage = function(msg)
{
    var data = msg.data;
    // handle binary messages from server
    if (data instanceof Blob) camImg.src = window.URL.createObjectURL(data);
};

camImg.onload = function()
{
    //draw image to canvas
    canvasCont2D.drawImage(this,0,0);
    //request next frame
    ws.send("give me the next image!");
    //delete ObjectURL
    window.URL.revokeObjectURL(this.src);
};
So up to this point everything runs fine. Now for the first problem:
As I was testing this in Chrome, I watched the Task Manager to see how many resources this code needs. I saw one Chrome process that started at about 90MB of memory. Each second, another 2.4MB was added. So it looks like every image I receive stays in memory. Is there any way to prevent this? The received blobs also stay listed under resources in the Chrome developer tools, btw.
Anyway, this problem leads me to the second one: the memory consumption of this process rises and rises, and after some time, at about 400-500MB, it's kind of flushed and starts again at 90MB, rising again. So far, it's just a memory problem. But sometimes it happens that the memory is not flushed and rises up to about 600MB. At this point I don't receive any new images. The console shows an error that says:
Failed to load resource: the server responded with a status of 404 (Not Found)
This error occurs in this line:
camImg.src = window.URL.createObjectURL(data);
At the moment I work around this issue by catching the error event:
camImg.onerror = function()
{
    //request next frame anyway and wait for memory flush
    ws.send("give me the next image!");
};
So I'm just requesting new images, because after some time (a few seconds) the memory gets flushed again and I can receive new images.
The same problem(s) occur using Opera as well. I guess it's mainly a problem with memory consumption. Maybe a bug in browsers? Or did I make a big programming error?
I would be very thankful for any help as I have no idea left, what could be causing this problem...
OS: Windows7 64bit
Chrome Version 35.0.1916.153 m
Chrome Version 38.0.2068.0 canary (64-bit) : (chrome://flags/#impl-side-painting setting makes no difference).
In a prototype I'm doing, I get exactly the same behaviour as this in Chrome 35 and a recent Canary build. OK in IE and Firefox. I'm running a localhost C++ websocket server at about 10fps with 0.5MB images.
The Chrome memory usage eventually goes up, and something trashes Chrome too.
Moving forwards:
1) In image.onerror I call window.URL.revokeObjectURL(this.src); This seems to sort my memory leak out, but not the 404s.
2) When running under the F12 debugger things are so slow that I don't seem to get the problem. Thus on the page I have 3 counters: 1) Blobs received count, 2) image.onload count and 3) image.onerror count.
After approx 900 successful loads I start getting load failures. Then after maybe 50 failures, I start getting successful loads again. This pattern keeps repeating, but the numbers seem random. (This all smacks of some GC-related issue, but that's only a hunch based on experience.)
3) I can fix (AKA 'bodge') this by changing ws.binaryType='arraybuffer'. I need a blob, so I construct a new one based on a new Uint8Array(msg.data). Everything works fine, no load failures at all; see the sketch after this list.
I'm making an unnecessary binary copy here, but it doesn't seem to make any noticeable speed difference. I'm not 100% sure what's going on here or how stable the fix is.
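A minimal sketch of that workaround (my reconstruction from the description above, not matt's actual code; the image/bmp type matches the question's .bmp frames but is an assumption):

// Receive ArrayBuffers instead of Blobs, then wrap each one in a fresh Blob.
ws.binaryType = "arraybuffer";

ws.onmessage = function(msg)
{
    if (msg.data instanceof ArrayBuffer)
    {
        // Copy the bytes into a new Blob before creating the object URL.
        var blob = new Blob([new Uint8Array(msg.data)], {type: "image/bmp"});
        camImg.src = window.URL.createObjectURL(blob);
    }
};

camImg.onerror = function()
{
    // Release the URL even on failure (point 1 above), then continue.
    window.URL.revokeObjectURL(this.src);
    ws.send("give me the next image!");
};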
Most similar image-loading examples on the internet don't have an onerror handler. Running such examples on my machine would result in an unexplainable memory leak. You wouldn't see the 404s unless you were under the debugger and lucky. There are lots of people on the internet complaining about memory leaks when loading images. Maybe it's related.
I'm going to raise this issue on the chromium forums.
hope this helps ... matt

Html5 Audio plays only once in my Javascript code

I have a dashboard web app that I want to play an alert sound if it's having problems connecting. The site's ajax code will poll for data and throttle down its refresh rate if it can't connect. Once the server comes back up, the site will continue working.
In the meantime I would like a sound to play each time it can't connect (so I know to check the server). Here is that code. This code works.
var error_audio = new Audio("audio/"+settings.refresh.error_audio);
error_audio.load();

//this gets called when there is a connection error.
function onConnectionError() {
    error_audio.play();
}
However, the second time through the function the audio doesn't play. Digging around in Chrome's debugger, the 'played' attribute on the audio element gets set to true. Setting it to false has no effect. Any ideas?
I encountered this just today; after more searching I found that you must set the src property on the audio element again to get it to restart. Don't worry, no network activity occurs, and the operation is heavily optimized.
var error_audio = new Audio("audio/"+settings.refresh.error_audio);
error_audio.load();

//this gets called when there is a connection error.
function onConnectionError() {
    error_audio.src = "audio/"+settings.refresh.error_audio;
    error_audio.play();
}
This behavior shows up in Chrome 21. Firefox doesn't seem to mind setting the src twice either!
Try setting error_audio.currentTime to 0 before playing it. Maybe it doesn't automatically go back to the beginning.
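In code, that suggestion amounts to something like this (reusing the question's onConnectionError):

function onConnectionError() {
    error_audio.currentTime = 0; // rewind to the start before replaying
    error_audio.play();
}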
You need to implement the Content-Range response headers, since Chrome requests the file in multiple parts via the Range HTTP header.
See here: HTML5 <audio> Safari live broadcast vs not
Once that has been implemented, both the play() function and setting the currentTime property should work.
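For illustration, a rough sketch of Range handling in a Node.js server (an illustration only; the file path and port are made up, and a real handler would need to validate malformed ranges):

const http = require("http");
const fs = require("fs");

http.createServer(function (req, res) {
    const path = "audio/error.mp3"; // hypothetical file
    const size = fs.statSync(path).size;
    const range = req.headers.range; // e.g. "bytes=0-" from Chrome

    if (range) {
        const parts = range.replace("bytes=", "").split("-");
        const start = parseInt(parts[0], 10) || 0;
        const end = parts[1] ? parseInt(parts[1], 10) : size - 1;
        res.writeHead(206, {
            "Content-Range": "bytes " + start + "-" + end + "/" + size,
            "Accept-Ranges": "bytes",
            "Content-Length": end - start + 1,
            "Content-Type": "audio/mpeg"
        });
        fs.createReadStream(path, {start: start, end: end}).pipe(res);
    } else {
        res.writeHead(200, {"Content-Length": size, "Content-Type": "audio/mpeg"});
        fs.createReadStream(path).pipe(res);
    }
}).listen(8080);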
Q: I’VE GOT AN AUDIOBUFFERSOURCENODE, THAT I JUST PLAYED BACK WITH NOTEON(), AND I WANT TO PLAY IT AGAIN, BUT NOTEON() DOESN’T DO ANYTHING! HELP!
A: Once a source node has finished playing back, it can’t play back more. To play back the underlying buffer again, you should create a new AudioBufferSourceNode and call noteOn().
Though re-creating the source node may feel inefficient, source nodes are heavily optimized for this pattern. Plus, if you keep a handle to the AudioBuffer, you don't need to make another request to the asset to play the same sound again. If you find yourself needing to repeat this pattern, encapsulate playback with a simple helper function like playSound(buffer).
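A minimal version of that helper (assuming an existing AudioContext named context; noteOn() was the old name for what is now start()):

function playSound(buffer) {
    var source = context.createBufferSource(); // a fresh source node every time
    source.buffer = buffer;                    // the AudioBuffer itself is reusable
    source.connect(context.destination);
    source.start(0);                           // noteOn(0) in older implementations
}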
Q: WHEN PLAYING BACK A SOUND, WHY DO YOU NEED TO MAKE A NEW SOURCE NODE EVERY TIME?
A: The idea of this architecture is to decouple audio asset from playback state. Taking a record player analogy, buffers are analogous to records and sources to play-heads. Because many applications involve multiple versions of the same buffer playing simultaneously, this pattern is essential.
source:
http://updates.html5rocks.com/2012/01/Web-Audio-FAQ
You need to pause the audio just before its end and change the current playing time to zero, then play it.
Javascript/Jquery to control HTML5 audio elements - check this link - it explains how to handle/control HTML5 audio elements. It may help you!
Chrome/Safari have fixed this issue in newer versions of the browser, and the above code now works as expected. I am not sure of the precise version it was fixed in.
