I have a somewhat working system that:
Produces audio on a server into a 1-second WAV file
Reads the WAV file and sends it through a WebSocket
The browser passes the binary data to AudioContext.decodeAudioData
Decoded audio is buffered until 4 packets (4 seconds) have arrived
The buffer is processed and each clip is sent to AudioBufferSourceNode.start(time), where time = (clip_count * duration)
So if I have 4 audio clips, the calls would look like
AudioBufferSourceNode.start(0);
AudioBufferSourceNode.start(1);
AudioBufferSourceNode.start(2);
AudioBufferSourceNode.start(3);
I thought this would perfectly schedule 4 seconds of audio, but I seem to be facing clock issues, perhaps because I am expecting the audio clock to be perfect. I have already used a gain node to remove clicks between each 1-second sound clip, but I still get timing issues, either right away or after a long period of time. In the worst case, my audio plays like this:
+----------+----------+   +-------+   +-------+   +-------+
| 1 second | 1 second |   | 950ms |   | 900ms |   | 850ms |
+----------+----------+   +-------+   +-------+   +-------+
                        gap         gap         gap
In this diagram, "1 second" and "#ms" show how much audio actually plays; it should always be 1 second. As playback progresses, gaps also develop. It seems that when I tell the audio context to play a clip at exactly 0 it is fine, but later scheduled clips may or may not start on time.
Is this correct, or is something else going wrong in my system? Can I rely 100% on a clip starting at exactly the scheduled time, or do I need to add calculations that allow for a few ms of slack either way?
It looks like the tool that serves the purpose of this task is AudioWorkletNode.
According to the AudioBufferSourceNode documentation:
The AudioBufferSourceNode interface is an AudioScheduledSourceNode which represents an audio source consisting of in-memory audio data, stored in an AudioBuffer. It's especially useful for playing back audio which has particularly stringent timing accuracy requirements, such as for sounds that must match a specific rhythm and can be kept in memory rather than being played from disk or the network. To play sounds which require accurate timing but must be streamed from the network or played from disk, use an AudioWorkletNode to implement its playback.
This case is exactly streaming from the network; AudioBufferSourceNode is not designed to be fed on the fly from the network.
What can lead to desync:
By the nature of the JavaScript scheduler, there is no guarantee that code executes at an exact time. The process might be busy with another job at that moment, delaying the send.
A timer callback only fires on the next tick after the current work completes, which can take some time.
The client-side scheduler has even more restrictions than server-side ones: browsers clamp nested timers to a minimum of about 4 ms, so you get at most roughly 250 timer callbacks per second.
The API used here is not designed for this flow.
Recommendations:
Always keep a buffer. If for some reason the buffered frames have already played out, it may be reasonable to request new ones faster.
Grow the buffer on the fly. Starting playback after two messages is fine, but it may be reasonable to increase the number of buffered messages over time, to perhaps something like 15 seconds.
Prefer another tool for the connection and data transfer; nginx will serve perfectly here. Otherwise, a slow client connection will tie up the Node process until the data has been transferred.
If the connection drops for a second (on a mobile network, for example), there should be logic to resume from the proper frame, refill the buffer, and do all of that without interruption.
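Separately from buffering, gaps can accumulate when every clip is scheduled at clip_count * duration with a nominal 1-second length: decodeAudioData resamples to the context's sample rate, so a decoded clip's duration may not be exactly 1.0 s. A safer pattern is to keep a running timeline position built from each buffer's actual duration. A minimal sketch (the function and variable names are mine, not from the question):

```javascript
// Pure timeline helper: given the context clock "now", the time up to
// which audio is already scheduled, and the decoded clip's actual
// duration, return [startAt, newScheduledUntil].
function nextSlot(now, scheduledUntil, clipDuration) {
  // If we fell behind (buffer underrun), restart from "now";
  // otherwise butt this clip exactly against the previous one.
  const startAt = Math.max(now, scheduledUntil);
  return [startAt, startAt + clipDuration];
}

// Browser usage (sketch; ctx is an AudioContext, scheduledUntil starts at 0):
//   const src = ctx.createBufferSource();
//   src.buffer = decodedBuffer;
//   src.connect(ctx.destination);
//   let startAt;
//   [startAt, scheduledUntil] =
//     nextSlot(ctx.currentTime, scheduledUntil, decodedBuffer.duration);
//   src.start(startAt);
```

Using decodedBuffer.duration instead of a hard-coded 1.0 means rounding from decode/resampling cannot drift into audible gaps.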
Related
I'm making a website for someone who wants a constantly looping audio file that streams in sync to all users. For instance, if the website started looping the audio file at 12:00 and a user accessed it at 12:30, that user would hear the audio 30 minutes into the file.
In investigating possible methods to accomplish this, readableStream seemed like a good option, but I'm not sure how to implement it; I probably need to study more JavaScript before I can understand the resources that explain it.
Might anyone be able to help me understand how to code readableStream to accomplish this?
I currently have the website fully prepared aside from this crucial point; right now, it just starts playing the audio file from the beginning for any user who loads the page.
Not really my field but happy to try as requested! Wiser heads please step in. Assume you are using a fixed size (say mp3) file that is downloaded, and not streaming. And use your server clock rather than the user's clock for consistency among users.
Be aware that there normally has to be a user interaction (like a click) for media to play.
Use the modulus operator (%) to get the start point. e.g. with simple numbers, if the time now is +120 seconds from your "day start", and the file is 35 seconds long, you should scrub to 15 secs in the file. 120 % 35 = 15.
You can set the "scrub-to" time in the file URI when you deliver the HTML
<audio
  src="file.mp3#t=00:00:15"
  loop
  ...>
</audio>
or use JavaScript, e.g. see "HTML 5 <audio> - Play file at certain time point" and others,
which would also allow you to further increment the scrub after the download, while waiting for the user to click.
Hope that can get you started.
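The modulus idea above could be sketched like this (serverDayStartMs and the 35-second duration are placeholder assumptions; your backend would supply the real values):

```javascript
// Compute how far into the looping file a newly arrived user should start.
// nowMs and serverDayStartMs are both server-clock milliseconds, so all
// users compute the same offset regardless of their local clocks.
function syncedOffsetSec(nowMs, serverDayStartMs, fileDurationSec) {
  const elapsedSec = Math.floor((nowMs - serverDayStartMs) / 1000);
  return elapsedSec % fileDurationSec;
}

// After the required user interaction (click), seek and play:
//   audioEl.currentTime = syncedOffsetSec(serverNowMs, serverDayStartMs, 35);
//   audioEl.play();
```

With the example numbers from above, 120 elapsed seconds and a 35-second file give an offset of 15 seconds.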
I'm creating an application that plays a live mp3 stream from a server.
Basically, it's a node.js server that pipes PulseAudio output, encoded as mp3, to the browser over HTTP. On the browser side it's a simple HTML5 audio element.
I want as little latency as possible. Unfortunately, the plain HTML5 audio element introduces about 8 seconds (250 kilobytes) of lag in Chrome: it starts playing 8 seconds after the first byte was downloaded. I assume this is some kind of buffer, but I can't find any information on how to tweak it.
Is there a way to change this buffer length? I'd like it to be as short as 0.5-1 second, not 8.
Can anyone guide me on how to achieve this? I am listing the steps as pointers:
A Linux binary captures frames from the locally attached webcam and stores them in a folder. This is a continuous process; the images are numbered sequentially.
I have a webserver that outputs the latest image received from the webcam. This is a PHP file that fetches the most recent image and prints it out.
What I have now is a JavaScript snippet that refreshes the image every second and displays it in the img tag.
Though it works, the output is slow, updating only one frame at a time.
I am trying to display the images quickly, so that it looks like an MJPEG movie being played (not that it has to be that good; as I learned from the forums, HTTP does have its overhead).
<script type="text/javascript">
function refresh() {
  // Date.now() acts as a cache-buster so the browser re-fetches the image
  document.images["pic1"].src = "/latimage.php?camid=$selectedcamid&ref=" + Date.now();
  setTimeout(refresh, 1000); // pass the function itself, not a string
}
if (document.images) window.onload = refresh;
</script>
<img src='/latimage.php?camid=$selectedcamid' id='pic1'>
The above code works, but I'd like to display the frames obtained from the webcam faster: at least 3 to 4 frames per second.
As I understand from my searches so far, it is not feasible to refresh too quickly, because each HTTP request takes time.
I am trying to find a method by which I can prefetch 100 frames into an image array (I would call it buffering) and display one image at a time at a rate of 3 images/second.
While displaying the images, the oldest should be removed from the array and the newest fetched ones appended at the end, so the loop is infinite.
I am sorry for asking so many questions; I am unable to find a proper direction to start with. I can do the above in a .NET Windows application quite easily, but in the web browser I cannot get any ideas. I am not sure whether a jQuery image array, JSON, or plain JavaScript would do.
I need some guidance, please.
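The buffering described above could be sketched as a small fixed-capacity queue. The browser wiring in the comments is an assumption built around the latimage.php endpoint from the question:

```javascript
// Pure frame buffer: push new frames, drop the oldest past capacity,
// drain one frame per display tick (capacity 100 as described above).
class FrameBuffer {
  constructor(capacity) {
    this.capacity = capacity;
    this.frames = []; // oldest first
  }
  push(frame) {
    this.frames.push(frame);
    if (this.frames.length > this.capacity) this.frames.shift();
  }
  next() {
    return this.frames.shift(); // undefined when empty
  }
}

// Browser wiring (sketch): keep fetching via `new Image()` with a
// Date.now() cache-buster, then drain at 3 fps independently:
//   const buf = new FrameBuffer(100);
//   setInterval(() => {
//     const f = buf.next();
//     if (f) document.images["pic1"].src = f.src;
//   }, 1000 / 3);
```

Decoupling fetch speed from display speed this way is what makes the playback rate steady even when individual HTTP requests are slow.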
If you need to capture the camera output to disk, then I suggest capturing the camera output as video (at 3 FPS) and then streaming that video file to your browser using WebSockets. Here is an example of doing that. If you are willing to run nginx on your server then live_thumb is a complete solution that captures and streams video via WebSockets.
On the other hand, if your goal is just to view the output of the camera and you don't need to store the video, you could consider using WebRTC and running a browser at both ends and then just hooking up the media stream. In other words one browser (perhaps a headless variant) would run on the system with your camera and would stream the video to your other browser using WebRTC. With WebRTC you could get much higher frame rates and your bandwidth would probably still be significantly lower than sending individual images at a slow frame rate.
I want to randomly seek to different points in a ~30 minute video every 30 seconds. The file size will be 100 MB. When I seek, does the player start loading from that point, or does it have to load the entire file and then find that time within it?
It depends on the browser. In a modern browser, when you seek, the player will typically send a new HTTP request to the server containing a Range: header indicating which "chunk" of the file it wants to load. This only works over HTTP 1.1 or higher, but if the browser supports HTML5 video you can be fairly certain it uses at least HTTP 1.1. Keep in mind, though, that the client will typically always be loading something: if you seek to 5 seconds into the video, it will essentially start loading the rest of the file from there until another seek happens.
No, it starts loading from the given timestamp, as long as the browser knows the duration of the video.
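As a rough sketch of the kind of ranged request a seeking browser issues (the linear byte estimate below is a naive assumption; real players locate the offset from the container's index, not file-size proportion):

```javascript
// Approximate the Range header a player might send when seeking.
// Assumes roughly constant bitrate; only illustrative.
function rangeHeaderFor(seekSeconds, durationSeconds, fileBytes) {
  const byteOffset = Math.floor((seekSeconds / durationSeconds) * fileBytes);
  return `bytes=${byteOffset}-`; // open-ended: load from here onward
}

// e.g. seeking 15 minutes into a 30-minute, 100 MB file:
console.log(rangeHeaderFor(900, 1800, 100 * 1024 * 1024));
// → "bytes=52428800-"
```

The server answers such a request with 206 Partial Content, which is why only the tail of the file is transferred after each seek.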
I'm working on a video-heavy site where an event triggers a few videos to start playing, but one of the larger ones unloads itself after a second or two, resulting in an error:
FAILED TO LOAD RESOURCE ERROR
even though it was loaded a moment ago.
Staggering the buffering of each video helps slightly, but the unloading still happens occasionally. Any suggestions on managing this issue would be greatly appreciated.
There is perhaps too little information in the post to give an exact answer, but I would look into bandwidth (computer and internet) and video bit-rates as a first step. What dimensions the videos have and what bit-rate they are encoded at would be important questions (HD, PAL/NTSC, custom).
Bandwidth problems can happen at several stages:
Is the server capable of delivering the total bit-rate required (the sum of the video bit-rates plus overhead)? It must sustain this bit-rate continuously as a minimum. This is not just about the server's internet bandwidth but also factors such as loading from storage, server load, and so forth.
Is the internet connection (the bottleneck point) able to pass this bit-rate through? If the total bit-rate of the videos exceeds the available bandwidth, including overhead, you won't be able to load the streams fast enough.
Is the computer able to buffer and decode all these video streams simultaneously? If the videos are, for example, HD (even when scaled down in the browser window, each frame is decoded at its full dimensions), the computer needs to decode a huge amount of data even with hardware acceleration.
It could be any of these really, but I would perhaps start with point 3 if you already know your internet connection is more than capable (including overhead). Also, if the browser uses the disk as a temporary cache for the buffer, the disk becomes a factor as well (seek times, fragmentation).
To eliminate possibilities, find out the bit-rate of each video, sum them, and see if your internet connection can handle the total. If it can, test against the server to see if it has trouble delivering the content streams. If neither shows signs of problems, run your application with the videos served from local disk (through a local server) and see if your computer can decode them all simultaneously.
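The arithmetic in the first check could be sketched as follows (the 20% overhead factor and the example numbers are assumptions, not measurements):

```javascript
// Sum the per-video bit-rates, apply a rough protocol/container overhead
// factor, and compare against the measured connection speed.
function connectionCanHandle(bitratesKbps, linkKbps, overhead = 1.2) {
  const required = bitratesKbps.reduce((a, b) => a + b, 0) * overhead;
  return required <= linkKbps;
}

// e.g. three 2500 kbps videos on a 10 Mbps link:
console.log(connectionCanHandle([2500, 2500, 2500], 10000)); // → true
```

If this check fails, the connection is already ruled out as the bottleneck's sole cause and the server and decoding tests above become the next step.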
Even if unlikely, there is also the possibility of (packet) errors in transmission despite good bandwidth, as well as problems in the video streams' encoding themselves (general file errors, an atypical encoding scheme if these are video container files, etc.).