I'm currently using node-lame to encode a raw PCM input stream, and I have the following code in Node.js that successfully outputs binary MP3 chunks:
const server = require('http').createServer();
server.on('request', (req, res) => {
    res.setHeader('Content-Type', 'audio/mpeg');
    encoded.pipe(res);
});
server.listen(8000);
I try to request this stream from my front-end interface with code like the following:
var audio = new Audio('http://localhost:8000/a.mp3'); // the above
audio.play();
However, as the audio source is a continuous input stream, the content just keeps getting downloaded without end.
Instead, I want to be able to play the chunks as they are downloaded.
I can access http://localhost:8000/a.mp3 in an application like VLC or Quicktime Player, and the audio delivery works fine; I'm just stumped as to how to best do this on the web.
Thanks in advance.
This code works for us:
<audio id="music" preload="all">
<source src="http://localhost:8000/a.mp3">
</audio>
<script>
let music = document.getElementById('music');
music.play();
</script>
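Note that modern browsers usually block unmuted autoplay, so the play() call may be rejected until the user interacts with the page. A minimal sketch of working around that, assuming a hypothetical play button:

// start playback from a user gesture to satisfy the autoplay policy
document.getElementById('play').addEventListener('click', () => music.play());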
Related
I am not able to play an MP4 (HD) video in the UI when it is received from the Django backend. I am using plain JavaScript on the UI and Django on the backend. Please find the backend code snippet:
file = FileWrapper(open(path, 'rb')) #MP4 file path is media/1648477263566_28-03-2022 19:51:05_video.mp4
response = HttpResponse(file, content_type=content_type)
response['Content-Disposition'] = 'attachment; filename=my_video.mp4'
return response
The video plays perfectly in Postman but can't play on the UI screen. The UI code is below:
function getUploadedImageAndVideo(combined_item_id) {
    var request = {};
    request["combined_item_id"] = combined_item_id;
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            var vdata = this.responseText;
            var src1 = document.getElementById('src1');
            src1.setAttribute("src", "data:video/mp4;base64," + vdata);
            //src1.setAttribute("src", vdata); // doesn't work either
            var src2 = document.getElementById('src2');
            src2.setAttribute("src", "data:video/mp4;base64," + vdata);
            //src2.setAttribute("src", vdata); // doesn't work either
            return;
        }
    };
    xhttp.open("POST", port + host + "/inventory_apis/getUploadedImageAndVideo", true);
    xhttp.setRequestHeader("Accept", "video/mp4");
    xhttp.setRequestHeader("Content-type", "application/json");
    xhttp.setRequestHeader("X-CSRFToken", getToken());
    xhttp.send(JSON.stringify(request));
}
On the HTML side:
<video controls>
    <source type="video/webm" src="" id="src1">
    <source type="video/mp4" src="" id="src2">
</video>
The network response (200 OK) of the function call is: "ftypmp42 ... isommp42 ... mdat ... ó! ... °}b ... $¥Ð ...", a very long text dump of the video file.
I am not able to play the video on the UI side. Please help.
Browsers used: Chrome and Firefox.
An alternative would be to play directly from the media URL, but here I deliberately want to edit the video on the backend first, so I'm stuck on this issue.
Looking at "ftypmp42 ... isommp42 ... mdat ... ó! ... °}b ... $¥Ð ..."
An MP4 file is divided into two parts.
First is the MOOV atom, which holds the metadata and must be processed before playback can begin. For example, the metadata records the byte positions of all the individual frames; without it, the decoder cannot begin playback.
Second is the MDAT atom, which holds the actual media data (the audio/video data without headers, since that information now lives in MOOV).
It seems your video has MDAT first, so all the MDAT bytes must be downloaded before the metadata is reached. In other words, your file must be completely downloaded before it can play.
Solution:
Use a tool to move the MOOV atom to the front of the file. You can try command-line tools like FFmpeg or MP4Box, or an app like HandBrake.
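For example, with FFmpeg this is a quick remux that copies the streams unchanged and relocates the MOOV atom (input.mp4 and output.mp4 are placeholder names):

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4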
I'm building an application using PizzicatoJS + HowlerJS. Those libraries essentially allow me to play multiple audio files at the same time. Imagine 4 audio tracks, each containing an instrument: guitar, bass, drums, vocals, etc.
Everything plays fine when using PizzicatoJS's Group functionality or running a forEach loop over all my Howl sounds and firing .play(). However, I would like to download the final resulting sound I am hearing from my speakers. Any idea how to approach that?
I looked into OfflineAudioContext, but I am unsure how to use it to generate an audio file. It looks like it needs an audio source, like an <audio> tag. Is what I'm trying to do possible? Any help is appreciated.
I think the OfflineAudioContext can help with your use case.
Let's say you want to create a file with a length of 10 seconds. It should contain one sound playing from the start up to second 8. And there is also another sound which is supposed to start at second 5 and should last until the end. Both sounds are AudioBuffers (named soundBuffer and anotherSoundBuffer) already.
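If the AudioBuffers still need to be created first, here is a minimal sketch using fetch() and decodeAudioData(); the two URLs are made-up placeholders, and the code assumes it runs inside an async function:

// hedged sketch: load and decode both files (the URLs are hypothetical)
const decodingContext = new AudioContext();
const [soundBuffer, anotherSoundBuffer] = await Promise.all(
    ['sound.mp3', 'another-sound.mp3'].map(async (url) => {
        const response = await fetch(url);
        return decodingContext.decodeAudioData(await response.arrayBuffer());
    })
);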
You could arrange and combine the sounds as follows.
const sampleRate = 44100;
const offlineAudioContext = new OfflineAudioContext({
    length: sampleRate * 10,
    sampleRate
});

const soundSourceNode = new AudioBufferSourceNode(offlineAudioContext, {
    buffer: soundBuffer
});

soundSourceNode.start(0);
soundSourceNode.stop(8);
soundSourceNode.connect(offlineAudioContext.destination);

const anotherSoundSourceNode = new AudioBufferSourceNode(offlineAudioContext, {
    buffer: anotherSoundBuffer
});

anotherSoundSourceNode.start(5);
anotherSoundSourceNode.stop(10);
anotherSoundSourceNode.connect(offlineAudioContext.destination);

offlineAudioContext
    .startRendering()
    .then((audioBuffer) => {
        // save the resulting buffer as a file
    });
Now you can use a library to turn the resulting AudioBuffer into an encoded audio file. One library that does this, for example, is audiobuffer-to-wav.
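A minimal sketch of that last step, assuming audiobuffer-to-wav is installed and bundled (the file name mix.wav is arbitrary):

import toWav from 'audiobuffer-to-wav';

offlineAudioContext
    .startRendering()
    .then((audioBuffer) => {
        // encode the rendered AudioBuffer as a WAV file (an ArrayBuffer)
        const wav = toWav(audioBuffer);
        const blob = new Blob([wav], { type: 'audio/wav' });

        // trigger a download in the browser
        const anchor = document.createElement('a');
        anchor.href = URL.createObjectURL(blob);
        anchor.download = 'mix.wav'; // arbitrary file name
        anchor.click();
        URL.revokeObjectURL(anchor.href);
    });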
I want to record audio from a video element alongside recording from a canvas.
I have
var stream = canvas.captureStream(29);
Now I am adding the audio track of the video to the stream:
var vStream = video.captureStream();
stream.addTrack(vStream.getAudioTracks()[0]);
But this slows down performance with every video added, as captureStream() is very heavy on video elements and also requires a flag to be enabled in Chrome. Is there a way to create an audio-only MediaStream from a video element without using captureStream()?
Yes, you can use the Web Audio API's createMediaElementSource method, which will grab the audio from your media element, and then the createMediaStreamDestination method, which will create a MediaStreamAudioDestinationNode containing a MediaStream.
You then just have to connect it all, and you've got a MediaStream carrying your media element's audio.
// wait for the video to start playing
vid.play().then(_ => {
    var ctx = new AudioContext();
    // create a source node from the <video>
    var source = ctx.createMediaElementSource(vid);
    // now a MediaStream destination node
    var stream_dest = ctx.createMediaStreamDestination();
    // connect the source to the MediaStream destination
    source.connect(stream_dest);
    // grab the real MediaStream
    out.srcObject = stream_dest.stream;
    out.play();
});
The video's audio will be streamed to this audio element: <br>
<audio id="out" controls></audio><br>
The original video element: <br>
<video id="vid" crossOrigin="anonymous" src="https://dl.dropboxusercontent.com/s/bch2j17v6ny4ako/movie720p.mp4?dl=0" autoplay></video>
Note that you could also connect more sources to this stream, and that you can combine it with another video stream via the new MediaStream(tracks) constructor, which takes an array of MediaStreamTracks rather than whole streams (it's currently the only way to combine different streams on FF, until this bug is fixed, which should be soon).
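For the question's canvas use case, a minimal sketch of that combination (canvas is the question's canvas element, and stream_dest comes from the code above):

// hedged sketch: merge the canvas video track with the extracted audio track
var canvasStream = canvas.captureStream(29); // 29 fps, as in the question
var combined = new MediaStream(
    canvasStream.getVideoTracks().concat(stream_dest.stream.getAudioTracks())
);
// `combined` can now be recorded or transmitted like any other MediaStream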
So I am trying to stream and download an audio file to the browser.
My files are stored on a separate drive from the web application.
Currently I am using Java to send the file:
public static Result recording(Long id) {
    PhoneCall currentCall = PhoneCall.getCall(id);
    if (currentCall != null) {
        File wavFile = new File("D:\\audio\\" + currentCall.fileName);
        return ok(wavFile);
    } else {
        return badRequest();
    }
}
This works fine in that I can stream the audio and it plays nicely, and I am playing the audio through an HTML5 <audio> element.
But I am trying to use a slider control, setting audio.currentTime so that a user can start playback midway through, and it seems that I cannot set this value. I also cannot get the duration of the audio file from audio.duration.
Even when I use <audio controls="controls">, it does not let me play from midway through.
Is there something I am missing? Can I get JavaScript to download the whole file before it starts playing so that I can get all this information? Or is there another way to send this using the Play Framework?
I have a simple webpage where you can stream your webcam. I would like to take this stream and send it somewhere, but apparently I can't really access the stream itself. I have this code to run the stream:
navigator.webkitGetUserMedia({video: true}, gotStream, noStream);
And in gotStream, I tried many things to "redirect" this stream somewhere else, for example:
function gotStream(stream) {
    stream_handler(stream);
    //other stuff to show webcam output on the webpage
}
or
function gotStream(stream) {
    stream.videoTracks.onaddtrack = function(track) {
        console.log("in onaddtrack");
        stream_handler(track);
    };
    //other stuff to show webcam output on the webpage
}
But apparently the gotStream function gets called only once, at the beginning, when the user grants the webcam permission to stream. Moreover, the stream variable is not the raw stream itself but an object with some properties inside. How am I supposed to access the stream itself and redirect it wherever I want?
EDIT: You may be familiar with webglmeeting, a sort of face-to-face conversation tool apparently built on top of WebRTC. I think that script somehow sends the stream of data from one point to the other. I would like to achieve the same, by first understanding how to get at the stream of data.
RE-EDIT: I don't want to convert frames to images and send those; I would like to work with the stream of data itself.
If you mean to stream your camera somewhere as PNG or JPG snapshots, then I would use a canvas, like this:
HTML
<video id="live" width="320" height="240" autoplay></video>
<canvas width="320" id="canvas" height="240" style="display:none;"></canvas>
JS (jQuery)
var video = $("#live").get()[0];
var canvas = $("#canvas");
var ctx = canvas.get()[0].getContext('2d');
navigator.webkitGetUserMedia("video",
function(stream) {
video.src = webkitURL.createObjectURL(stream);
}
)
setInterval(
function () {
ctx.drawImage(video, 0, 0, 320,240);
var data = canvas[0].toDataURL("image/jpeg");
},1000);
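The data variable is just a base64-encoded JPEG string, so each frame can be shipped off from inside the interval callback; a minimal sketch, assuming a hypothetical /upload-frame endpoint on your server:

// inside the setInterval callback, after toDataURL():
$.post('/upload-frame', { frame: data }); // hypothetical endpoint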