I am trying to stream an MP3 via JavaScript/HTML5 or Flash from my web server.
The JavaScript/HTML front end calls a script on the server which begins generating an MP3 file.
I want the file to begin playback immediately once enough audio is buffered, and to continue playing.
I am new to streaming and want to know the best method to do this on both Apache and Windows IIS.
At first I thought I would need to use an actual streaming protocol, such as RTP or RTSP, but it seems it may be better to use a custom JavaScript player that takes in the MP3 file via HTTP.
However, the concern with HTTP is that the file transfer will stop once the incomplete tail of the MP3 file is hit. Is there a way around this, perhaps by altering the config to make the Apache/IIS server wait until the file is complete before terminating the transfer?
I also looked into M3U files, but these seem to require a complete MP3 file, although I'm not clear on that.
Any advice/solutions/examples will be appreciated.
I don't have the exact solution, but you can try this: http://learningsfromdotnet.blogspot.com/2011/11/playing-mp3-using-audio-tag-html-5-in.html.
I was trying to stream an MP3 file from an IIS server and had an issue with Chrome because it makes a request with a "Range" header. If you can make a similar request with a "Range" header and stream that piece of the MP3 from the server, it might work.
I want to enable users of my web app to upload videos with a maximum length of 10 s, cropped/scaled to a certain resolution. Therefore users should be able to trim their selected video to 10 s before uploading it, with a simple editor.
Is there any library, or are there examples, enabling client-side video editing to cut video length as well as crop it before uploading it to a server? I found some canvas approaches for filters, single video frames, and export to WebM videos, but nothing bringing it all together. Has anyone done that before?
I appreciate any ideas :)
Typically, video processing is a server-side task, because it's easier to find and run complex (often compiled) libraries like FFmpeg there than in the browser, and it causes fewer performance problems for the end user. Anyway, I think there are two options:
1. Process video on server - send file and configuration
The first approach assumes that you build a client-side "editor" based on canvas which simulates video editing. After setting up all the filters, crops, etc., the client sends the original video file together with a video-processing configuration, which the server then uses to do the actual processing.
Depending on which language you prefer on the backend, the implementation will differ, so I won't give you a ready-made snippet of code.
Of course, you can switch the order of the tasks: upload the original file first, then simulate the video processing on the client side, and finally send the aforementioned configuration to the backend to process the video.
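For example, the "configuration" could be a small JSON description of the edits, uploaded next to the file. Everything here (field names, the `/upload` endpoint) is invented for illustration:

```javascript
// Hypothetical edit description the client could send with the file.
// The Math.min call enforces the app's 10-second maximum length.
function buildEditConfig(trimStart, trimEnd, crop, scale) {
  const end = Math.min(trimEnd, trimStart + 10); // cap clip length at 10 s
  return {
    trim: { start: trimStart, end },
    crop,   // e.g. { x: 0, y: 0, width: 1280, height: 720 }
    scale,  // e.g. { width: 640, height: 360 }
  };
}

// The browser would then send file + config in one multipart request:
// const form = new FormData();
// form.append('video', fileInput.files[0]);
// form.append('config', JSON.stringify(buildEditConfig(2, 30, crop, scale)));
// fetch('/upload', { method: 'POST', body: form });
```

On the server, the same configuration is replayed against the original file with FFmpeg or a similar tool.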
2. Process video within WebAssembly
If you really need to keep everything on the client side, you can try a WebAssembly port of a library like https://github.com/ffmpegwasm/ffmpeg.wasm and send the already-processed video file to the server.
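A rough sketch of that client-side route, assuming the 0.11-style `createFFmpeg`/`fetchFile` API of ffmpeg.wasm (file names and the upload endpoint are placeholders):

```javascript
// Build the FFmpeg argument list: trim to maxSeconds and scale.
function buildTrimArgs(input, output, maxSeconds, width, height) {
  return ['-i', input, '-t', String(maxSeconds),
          '-vf', `scale=${width}:${height}`, output];
}

// Trim and scale entirely in the browser, then upload the result.
async function trimAndUpload(file) {
  const { createFFmpeg, fetchFile } = require('@ffmpeg/ffmpeg'); // loaded lazily
  const ffmpeg = createFFmpeg({ log: true });
  await ffmpeg.load(); // downloads the wasm binary; this is slow on first run
  ffmpeg.FS('writeFile', 'in.mp4', await fetchFile(file));
  await ffmpeg.run(...buildTrimArgs('in.mp4', 'out.mp4', 10, 640, 360));
  const data = ffmpeg.FS('readFile', 'out.mp4');
  const blob = new Blob([data.buffer], { type: 'video/mp4' });
  await fetch('/upload', { method: 'POST', body: blob }); // endpoint is hypothetical
}
```

Be aware that in-browser transcoding is much slower than native FFmpeg and the wasm build itself is a multi-megabyte download, so this trade-off only makes sense when server-side processing is truly off the table.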
In my current project I have a video stream that FFmpeg encodes to a segmented MP4. That encoded data is piped into an application that sends it to whoever connects to that application through a WebSocket. When a client connects, I make sure to send the ftyp and moov boxes first, and then send the most recent segments received from FFmpeg.
On the client side I just pass all binary data from the WebSocket to MSE.
The problem I am facing is that this works if the client is connected from the very start and gets all the fragments that FFmpeg pipes out, but it does not work if the client connects after FFmpeg has sent its first few fragments.
My questions are:
Is it possible for MSE to play a fragmented MP4 from the middle when it is also provided the init segments?
If it is possible, then how would that need to be implemented?
If it isn't possible, then what format would allow me to stream live video over a WebSocket?
Is it possible for MSE to play a fragmented MP4 from the middle when it is also provided the init segments?
Yes, this is exactly what fragmented (segmented) MP4 was designed to do.
If it is possible, then how would that need to be implemented?
The way you describe your implementation is correct: send the init fragment, followed by the most recent AV fragment. That means you have a different problem or a bug in your implementation.
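One way to verify the relay really is caching and replaying the right bytes is to walk the top-level MP4 boxes in the data coming from FFmpeg. A minimal sketch (assuming complete boxes and 32-bit box sizes, for simplicity; function names are made up):

```javascript
// Walk top-level MP4 boxes so the relay can cache the init segment
// (ftyp + moov) and replay it to clients that connect later.
function listBoxes(buf) {
  const view = new DataView(buf.buffer, buf.byteOffset, buf.byteLength);
  const boxes = [];
  let off = 0;
  while (off + 8 <= buf.byteLength) {
    const size = view.getUint32(off); // box size, big-endian
    const type = String.fromCharCode(buf[off + 4], buf[off + 5],
                                     buf[off + 6], buf[off + 7]);
    if (size < 8) break; // 64-bit size or garbage; bail out of this sketch
    boxes.push({ type, size });
    off += size;
  }
  return boxes;
}

// A late-joining client should then receive the cached ftyp + moov first,
// followed by the newest complete moof/mdat pair, and MSE can pick up from there.
```

If the boxes the new client receives don't start with ftyp and moov, or the first media fragment starts mid-box, that would explain the failure you're seeing.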
I have a Node.js server that serves audio and video files by streaming.
This server accepts ranges (byte serving).
My goal is to request an audio or video file from the client (browser) and fetch only a small chunk of the file (320 kB, for example), and before that chunk finishes playing, request the next range of the file, repeating the process until the file is totally consumed.
How can I achieve this? Can I achieve it with the HTML5 Audio/Video API? How can I define the size of the chunk that I want to consume?
You should not have to do anything: all major modern browsers will support range requests automatically and request the file in 'chunks' if you use the <audio> or <video> tag, for example, and if your server supports range requests as you say.
You can see it in action by using the developer tools in your browser and observing the network tab.
I'm not aware, however, of a way to specify the chunk size yourself. If that is really important to you, you could use the Media Source Extensions mechanism, which essentially allows you to handle the download, and any manipulation you want to apply to the streamed file, before you pass it to the browser's video player.
One thing to be aware of: for MP4 files, the header needs to be at the start of the file to enable streaming. By default it is at the end, so you need to move it; there are many tools that will let you do this. See here for example:
http://multimedia.cx/eggs/improving-qt-faststart/
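If you do go the MSE route, the client controls the chunk size itself by issuing its own Range requests. A sketch of that loop, using the 320 kB size from the question (the URL, MIME type, and function names are invented for illustration):

```javascript
// Client-side sketch: fetch a media file in fixed-size ranges and feed
// each chunk to a SourceBuffer.
const CHUNK = 320 * 1024; // chunk size chosen by the client, not the server

// Build the Range header for the next chunk, or null when done.
function nextRangeHeader(offset, chunkSize, totalSize) {
  if (offset >= totalSize) return null;
  const end = Math.min(offset + chunkSize, totalSize) - 1; // inclusive end
  return `bytes=${offset}-${end}`;
}

async function streamViaMSE(url, videoEl, mimeType) {
  const ms = new MediaSource();
  videoEl.src = URL.createObjectURL(ms);
  await new Promise(r => ms.addEventListener('sourceopen', r, { once: true }));
  const sb = ms.addSourceBuffer(mimeType); // e.g. 'video/mp4; codecs="avc1.64001f"'
  const head = await fetch(url, { method: 'HEAD' });
  const total = Number(head.headers.get('Content-Length'));
  for (let off = 0; ; off += CHUNK) {
    const range = nextRangeHeader(off, CHUNK, total);
    if (!range) break;
    const res = await fetch(url, { headers: { Range: range } });
    sb.appendBuffer(await res.arrayBuffer());
    await new Promise(r => sb.addEventListener('updateend', r, { once: true }));
  }
  ms.endOfStream();
}
```

Note that MSE only accepts specific container formats (fragmented MP4, WebM), so a plain faststart MP4 still needs repackaging before this works.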
I want to play an MP3 file on my website using HTML.
The MP3 file will be on a remote server. As soon as the user clicks the play button, I don't want to download (stream) the whole MP3 file and then play it.
Instead, I want small chunks to be downloaded from the server using Node.js and sent to the browser, and the browser will play each part as it arrives.
Meanwhile, Node.js will continue streaming the other parts.
So basically I am looking for a YouTube-like implementation for audio, where the song plays for whatever part has been streamed or downloaded.
Can anyone explain how I can achieve this in Node.js? Do I have to use socket.io?
So I need to allow clients to record audio to the internet. The best solution I've found so far that keeps them on our site is the SoundCloud API: I just give them our account details and they use a version of the SoundCloud recorder (Flash/JavaScript) hacked together for our site.
The main issue is that these recordings are long, maybe up to an hour. Because the SoundCloud API records in the browser, then uploads the audio and transcodes it on its server, there's a lot that can go wrong in the upload process, which takes a long time with a big file. It's OK if we just record the audio in, say, 20-minute chunks, but it's just not that reliable.
We tried to build our own using wami-recorder, but that meant transcoding from WAV to MP3 in the browser before upload to make the file smaller, taking more time on the client machine; at least SoundCloud does the transcoding server-side.
Given the size of the files, should I be looking at a server-side recorder/streaming solution based on Red5 or something, or is a client-side recorder with upload a better proposal?
Thanks a lot!