I have a VLC streaming server, on which I started two streams:
vlc -vvv -d http://*camera_adress* --sout '#transcode{vcodec=theo,vb=800,acodec=vorb,ab=128}:standard{access=http,mux=ogg,dst=*server_name*:20000}'
vlc -vvv -d http://*camera_adress* --sout '#standard{access=http,mux=mpjpeg,dst=*server_name*:21000}'
1) Ogg with HTML5 works fine; I am receiving the stream in a video tag.
2) MJPEG works fine on mobile, but I don't know how to get the MJPEG frames in HTML5. I tried to use the JavaScript from http://wiki.ros.org/mjpegcanvasjs/Tutorials/CreatingASingleStreamCanvas but it doesn't work. VLC Media Player receives the stream, so this is not a server or stream problem.
Any help?
OK, I got it. It's better to start the stream like this:
vlc -d -vvv http://camera_ip/mjpg/video.mjpg --no-audio --sout '#transcode{vcodec=MJPG,venc=ffmpeg{strict=1}}:standard{access=http{mime=multipart/x-mixed-replace;boundary=--7b3cc56e5f51db803f790dad720ed50a},mux=mpjpeg,dst=server_name:port}' &
Then you can get this stream as a normal JPEG in HTML5, using a simple img tag pointed at the stream URL:
<img src="http://server_name:port/">
The accepted answer should still be fine, but today manual MIME type specification is no longer necessary if the output file name is given an .mpjpeg extension:
vlc -I dummy "file.mp4" --no-audio --sout=#transcode{vcodec=MJPG,height=720,fps=4,vb=800}:http{mux=mpjpeg,dst=:8080/video.mpjpeg}
I am having an issue with streaming a video in Safari on the iPad and in the Google Chrome browser on Android devices. The application works fine on my laptop, which uses the Microsoft Edge Chromium browser. I have looked for examples to solve my problem, but I can't find anything specific to this. The issue is that the client sends a GET request to retrieve a video file. I use gridfs-stream to retrieve that file from a MongoDB database, and then I pipe the file to the response object. This allows the user to view the video stream in a video-js player. This works with no issues on laptop and desktop devices with the Microsoft Edge Chromium browser, but it fails in Safari and mobile Chrome browsers. Here is my code:
I respond to the request and retrieve the file from the database, then pipe it to the response object in node:
app.get('/get-video', (req, res, next) => {
  let video = req.query.video;
  gfs.files.findOne({ filename: video }, (err, file) => {
    // Check if the file exists
    if (!file || file.length === 0) {
      return res.status(404).json({
        err: 'No file exists'
      });
    } else {
      let readstream = gfs.createReadStream(file.filename);
      readstream.pipe(res);
    }
  });
});
As you can see, I find the file based on the video name in the GET request, create a read stream, and stream the file to the response object. Here is the HTML code in which the GET request is made from the client. The player uses the video-js library, but I couldn't find any compatibility issues with creating streams, and this does work in the Edge Chromium browser:
<video class="video-js vjs-big-play-centered vjs-16-9 card-img-top" preload="none" data-setup='{"autoplay": false, "controls": true}'>
<source src="/get-video?video=<%= post.source %>">
</video>
I am making the source for the video the response sent from the GET request, which is the video stream. In the Safari browser I get the error: "The media playback was aborted due to a corruption problem or because the media used features your browser did not support."
I do not have this issue in any browser when sending images via a GET request. I also don't have any issues when I simply request static files. The issue seems to be contained to specific browsers when I attempt to send a video stream.
The reason why I am using a stream and not a static file is that I am hosting the application in the cloud, so I have to send the raw file from the MongoDB database (or at least that's my understanding). When I was testing the application before deploying it to the cloud, this issue did not occur, because I could simply use the file system and store the file path as the source for the video. However, the file system is not persistent with Heroku applications, so I am using a cloud database in this situation. Now that I need to stream from a database, the issues are occurring.
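The usual culprit here is byte-range support: Safari and mobile Chrome will only play video from endpoints that honor the Range request header and answer with 206 Partial Content, while desktop Chromium-based browsers are more forgiving. Below is a hedged sketch of a range-aware handler; it swaps gridfs-stream for the official driver's GridFSBucket (which accepts start/end offsets), and the db handle and the hard-coded Content-Type are assumptions:
const { GridFSBucket } = require('mongodb');

app.get('/get-video', async (req, res) => {
  const file = await db.collection('fs.files').findOne({ filename: req.query.video });
  if (!file) {
    return res.status(404).json({ err: 'No file exists' });
  }
  const bucket = new GridFSBucket(db); // db: a connected Db instance (assumption)
  const range = req.headers.range;     // e.g. "bytes=32768-"
  if (range) {
    const [startStr, endStr] = range.replace(/bytes=/, '').split('-');
    const start = parseInt(startStr, 10);
    const end = endStr ? parseInt(endStr, 10) : file.length - 1;
    res.status(206).set({
      'Content-Range': `bytes ${start}-${end}/${file.length}`,
      'Accept-Ranges': 'bytes',
      'Content-Length': end - start + 1,
      'Content-Type': 'video/mp4' // assumption: use the stored contentType if available
    });
    // GridFSBucket's "end" option is exclusive, hence end + 1
    bucket.openDownloadStreamByName(file.filename, { start, end: end + 1 }).pipe(res);
  } else {
    res.set({ 'Content-Length': file.length, 'Content-Type': 'video/mp4' });
    bucket.openDownloadStreamByName(file.filename).pipe(res);
  }
});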
Use a custom iframe:
<!DOCTYPE html>
<html>
<body>
<h1>The iframe element</h1>
<iframe src="https://www.w3schools.com">
<p>Your browser does not support iframes.</p>
</iframe>
</body>
</html>
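Presumably the idea is to point the iframe at the video endpoint from the question, so the browser's native player handles it; something like the following (the query value is a placeholder):
<iframe src="/get-video?video=myvideo.mp4" width="420" height="345"></iframe>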
To play your video on a web page, do the following:
Upload the video to YouTube
Take note of the video id
Define an <iframe> element in your web page
Let the src attribute point to the video URL
Use the width and height attributes to specify the dimensions of the player
Add any other parameters to the URL (see below)
<!DOCTYPE html>
<html>
<body>
<iframe width="420" height="345" src="https://www.youtube.com/embed/tgbNymZ7vqY">
</iframe>
</body>
</html>
I'm trying to display a continuous video stream (live stream) in a browser.
Description:
My client reported that a video stream doesn't work in the Chrome browser. I thought it would be an easy case; I even wrote a demo to prove that streaming should work with just the native HTML5 video tag:
https://github.com/mishaszu/streaming-video-demo
There are no problems with a random video, but the particular video stream on the client side doesn't work.
With html code:
<video id="video-block" width="320" height="200" autoplay>
<source src="url/to/my/video" type="video/mp4">
</video>
it shows a loader for a while and then dies.
What I know about the stream:
1. Codec used: H264-MPEG-4 AVC (part 10) (avc1)
2. It's a live stream, not a file, so I can't run a command-line tool like MP4Box against it
3. Because it's a live stream, it probably doesn't have an "end of file"
4. I know it's not broken, because VLC is able to display it
5. I tried the native HTML5 video tag with all media type strings (just in case, to check all codecs for mp4)
As I mentioned, trying different MIME types didn't help. I also tried to use MediaSource, but I am really not sure how to use it with a live stream, as all the information I found assumed either:
a) waiting for a promise to resolve and then appending a buffer, or
b) adding an event listener for updateend to append the next buffer.
I think that in the case of a live stream this won't work.
Conclusion:
I found a lot of information about how a streamed file might contain metadata (at the beginning of the file or at the end), and I ended up concluding that maybe I do not fully understand what's going on.
Questions:
What's the proper way to handle an mp4 live stream?
If the native HTML video tag should support live streams, how do I debug this?
Should I perhaps look for something like HLS, but for the mp4 format?
I've been through the same thing: I needed to mux an incoming live stream from RTSP to HTML5 video, and sadly this can become non-trivial.
So, for a live stream you need fragmented mp4 (check this SO question if you do not know what that is). There is the ISOBMFF specification, which sets rules on what boxes should be present in the stream. From my experience, though, browsers have their own quirks when it comes to live streams (I had to debug Chrome/Firefox to find them). Chrome has a chrome://media-internals/ tab, which shows the errors for all loaded players; this can help with debugging as well.
So my shortlist to solve this would be:
1) If you say that VLC plays the stream, open the Messages window in VLC (Tools -> Messages), set severity to debug, and you should see the mp4 box info there as the stream comes in; verify that moof boxes are present
2a) Load the stream in chrome, open chrome://media-internals/ in a new tab and inspect errors
2b) Chrome uses ffmpeg underneath, so you could try playing the stream with ffplay as well and check for any errors.
2c) You are actually incorrect about MP4Box: you could simply load a number of starting bytes from the stream, save them to a file, and run MP4Box or other tools on that (at worst it should complain about a corrupted box at the end if you cut a box short)
If none of 2a/2b/2c provide any relevant error info that you can fix yourself, update the question with the outputs from these, so that others have more info.
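For reference, the basic MSE append loop the question alludes to looks roughly like this. This is a hedged sketch only: the codec string is an assumption that must match the stream's actual avc1 profile/level, the fetch URL is a placeholder, and the stream must already be fragmented mp4.
const video = document.getElementById('video-block');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  // Codec string is an assumption; adjust to the stream's real profile/level.
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  const response = await fetch('url/to/my/video'); // placeholder URL
  const reader = response.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Wait for the previous append to finish before appending the next chunk.
    if (sb.updating) {
      await new Promise(r => sb.addEventListener('updateend', r, { once: true }));
    }
    sb.appendBuffer(value);
  }
});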
I'm trying to create a SourceBuffer via the W3C Media Source Extensions API with MIME type 'audio/wav', like so:
let sourceBuffer = mediaSource.addSourceBuffer('audio/wav');
However I get a "NotSupportedError":
Failed to execute 'addSourceBuffer' on 'MediaSource': The type provided ('audio/wav') is unsupported.
Also, running the following:
MediaSource.isTypeSupported('audio/wav');
in the browser console returns false in recent versions of both Firefox and Chrome.
If I just set the src of the audio tag to the url of the .wav, everything works fine. It's only when I use a SourceBuffer that I get file type support issues. What DOMString do I need to specify to addSourceBuffer() to have it accept a PCM encoded .wav file?
I'm using Chrome 72 and Firefox 68.
Unfortunately, media supported by audio/video elements is not always supported by MSE. This is the case for audio/wav.
See also: https://github.com/w3c/media-source/issues/55
In this case, you could decode the WAV file in your own script and use a ScriptProcessorNode from the Web Audio API to play it back. A hacky mess for sure, but possible!
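If gapless streaming isn't required, a simpler variant of that idea (a sketch; the URL is a placeholder) is to let decodeAudioData handle the WAV decoding and play the result through an AudioBufferSourceNode:
const ctx = new AudioContext(); // may need to be created/resumed from a user gesture

fetch('/audio/sample.wav') // placeholder URL
  .then(res => res.arrayBuffer())
  .then(buf => ctx.decodeAudioData(buf)) // decodeAudioData accepts PCM wav, unlike MSE
  .then(audioBuffer => {
    const source = ctx.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(ctx.destination);
    source.start();
  });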
I'm having issues seeking in videos using Chrome.
For some reason, no matter what I do, video.seekable.end(0) is always 0.
When I call video.currentTime = 5, followed by console.log(video.currentTime), it is always 0, which seems to reset the video.
I've tried both MP4 and VP9-based webm formats, but both gave the same results.
What's more annoying is that Firefox runs everything perfectly. Is there something special about Chrome that I should know about?
Here's my code (which only works in Firefox):
<div class="myvideo">
  <video width="500" height="300" id="video1" preload="auto">
    <source src="data/video1.webm" type="video/webm"/>
    Your browser does not support videos.
  </video>
</div>
And here's the JavaScript:
var videoDiv = $(".myvideo").children().get(0);
videoDiv.load();
videoDiv.addEventListener("loadeddata", function () {
  console.log("loaded");
  console.log(videoDiv.seekable.end(0)); // Why is this always 0 in Chrome, but not Firefox?
  videoDiv.currentTime = 5;
  console.log(videoDiv.currentTime); // Why is this also always 0 in Chrome, but not Firefox?
});
Note that simply calling videoDiv.play() does actually correctly play the video in both browsers.
Also, after the movie file is fully loaded, the videoDiv.buffered.end(0) also gives correct values in both browsers.
It took me a while to figure it out...
The problem turned out to be server-side. I was using a version of Jetty to serve all my video files, and the simple configuration of Jetty did not support byte serving.
The difference between Firefox and Chrome is that Firefox will download the entire video file so that you can seek through it, even if the server does not support HTTP code 206 (Partial Content). Chrome, on the other hand, refuses to download the entire file (unless it is really small, around 2-3 MB).
So to get the currentTime property of HTML5 video working in Chrome, you need a server that supports HTTP code 206.
For anyone else having this problem, you can double check your server config with curl:
curl -H "Range: bytes=16-" -I http://localhost:8080/GOPR0001.mp4
This should return code 206. If it returns code 200, Chrome will not be able to seek the video, but Firefox will, due to a workaround in the browser.
And a final tip: you can use the npm package http-server to get a simple HTTP server, with partial-content support, for a local folder:
npm install http-server -g
And run it to serve a local folder:
http-server -p 8000
A workaround if modifying the server code is unfeasible: make an API call for your video, load the blob into URL.createObjectURL, and feed that into the src attribute of your video HTML tag. This loads the entire file, so Chrome will know the size of the file, allowing seeking to work.
axios.get(`${url}`, {
    responseType: "blob"
  })
  .then(function (response) {
    // setVideo here is e.g. a React state setter whose value ends up in the video's src
    setVideo(URL.createObjectURL(response.data));
  })
  .catch(function (error) {
    // handle error
    console.log(error);
  });
If you wait for canplaythrough instead of loadeddata, it works.
See this codepen example.
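In other words, something along these lines (a sketch based on the question's code):
var videoDiv = $(".myvideo").children().get(0);
videoDiv.load();
videoDiv.addEventListener("canplaythrough", function () {
  console.log(videoDiv.seekable.end(0)); // non-zero once enough data is buffered
  videoDiv.currentTime = 5;
});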
You have three possibilities for the video tag: MP4, OGG, and WebM.
Not all formats work in all browsers.
Here, I'm thinking that your WebM file works in Firefox but not Chrome, so you should supply both MP4 and WebM alternatives by including a second source tag referring to the MP4 file, e.g. src="data/video1.mp4" type="video/mp4".
The relevant version will be automatically selected by the browser.
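For example (assuming an MP4 version of the file exists alongside the WebM):
<video width="500" height="300" id="video1" preload="auto">
  <source src="data/video1.webm" type="video/webm"/>
  <source src="data/video1.mp4" type="video/mp4"/>
  Your browser does not support videos.
</video>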
I had a similar problem. I was listening for the end event on a video and setting currentTime to the middle of the video to loop it continuously. It wasn't working in Safari or Chrome.
I think there may be a bug in Safari/Chrome where playhead position properties aren't available unless the media is currently playing.
My workaround was to restart my loop just before the video's end, never letting it actually end.
Try testing yours by starting playback first and then running your function, to see if it works in Safari/Chrome.
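A sketch of that workaround (the 0.3-second margin is an arbitrary choice):
var video = document.querySelector("video"); // assumption: the element being looped
video.addEventListener("timeupdate", function () {
  // Seek back before 'ended' can fire, so playback never actually stops.
  if (video.duration && video.currentTime >= video.duration - 0.3) {
    video.currentTime = video.duration / 2; // loop from the middle, as described above
  }
});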
I'm trying to play an mp3 file, and I want to jump to a specific location in the file. In Chrome 33 on Windows, the file jumps to the correct position (as compared with VLC playing the mp3 locally), but in Firefox 28 on Windows it plays too far forward, and in Internet Explorer 11 it plays too far behind.
It used to work correctly in Firefox 27 and earlier.
Is there a better way of doing this?
EDIT: The problem doesn't even require SoundManager2. You can reproduce the same issue with just the <audio> tag in Firefox. These two lines are all the code you need:
<audio autoplay id="audio" src="http://ivdemo.chaseits.co.uk/enron/20050204-4026(7550490)a.mp3" controls preload></audio>
<button onclick="javascript:document.getElementById('audio').currentTime = 10;">Jump to 10 secs "...be with us in, er, 1 minute... ok" </button>
Try it here: http://jsfiddle.net/cpickard/29Gt3/
EDIT: Tried with Firefox Nightly; no improvement. I have reported it as bug 994561 in Bugzilla. Still looking for a workaround for now.
The problem lies in the VBR encoding of the mp3.
Download that mp3 to disk and convert it to fixed bit rate, say with Audacity.
Run the example from disk:
<audio autoplay id="audio" src="./converted.mp3" controls preload></audio>
<button onclick="javascript:document.getElementById('audio').currentTime = 10;">
Jump to 10 secs "...be with us in, er, 1 minute... ok" </button>
and it works fine for me.
So my suggested workaround is to upload an alternative fixed-bit-rate mp3 file in place of the one you are using. Then it should work in the current Firefox.
I work on SoundJS, and while implementing audio sprites I recently ran into similar issues. According to the spec, setting the position of the HTML audio playhead can be inaccurate by up to 300 ms. So that could explain some of the issues you are seeing.
Interestingly, your fiddle plays correctly for me in FF 28 on win 8.1 if I just let it play through from the start.
There are also some known issues with audio length accuracy that may also have an effect, which you can read about here.
If you want precision, I would definitely recommend using Web Audio where possible or a library like SoundJS.
Hope that helps.
I met the same issue, and I solved it by converting my MP3 file to CBR (Constant Bit Rate) format. That resolves the inconsistency between currentTime and the actual sound.
(Screenshot: choosing the CBR format)
Steps:
Download and install Audacity (it's free, on any platform)
Open your MP3 file
Click [File] -> [Export] -> [Options] -> [Constant] (see: Converting MP3 to Constant Bit Rate)
Audacity will ask you to provide the LAME MP3 encoder (see: [download and install the LAME MP3 encoder])
After that, there will be no inconsistency/desynchronization issue.
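If you prefer the command line, re-encoding to CBR with ffmpeg should achieve the same result (a hedged sketch; the 128 kbps bitrate is an arbitrary choice):
ffmpeg -i input.mp3 -codec:a libmp3lame -b:a 128k output-cbr.mp3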
Also see:
HTML5 audio starts from the wrong position in Firefox
Inconsistent seeking in HTML5 Audio player
I just tried your code with another audio URL (below); it seemed to work, and I did not experience any delay in Firefox (v29), which I did previously.
<audio autoplay id="audio" src="http://mediaelementjs.com/media/AirReview-Landmarks-02-ChasingCorporate.mp3" controls preload></audio>
I guess that to jump around an audio file, your server must be configured properly.
The client sends byte-range requests to seek to and play certain regions of a file, so the server must respond adequately:
In order to support seeking and playing back regions of the media that aren't yet downloaded, Gecko uses HTTP 1.1 byte-range requests to retrieve the media from the seek target position. In addition, if you don't serve X-Content-Duration headers, Gecko uses byte-range requests to seek to the end of the media (assuming you serve the Content-Length header) in order to determine the duration of the media.
Hope this helps.
You could also try looking into the Web Audio API for sound-effect-like playback, which gives you some guarantees about playback delays.
After testing the fiddle, it is noticeable that there is some issue with FF. Anyhow, after searching for some time, the issue appears to be due to "performance lag", but the good news is that someone has found a solution; you may want to read this:
http://lowlag.alienbill.com/
A single script will solve it all.