I am having an issue streaming video in Safari on the iPad and in Google Chrome on Android devices. The application works fine on my laptop in the Microsoft Edge (Chromium) browser. I have looked for examples to solve my problem, but I can't find anything specific to this. The client sends a GET request to retrieve a video file. I use gridfs-stream to retrieve that file from a MongoDB database and then pipe the file to the response object, which lets the user view the stream in a Video.js player. This works with no issues on laptop and desktop devices running Edge (Chromium), but it fails in Safari and mobile Chrome browsers. Here is my code:
I respond to the request and retrieve the file from the database, then pipe it to the response object in node:
app.get('/get-video', (req, res, next) => {
  let video = req.query.video;
  gfs.files.findOne({ filename: video }, (err, file) => {
    // Check if the file exists
    if (!file || file.length === 0) {
      return res.status(404).json({
        err: 'No file exists'
      });
    } else {
      let readstream = gfs.createReadStream(file.filename);
      readstream.pipe(res);
    }
  });
});
As you can see, I find the file based on the video name in the GET request, create a ReadStream, and stream the file to the response object. Here is the HTML code in which the GET request is made from the client. The player uses the video-js library, but I couldn't find any compatibility issues with creating streams, and this does work in the Edge (Chromium) browser:
<video class="video-js vjs-big-play-centered vjs-16-9 card-img-top" autopreload="false" data-setup='{"autoplay": false, "controls": true}'>
<source src="/get-video?video=<%= post.source %>">
</video>
I am making the source for the video the response sent from the GET request, which is the video stream. In the Safari browser I get the error: "The media playback was aborted due to a corruption problem or because the media used features your browser did not support".
I do not have this issue in any browsers when sending images via the get request. I also don't have any issues when I simply request static files. The issue seems to be contained to specific browsers when I attempt to send a video stream.
The reason I am using a stream and not a static file is that I am hosting the application in the cloud, so I have to send the raw file from the MongoDB database (or at least that's my understanding). When I was testing the application before deploying it to the cloud this issue did not occur, because I could simply use the file system and store the file path as the source for the video. However, the file system is not persistent in a Heroku application, so I am using a cloud database in this situation. Now that I need to stream from a database, the issues are occurring.
Use a custom iframe instead:
<!DOCTYPE html>
<html>
<body>
<h1>The iframe element</h1>
<iframe src="https://www.w3schools.com">
<p>Your browser does not support iframes.</p>
</iframe>
</body>
</html>
To play your video on a web page, do the following:
Upload the video to YouTube
Take a note of the video id
Define an <iframe> element in your web page
Let the src attribute point to the video URL
Use the width and height attributes to specify the dimensions of the player
Add any other parameters to the URL (see below)
<!DOCTYPE html>
<html>
<body>
<iframe width="420" height="345" src="https://www.youtube.com/embed/tgbNymZ7vqY">
</iframe>
</body>
</html>
Related
I'm trying to display a continuous video stream (live-stream) in a browser.
Description:
My client reported that a video stream doesn't work in the Chrome browser. I thought it would be an easy case; I even wrote a demo to prove that streaming should be possible with just the HTML5 native video tag:
https://github.com/mishaszu/streaming-video-demo
There are no problems with a random video, but:
the particular video stream on the client side doesn't work.
With html code:
<video id="video-block" width="320" height="200" autoplay>
<source src="url/to/my/video" type="video/mp4">
</video>
it shows a loader for a while and then dies.
What I know about the stream:
1. Codec used: H264-MPEG-4 AVC (part 10) (avc1)
2. It's a live stream, not a file, so I can't run a tool like MP4Box on it from a terminal
3. Because it's live stream it probably doesn't have "end of file"
4. I know it's not broken because VLC is able to display it
5. I tried native HTML 5 video tag with all Media type strings (just in case to check all codecs for mp4)
As I mentioned, trying different MIME types didn't help. I also tried to use MediaSource, but I am really not sure how to use it with a live stream, as all the information I found assumed either:
a) waiting for a promise to resolve and then appending the buffer, or
b) adding an event listener for updateend to append the buffer.
I think in the case of a live stream that won't work.
Conclusion:
I found a lot of information about how a streamed file might contain metadata (at the beginning of the file or at the end)... and I ended up with a conclusion that maybe I do not fully understand what's going on.
Questions:
What's the proper way to handle the mp4 live stream?
If a native HTML video tag should support the live stream, how to debug it?
I thought that maybe I should look for something like HLS but for mp4 format?
I've been through the same - I needed to mux an incoming live stream from RTSP to HTML5 video, and sadly this can become non-trivial.
So, for a live stream you need a fragmented MP4 (check this SO question if you do not know what that is). There is the ISOBMFF specification, which sets rules on what boxes should be present in the stream. From my experience, though, browsers have their own quirks when it comes to a live stream (I had to debug Chrome/Firefox to find them). Chrome has a chrome://media-internals/ tab, which shows the errors for all loaded players - this can help debugging as well.
So my shortlist to solve this would be:
1) If you say that VLC plays the stream, open the Messages window in VLC (Tools -> Messages), set severity to debug, and you should see the MP4 box info in there as the stream comes in; verify that moof boxes are present
2a) Load the stream in chrome, open chrome://media-internals/ in a new tab and inspect errors
2b) Chrome uses ffmpeg underneath, so you could try playing the stream with ffplay as well and check for any errors.
2c) You are actually not quite right about MP4Box - you could simply save a number of starting bytes from the stream to a file and run MP4Box or other tools on that (at worst they should complain about a corrupted box at the end if you cut a box short)
If none of 2a/2b/2c provide any relevant error info that you can fix yourself, update the question with the outputs from these, so that others have more info.
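As a complement to 2c), the top-level box scan that such tools perform is simple enough to sketch in Node. This is an illustration only (it assumes 32-bit box sizes, looks at top-level boxes, and does no validation), not a substitute for MP4Box:

```javascript
// List the top-level MP4 box types in a buffer (e.g. the first
// bytes you saved from the stream). Simplified: assumes 32-bit
// box sizes and does not descend into nested boxes.
function listTopLevelBoxes(buf) {
  const boxes = [];
  let offset = 0;
  while (offset + 8 <= buf.length) {
    const size = buf.readUInt32BE(offset);                 // box size in bytes
    const type = buf.toString('ascii', offset + 4, offset + 8);
    boxes.push(type);
    if (size < 8) break;                                   // sizes 0/1 need 64-bit handling
    offset += size;
  }
  return boxes;
}

// Build a tiny fake buffer with an ftyp and a moof box to demo:
function makeBox(type, payloadLength) {
  const b = Buffer.alloc(8 + payloadLength);
  b.writeUInt32BE(8 + payloadLength, 0);
  b.write(type, 4, 'ascii');
  return b;
}

const sample = Buffer.concat([makeBox('ftyp', 16), makeBox('moof', 32)]);
console.log(listTopLevelBoxes(sample)); // [ 'ftyp', 'moof' ]
```

A fragmented stream should show moof boxes in this listing; a plain progressive file shows moov and mdat instead.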
In Safari (11), a static audio file loaded into a src via HTML or JavaScript works, albeit with the limitation of requiring user input before playing.
ex.
<audio src="static.mp3">
or
var audio = new Audio('static.mp3');
audio.play();
work fine.
However, I need to load audio files from the database, so I was using a controller action like so:
public FileContentResult GetAudio(int audioId)
{
    if (!DbContext.Audios.Any(a => a.Id == audioId))
    {
        return null;
    }
    var audio = DbContext.Audios.Single(a => a.Id == audioId);
    return File(audio.File, "audio/mp3");
}
and set like
<audio src="getaudio?audioId=1">
or
var audio = new Audio('getaudio?audioId=1');
it will not play in macOS (Safari) or iOS, but works fine in Chrome and Edge (except on iOS). Depending on how I configure things, I get some form of unhandled promise error. I've also tried loading into a Web Audio buffer, with the exact same successes and failures.
Does anyone have a solution or workaround to load my audio files on Safari?
EDIT
Ok, so on further testing, I discovered that it's not so much whether the files were sent via action or static file, it's how they were saved to the database in the first place. I'm now working to figure out why files I save (as byte[]) and then reload are not recognized by Safari.
OK, so it turns out I was making the recordings with MediaRecorder, which is a fairly new feature in Chrome and a few other browsers. It didn't matter what format I told it to save as, because only WebM is supported. And guess who doesn't support the WebM format? Safari. Every other browser was picking it up fine, regardless of what incorrect extension I put on the file.
When I find a webm to m4a conversion, I will add it here. I'm sure there are some solutions out there.
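For what it's worth, MediaRecorder also exposes MediaRecorder.isTypeSupported(), so you can detect up front which container you will actually get instead of trusting the extension. A hedged sketch follows; the candidate list is illustrative, and the support predicate is injected so the selection logic can run and be tested outside a browser:

```javascript
// Return the first container/codec string the recorder claims to
// support, or null if none match. `isSupported` is injected so this
// pure helper works outside a browser; in a real page you would pass
// MediaRecorder.isTypeSupported.bind(MediaRecorder).
function pickRecordingType(candidates, isSupported) {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return null;
}

// Illustrative candidate list, most preferred first:
const candidates = ['audio/mp4', 'audio/webm;codecs=opus', 'audio/webm'];

// Stub predicate simulating a Chrome-like browser (WebM only):
const chromeLike = t => t.startsWith('audio/webm');
console.log(pickRecordingType(candidates, chromeLike)); // audio/webm;codecs=opus
```

In a page you would then construct the recorder with the picked type, e.g. new MediaRecorder(stream, { mimeType: picked }); if only WebM comes back, you know Safari playback will need server-side conversion.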
I am using Recorder.js to record something with the Audio API and I want to give the user the opportunity to listen to their recording before uploading it to the server. The recording code is heavily based on the supplied example, but instead of exporting to WAV for downloading I am using a blob and setting the src attribute of an audio element for playback. The responsible code (called after stopping the recording) looks like this:
function createAudioPlayer() {
  recorder && recorder.exportWAV(function (blob) {
    var url = URL.createObjectURL(blob);
    var au = document.getElementById('recording');
    au.src = url;
  });
}
It results in something like this:
<audio id="recording" src="blob:https%3A//example.com/499ca96f-37b2-4515-bd1b-3c298f201dbd" controls="controls"></audio>
Playing back this audio works in Chrome 50 and Firefox 46 on OS X 10.11, and in Edge 13 on Windows 10. Chrome 50 on Android 6.0 records and uploads fine, but refuses playback with the following error when play() is called on the audio element:
Uncaught (in promise) DOMException: The element has no supported sources.
I've tried adding type="audio/wav" but that didn't work. If I upload the file from blob (with the File API and XHR) and use a src attribute with a 'normal' URL of the wav file on the server it also plays back fine. What should be changed so that users can listen to their recording before uploading?
Not yet supported. We've got work in progress that should fix it, hopefully in Chrome 51. (https://groups.google.com/a/chromium.org/forum/#!topic/chromium-reviews/Qi4dLcKjcCM)
I'm having issues seeking in videos using Chrome.
For some reason, no matter what I do, video.seekable.end(0) is always 0.
When I call video.currentTime = 5, followed by console.log(video.currentTime), I see it is always 0, which seems to reset the video.
I've tried both MP4 and VP9-based webm formats, but both gave the same results.
What's more annoying is that Firefox runs everything perfectly. Is there something special about Chrome that I should know about?
Here's my code (which only works in Firefox):
<div class="myvideo">
<video width="500" height="300" id="video1" preload="auto">
<source src="data/video1.webm" type="video/webm"/>
Your browser does not support videos.
</video>
</div>
And here's the javascript:
var videoDiv = $(".myvideo").children().get(0);
videoDiv.load();
videoDiv.addEventListener("loadeddata", function () {
  console.log("loaded");
  console.log(videoDiv.seekable.end(0)); // Why is this always 0 in Chrome, but not Firefox?
  videoDiv.currentTime = 5;
  console.log(videoDiv.currentTime); // Why is this also always 0 in Chrome, but not Firefox?
});
Note that simply calling videoDiv.play() does actually correctly play the video in both browsers.
Also, after the movie file is fully loaded, the videoDiv.buffered.end(0) also gives correct values in both browsers.
It took me a while to figure it out...
The problem turned out to be server-side. I was using a version of Jetty to serve all my video files, and the simple configuration of Jetty did not support byte serving.
The difference between Firefox and Chrome is that Firefox will download the entire video file so that you can seek through it, even if the server does not support HTTP code 206 (Partial Content). Chrome, on the other hand, refuses to download the entire file (unless it is really small, around 2-3 MB).
So to get the currentTime parameter of html5 video to be working in Chrome, you need a server that supports http code 206.
For anyone else having this problem, you can double check your server config with curl:
curl -H Range:bytes=16- -I http://localhost:8080/GOPR0001.mp4
This should return code 206. If it returns code 200, Chrome will not be able to seek the video, but Firefox will, due to a workaround in the browser.
And a final tip: You can use npm http-server to get a simple http-server for a local folder that supports partial content:
npm install http-server -g
And run it to serve a local folder:
http-server -p 8000
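To see what the server actually has to do, the Range/206 arithmetic is small. Below is a hedged sketch of just the header math as a pure Node function; it handles only the common bytes=start- and bytes=start-end forms, and a real server would also have to stream the matching byte slice of the file:

```javascript
// Compute the response status and headers for a simplified
// "Range: bytes=start-end" request against a resource of known size.
// Falls back to a plain 200 when no Range header is present.
function rangeResponse(rangeHeader, totalSize) {
  if (!rangeHeader) {
    return { status: 200, headers: { 'Content-Length': totalSize, 'Accept-Ranges': 'bytes' } };
  }
  const m = /^bytes=(\d+)-(\d*)$/.exec(rangeHeader);
  if (!m) {
    return { status: 416, headers: { 'Content-Range': `bytes */${totalSize}` } };
  }
  const start = parseInt(m[1], 10);
  const end = m[2] ? parseInt(m[2], 10) : totalSize - 1; // open-ended: serve to the last byte
  if (start >= totalSize || end < start) {
    return { status: 416, headers: { 'Content-Range': `bytes */${totalSize}` } };
  }
  return {
    status: 206,
    headers: {
      'Content-Range': `bytes ${start}-${end}/${totalSize}`,
      'Content-Length': end - start + 1,
      'Accept-Ranges': 'bytes',
    },
  };
}

console.log(rangeResponse('bytes=16-', 1000));
// -> status 206, Content-Range "bytes 16-999/1000", Content-Length 984
```

This mirrors the curl check above: a Range request for bytes=16- against a 1000-byte file must come back as 206 with the matching Content-Range, which is exactly what Chrome needs before it enables seeking.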
Workaround if modifying the server code is unfeasible: make an API call for your video, then load the blob into URL.createObjectURL and feed that into the src attribute of your video HTML tag. This loads the entire file, so Chrome knows the size of the file, which allows seeking to work.
axios
  .get(url, { responseType: "blob" })
  .then(function (response) {
    setVideo(URL.createObjectURL(response.data));
  })
  .catch(function (error) {
    // handle the error
    console.log(error);
  });
If you wait for canplaythrough instead of loadeddata, it works.
See this codepen example.
You have three possibilities for the video tag: MP4, OGG, and WebM.
Not all formats work in all browsers.
Here, I'm thinking that WebM works in Firefox but not Chrome, so you should supply both alternative formats, by including a second <source> tag referring to an MP4 file.
E.g. src="data/video1.mp4" type="video/mp4"
The browser will automatically select the first version it can play.
I had a similar problem. I was listening for the ended event on a video and setting currentTime to the middle of the video to loop it continuously. It wasn't working in Safari or Chrome.
I think there may be a bug in Safari/Chrome where playhead position properties aren't available unless the media is currently playing.
My workaround was to restart the loop just before the video ends, never letting it actually end.
Try testing yours by starting playback first and then running your function, to see if it works in Safari/Chrome.
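A sketch of that workaround follows. The 0.3-second margin is an arbitrary choice and the element id matches the video1 example earlier on this page; the end-detection predicate is kept separate so the timing logic can be tested without a DOM:

```javascript
// True when playback is within `margin` seconds of the end, i.e.
// time to jump back to the start instead of letting `ended` fire.
function nearEnd(currentTime, duration, margin = 0.3) {
  return Number.isFinite(duration) && duration - currentTime <= margin;
}

// Browser wiring (illustrative): watch `timeupdate` and restart the
// clip just before it actually ends, so `ended` never fires.
if (typeof document !== 'undefined') {
  const video = document.getElementById('video1');
  video.addEventListener('timeupdate', () => {
    if (nearEnd(video.currentTime, video.duration)) {
      video.currentTime = 0;
      video.play();
    }
  });
}

console.log(nearEnd(9.8, 10)); // true
console.log(nearEnd(5, 10));   // false
```

Note that `timeupdate` fires only a few times per second, so the margin needs to be at least as large as the gap between events.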
I have a VLC streaming server, on which I started two streams:
vlc -vvv -d http://*camera_adress* --sout '#transcode{vcodec=theo,vb=800,acodec=vorb,ab=128}:standard{access=http,mux=ogg,dst=*server_name*:20000}'
vlc -vvv -d http://*camera_adress* --sout '#standard{access=http,mux=mpjpeg,dst=*server_name*:21000}'
1) Ogg with HTML5 works fine; I receive the stream in a video tag.
2) MJPEG works fine on mobile, but I don't know how to get the MJPEG frames in HTML5. I tried the JavaScript from http://wiki.ros.org/mjpegcanvasjs/Tutorials/CreatingASingleStreamCanvas but it doesn't work. VLC Media Player receives the stream, so this is not a server or stream problem.
Any help?
OK, I got it. It's better to start the stream like this:
vlc -d -vvv http://camera_ip/mjpg/video.mjpg --no-audio --sout '#transcode{vcodec=MJPG,venc=ffmpeg{strict=1}}:standard{access=http{mime=multipart/x-mixed-replace;boundary=--7b3cc56e5f51db803f790dad720ed50a},mux=mpjpeg,dst=server_name:port}' &
Then you can get this stream as a normal JPEG in HTML5, using a simple <img> tag whose src points at the stream URL (the dst in the command above).
The accepted answer should still be fine, but today manual MIME type specification is no longer necessary if the output file name is given an .mpjpeg extension:
vlc -I dummy "file.mp4" --no-audio --sout=#transcode{vcodec=MJPG,height=720,fps=4,vb=800}:http{mux=mpjpeg,dst=:8080/video.mpjpeg}