How to prefetch video in a React application? - javascript

I have a React app with a component that loads different videos depending on user input. There are only 4 or 5 small videos, so I'd like to pre-fetch all of them when the browser is inactive.
Within my component, I have:
<video src={this.props.video} type="video/mp4" />
In my index.html, I have a line in the head for a video:
<link rel="prefetch" as="video/mp4" href="link/to/my/video.mp4">
However, this doesn't work. Looking at the console, I can see that the video is fetched (with a 200 status) but not stored in the cache (the size is 5 MB for the response, 0 MB on disk). When I provide user input and the component needs to display that video, it is fetched again, which takes a few seconds.
PS - The reason I'm not trying to use preload on the video element is because preload only works if the page you are looking at has the video in it. In my case, I want to load the videos even if they are not required for the current page.
Update: I made a pen where you can see that the video isn't pre-fetched despite the use of a link tag in the head.

In your situation, you can make an AJAX request and create a blob URL from the response of that request.
You can see it working in my code pen:
function playVideo() {
  var video = document.getElementById('video');
  if (video) {
    video.play().then(_ => {
      console.log('played!');
    });
  }
}

// Called with the blob URL once the file has been fetched.
function onSuccess(url) {
  console.log(url);
  var video = document.createElement('VIDEO');
  if (!video.src) {
    video.id = 'video';
    document.body.appendChild(video);
    video.src = url;
  }
}

function onProgress() {}

function onError() {}

prefetch_file('https://raw.githubusercontent.com/FilePlayer/test/gh-pages/sw_360_lq.mp4', onSuccess, onProgress, onError);

// Downloads the file as a blob and hands a blob URL to fetched_callback.
function prefetch_file(url,
                       fetched_callback,
                       progress_callback,
                       error_callback) {
  var xhr = new XMLHttpRequest();

  xhr.open("GET", url, true);
  xhr.responseType = "blob";

  xhr.addEventListener("load", function () {
    if (xhr.status === 200) {
      var URL = window.URL || window.webkitURL;
      var blob_url = URL.createObjectURL(xhr.response);
      fetched_callback(blob_url);
    } else {
      error_callback();
    }
  }, false);

  var prev_pc = 0;
  xhr.addEventListener("progress", function (event) {
    if (event.lengthComputable) {
      var pc = Math.round((event.loaded / event.total) * 100);
      if (pc != prev_pc) {
        prev_pc = pc;
        progress_callback(pc); // report only whole-percent changes
      }
    }
  });

  xhr.send();
}
The disadvantage of this approach is that it will not work if the video's server doesn't allow CORS requests from your site.
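The same prefetch-to-blob idea can also be written with the Fetch API instead of XMLHttpRequest. This is only a minimal sketch (the URL and the #video element id are placeholders, not from the question), and it is subject to the same CORS restriction:
// Minimal sketch: prefetch a video into a blob: URL with fetch().
// The URL and the #video element id are placeholders.
function prefetchVideo(url) {
  return fetch(url)
    .then(function (response) {
      if (!response.ok) {
        throw new Error('HTTP ' + response.status);
      }
      return response.blob(); // read the whole file into memory
    })
    .then(function (blob) {
      return URL.createObjectURL(blob); // blob: URL usable as a video src
    });
}

prefetchVideo('https://example.com/video.mp4')
  .then(function (blobUrl) {
    document.getElementById('video').src = blobUrl;
  })
  .catch(function (err) {
    console.error('prefetch failed:', err);
  });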

Is it really necessary to prefetch all the videos before loading the screen? Even though they are small videos, preloading data that may never be used is not the only option. Imagine the user only watches one of the five videos: 80% of all the preloaded data was never used. An alternative would be showing a loading component while you fetch the video, like Netflix does while it is searching for options.
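If you go that route, a minimal React sketch could look like the following. The component name, the spinner markup and the onCanPlayThrough trigger are assumptions for illustration, not code from the question:
// Sketch: hide the video behind a placeholder until it can play through.
// Component name, prop names and spinner markup are assumptions.
class VideoWithPlaceholder extends React.Component {
  constructor(props) {
    super(props);
    this.state = { loading: true };
  }

  render() {
    return (
      <div>
        {this.state.loading && <div className="spinner">Loading…</div>}
        <video
          src={this.props.video}
          style={{ display: this.state.loading ? 'none' : 'block' }}
          onCanPlayThrough={() => this.setState({ loading: false })}
          controls
        />
      </div>
    );
  }
}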

Related

The element has no supported sources when assigning video later on execution

I understand this is a question that has been asked many times, but after searching for a long time through "similar questions" none seemed to solve my problem. Here's my situation:
I need to play a video after the user clicks on a "Play Button". I can't put the video's location directly in the src of the <video> tag.
<video id="videoPlayer" controls autoplay></video>
I also have a VideoPlayer class that handles some information, including the video element.
function VideoPlayer(root, play) {
    this._play = play;
    this._root = root;
    this._root.style.display = "none";
}
Which is created with:
const videoPlayer = new VideoPlayer(document.getElementById("videoPlayer"), document.querySelector("#play"));
The second element being the play button.
I've also added to the prototype of VideoPlayer a play function:
VideoPlayer.prototype.play = function (url, immediate) {
    const root = this._root;
    const play = this._play;

    root.style.visibility = "visible";
    root.src = url;
    play.style.display = null;

    once(play, "click", function () {
        root.play();
    });

    return new Promise(function (resolve, reject) {
        function ok() {
            play.style.display = "none";
            root.style.display = null;
            clear();
        }
        function fail(error) {
            clear();
            reject(error);
        }
        function clear() {
            root.removeEventListener("play", ok);
            root.removeEventListener("abort", fail);
            root.removeEventListener("error", fail);
        }

        root.addEventListener("play", ok);
        root.addEventListener("abort", fail);
        root.addEventListener("error", fail);

        once(root, "ended", function () {
            root.style.display = "none";
            resolve();
        });
    });
};
And after I finally get the video's location, which is something like videos/video1.mp4 or videos/video2.mp4, I call the play function using:
const initialVideo = "videos/mov_bbb.mp4";
if (initialVideo.length > 0) {
    videoPlayer.play(initialVideo);
}
Exactly after calling the function, an error is thrown. Using breakpoints I've tracked it down to this line:
root.src = url;
The problem doesn't seem to happen with external videos. I've tested using this video from W3Schools: https://www.w3schools.com/html/mov_bbb.mp4 and then downloaded the video to use it locally. When it's external it works; when it's local it does not, throwing "The element has no supported sources".
I've tried exporting the same video using different codecs and extensions. This problem only happens on Chrome; Firefox works just fine. I can access the file via localhost:port/videos/mov_bbb.mp4 but it does not play. I'm using static-server-advance to serve the files, but I've also tried a different server to test it.
My Chrome version is 83.0.4103.10 and I'm not using any extensions.
I'm unable to use PHP or Node.
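For what it's worth, the MediaError on the element says which failure Chrome actually hit. A small debugging sketch (not part of the original question) that logs it, assuming the #videoPlayer element above:
// Debugging sketch: log the MediaError when the source fails to load.
// Assumes the <video id="videoPlayer"> element from the question.
const el = document.getElementById("videoPlayer");
el.addEventListener("error", function () {
    // 1 = aborted, 2 = network, 3 = decode, 4 = src not supported
    console.log("MediaError code:", el.error && el.error.code,
                el.error && el.error.message);
});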

How to open HTML5 video fullscreen if it was fullscreen before

I'm watching a series of videos on a website organised in a playlist. Each video is about 2 minutes long.
The website uses an HTML5 video player and supports auto-play. That is, each time a video ends, the next video is loaded and automatically played, which is great.
However, with fullscreen, even if I fullscreened a video previously, when the next video loads in the playlist the screen goes back to normal, and I have to click the fullscreen button again....
I've tried writing a simple javascript extension with Tampermonkey to load the video fullscreen automatically.
$(document).ready(function() {
    function makefull() {
        var vid = $('video')[0];
        if (vid.requestFullscreen) {
            vid.requestFullscreen();
        } else if (vid.mozRequestFullScreen) {
            vid.mozRequestFullScreen();
        } else if (vid.webkitRequestFullscreen) {
            vid.webkitRequestFullscreen();
        }
        //var vid = $('button.vjs-fullscreen-control').click();
    }
    makefull();
});
But I'm getting this error:
Failed to execute 'requestFullscreen' on 'Element': API can only be initiated by a user gesture.
It's extremely annoying to have to manually click fullscreen after each 2-minute video. Is there a way I can achieve this in my own browser? I'm using Chrome.
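For reference, Chrome only accepts the call when it happens inside a user-gesture handler such as a click. A minimal sketch (the button id is made up for illustration), which unfortunately still means one click per video:
// Sketch: requestFullscreen() is allowed inside a user-gesture handler.
// The #go-fullscreen button id is an assumption.
document.getElementById('go-fullscreen').addEventListener('click', function () {
  var vid = document.querySelector('video');
  if (vid && vid.requestFullscreen) {
    vid.requestFullscreen().catch(function (err) {
      console.error('Fullscreen request failed:', err);
    });
  }
});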
If you can get the list of URLs then you can create your own playlist. The code cannot be accurately tested within a cross-origin <iframe>, for example at plnkr.co, but it can be tested from the console at this very document. To test the code, you can use the urls variable at MediaFragmentRecorder and substitute the "pause" event for the "ended" event at .addEventListener().
If you have no control over the HTML or JavaScript used at the site, I'm not sure how to provide any code that will solve the inquiry.
const video = document.createElement("video");
video.controls = true;
video.autoplay = true;
// The element must be in the document before it can go fullscreen.
document.body.appendChild(video);

const urls = [
  {
    src: "/path/to/video/"
  }, {
    src: "/path/to/video/"
  }
];

(async () => {
  try {
    video.requestFullscreen = video.requestFullscreen
      || video.mozRequestFullScreen
      || video.webkitRequestFullscreen;
    let fullScreen = await video.requestFullscreen().catch(e => { throw e });
    console.log(fullScreen);
  } catch (e) {
    console.error(e.message);
  }
  // Play each URL in sequence, waiting for "ended" before moving on.
  for (const { src } of urls) {
    await new Promise(resolve => {
      video.addEventListener("canplay", e => {
        video.load();
        video.play();
      }, {
        once: true
      });
      video.addEventListener("ended", resolve, {
        once: true
      });
      video.src = src;
    });
  }
})();

Meteor DOMException: Unable to decode audio data

EDIT: I just created a new Meteor project and it worked :D wow. But it still doesn't work on my core project.. looks like I have different settings.
In my Meteor.js project I have 4 .mp3 files located at public/sounds/xyz.mp3.
I load these .mp3 files with:
let soundRequest = new XMLHttpRequest();
soundRequest.open('GET', this._soundPath, true);
soundRequest.responseType = 'arraybuffer';
let $this = this;
soundRequest.onload = function () {
    Core.getAudioContext().decodeAudioData(soundRequest.response, function (buffer) {
        $this.source.buffer = buffer;
        $this.source.loop = true;
        $this.source.connect($this.panner);
    });
};
soundRequest.send();
This WORKS on Google Chrome, but when I build the app via meteor run android-device, I get the following error message: DOMException: Unable to decode audio data
I wonder if this is a bug, because loading .png or .jpg works just fine in the mobile version. I have not installed any packages besides meteor add crosswalk, but uninstalling it doesn't help either.
You shouldn't need to do an HTTP request to get a local resource. You can just refer to a local URL. On an Android device the path is different. See this code:
function getSound(file) {
    var sound = "/sounds/" + file;
    if (Meteor.isCordova) {
        var sfile;
        if (device.platform.toLowerCase() === "android") {
            sfile = cordova.file.applicationDirectory.replace('file://', '') + 'www/application/app' + sound;
        } else {
            sfile = cordova.file.applicationDirectory.replace('file://', '') + sound;
        }
        var s = new Media(
            sfile,
            function (success) {
                console.log("Got sound " + file + " ok (" + sfile + ")");
                s.play();
            },
            function (err) {
                console.log("Get sound " + file + " (" + sfile + ") failed: " + err);
            }
        );
    } else {
        var a = new Audio(sound);
        a.play();
    }
}
On a device it loads the sound file asynchronously and then plays it. In the browser it just loads and plays it synchronously.
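For example, with the file location from the question (public/sounds/xyz.mp3):
// Plays public/sounds/xyz.mp3 in the browser and on a device.
getSound("xyz.mp3");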
This Web API is not supported on the Android device, but it works in the Chrome browser on Android.
Check the browser compatibility table at this link:
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/decodeAudioData
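If you stay with decodeAudioData, the promise-based form at least surfaces the failure instead of silently skipping the success callback (very old WebViews may not support the promise form). A sketch, reusing Core.getAudioContext() and the sound path from the question:
// Sketch: promise-based decodeAudioData with explicit error reporting.
// Core.getAudioContext() and the path come from the question's code.
var soundPath = '/sounds/xyz.mp3';
fetch(soundPath)
  .then(function (response) { return response.arrayBuffer(); })
  .then(function (data) { return Core.getAudioContext().decodeAudioData(data); })
  .then(function (buffer) {
    console.log('decoded', buffer.duration, 'seconds of audio');
  })
  .catch(function (err) {
    console.error('decodeAudioData failed:', err);
  });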

Reset Vimeo player when finished

How can I reset an embedded Vimeo video to its original onload state after it's done playing?
The Vimeo API offers an unload method
player.api("unload")
But it isn't working for non-flash players.
Using the Vimeo API, you can add an event for finish to trigger the reload. The Vimeo API includes a method unload(), but it isn't supported in HTML players. Instead, reset the URL in the iframe to return the video to its original state.
HTML
<iframe src="//player.vimeo.com/video/77984632?api=1" id="video"></iframe>
JS
var iframe = document.getElementById("video"),
    player = $f(iframe);

player.addEvent("ready", function() {
    player.addEvent('finish', function() {
        player.element.src = player.element.src;
    });
});
Note: unload() should now work properly across all players.
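With the current player.js SDK (the https://player.vimeo.com/api/player.js script also used in the next answer), the reset can be written without touching the iframe src. A sketch, assuming the <iframe id="video"> from the HTML above:
// Sketch using the current Vimeo Player SDK.
// Assumes the <iframe id="video"> from the answer above.
var vimeoPlayer = new Vimeo.Player(document.getElementById('video'));

vimeoPlayer.on('ended', function () {
  vimeoPlayer.unload().catch(function (err) {
    console.error('unload failed:', err);
  });
});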
A variation of Steve Robbins' solution, but Vimeo-specific. You don't have to reach the end of the video; it works any time the user bails out, including clicking on a button.
Simple JavaScript solution with the Vimeo library loaded:
https://player.vimeo.com/api/player.js
function ResetVideo()
{
    var Field = "iframe-video"; // <iframe id=iframe-video
    var iframe = document.getElementById(Field);
    var bLoad = LoadVimeoLib(); // Is the Vimeo lib loaded
    if (bLoad > 0)
    {
        var Videoplayer = new Vimeo.Player(iframe);
        Videoplayer.pause();           // Pause the video and audio
        Videoplayer.setCurrentTime(0); // Reset the video position

        // Reset the video back to the iframe
        var VideoSrc = Videoplayer.element.src; // Save the video source
        Videoplayer.element.src = "";           // Empty the source
        Videoplayer.element.src = VideoSrc;     // Reset the video source
    }
}

function LoadVimeoLib()
{
    if (typeof jQuery === 'undefined')
    {
        alert('no jquery installed');
        return 0;
    }
    var scriptlen = jQuery('script[src="https://player.vimeo.com/api/player.js"]').length;
    if (scriptlen == 0)
    {
        jQuery.ajax({
            type: "GET",
            url: "https://player.vimeo.com/api/player.js",
            dataType: "script"
        });
    }
    return 1;
}

Process and upload video using getUserMedia()

I am trying to upload video from users using the media capture interface recently introduced into JavaScript. Browser compatibility difficulties aside, I can't even begin to understand the process of saving the video captured from the users.
I was thinking that I could somehow use AJAX to push the streamed video up to the server, but I'm not even sure if I am approaching the problem appropriately.
I included my code, which currently only streams under Chrome and Opera.
function hasUserMedia()
{
    return !!(navigator.getUserMedia ||
              navigator.webkitGetUserMedia ||
              navigator.mozGetUserMedia ||
              navigator.msGetUserMedia);
}

if (hasUserMedia())
{
    var onFail = function(error)
    {
        alert("ERROR >> " + error);
    };

    var onPass = function(stream)
    {
        var video = document.querySelector('video');
        video.src = window.URL.createObjectURL(stream);
        video.onloadedmetadata = function(e)
        {
            //..what do I put here..?
        };
    }

    navigator.webkitGetUserMedia({video: true, audio: true}, onPass, onFail);
}
else
{
    alert("ERROR >> USERMEDIA NOT SUPPORTED");
}

function saveVideo()
{
    var connection = new XMLHttpRequest();
    connection.onreadystatechange = function()
    {
        if (connection.readyState == 4 && connection.status == 200)
        {
            alert("Your streamed video has been saved..!");
        }
    }
    //..what do I type here..?
    connection.open("POST", "savevideo.php", true);
    connection.send();
}
Currently you can't open a WebRTC connection to a server (though someone may be working on it...). AFAIK the only method is to take a screenshot of each frame by capturing the video to a canvas, then sending each frame to the server and compiling the video there.
Check out:
https://webrtc-experiment.appspot.com/RecordRTC/
Also may be helpful:
http://www.html5rocks.com/en/tutorials/getusermedia/intro/
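A rough sketch of the frame-by-frame approach described above; the /upload-frame endpoint and the 100 ms interval are assumptions for illustration:
// Sketch: copy frames from the playing <video> to a canvas and POST them.
// The /upload-frame endpoint and the interval are assumptions.
var frameVideo = document.querySelector('video');
var canvas = document.createElement('canvas');
var ctx = canvas.getContext('2d');

setInterval(function () {
  if (frameVideo.videoWidth === 0) return; // stream not ready yet
  canvas.width = frameVideo.videoWidth;
  canvas.height = frameVideo.videoHeight;
  ctx.drawImage(frameVideo, 0, 0); // copy the current frame

  canvas.toBlob(function (blob) { // encode the frame as JPEG
    var form = new FormData();
    form.append('frame', blob, 'frame.jpg');
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/upload-frame', true);
    xhr.send(form);
  }, 'image/jpeg');
}, 100);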
