I'm playing a video that weighs 31.6 MB on a website. The video plays fine on the latest desktop Chrome, but on mobile the frames seem to lag behind the audio, even after looping the video a couple of times, so by then it should already be fully loaded.
My best guess was that it's the size of the video. Playing a much smaller file (3 MB) and seeing that the image stayed in sync with the audio seemed to confirm this. But the fact that I preload the video, and that the problem persists even after it has played through more than once (after the first play-through there shouldn't be anything left to load), makes me believe it's something else.
I'm including the code I use to preload the file, in case it's needed.
HTML
<video id="videoId" data-src="public/videos/0620_ShakeHandam_VP9.webm" playsinline muted loop></video>
JS
document.addEventListener("DOMContentLoaded", function (event) {
  const video = document.getElementById("videoId");
  const url = video.dataset.src; // the URL lives in data-src so the browser doesn't also fetch it via the src attribute
  const xhr = new XMLHttpRequest();
  const loadingStartedDate = Date.now();
  document.body.classList.add("loading");
  xhr.open("GET", url, true);
  xhr.responseType = "arraybuffer";
  xhr.onload = function (oEvent) {
    const blob = new Blob([oEvent.target.response], { type: "video/webm" }); // use your video's MIME type
    const loadingTime = Date.now() - loadingStartedDate;
    video.src = URL.createObjectURL(blob);
    alert(`VIDEO LOADED: ${loadingTime}ms`);
    console.log(`Video loaded after ${loadingTime}ms`);
    document.body.classList.remove("loading");
    document.body.classList.add("loaded");
    // video.play() if you want it to play on load
  };
  xhr.onprogress = function (oEvent) {
    if (oEvent.lengthComputable) {
      const percentComplete = (oEvent.loaded / oEvent.total) * 100;
      console.log("PROGRESS", percentComplete);
      document.getElementById("load-percentage").textContent = `${percentComplete.toFixed(2)}%`;
      // do something with this
    }
  };
  xhr.send();
});
EDIT #1
The video in question has a transparent background. After further testing, I believe that may be the cause of the problem: it doesn't seem to happen with videos without a transparent background (mp4 or webm).
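One way to separate a loading problem from a decoding problem is to check whether the element has actually buffered its whole duration; if it has and the frames still trail the audio, the bottleneck is decoding rather than the network (alpha-channel WebM is often not hardware-decoded on mobile, which would match the EDIT). This is a minimal sketch; the `coversDuration` helper and the 0.5 s tolerance are my own assumptions, and `videoId` is the id from the question.

```javascript
// Returns true when the buffered ranges cover (almost) the whole
// duration; the tolerance absorbs rounding at the end of the file.
function coversDuration(ranges, duration, tolerance = 0.5) {
  // ranges: array of [start, end] pairs, as reported by video.buffered
  let covered = 0;
  for (const [start, end] of ranges) covered += end - start;
  return covered >= duration - tolerance;
}

// Browser-only part: read the real element's buffered ranges.
if (typeof document !== "undefined") {
  const video = document.getElementById("videoId");
  const ranges = [];
  for (let i = 0; i < video.buffered.length; i++) {
    ranges.push([video.buffered.start(i), video.buffered.end(i)]);
  }
  console.log("fully buffered:", coversDuration(ranges, video.duration));
}
```

If this logs `true` on the laggy device, the network and preloading can be ruled out.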
First of all, hello everyone.
I need to archive videos on Crunchyroll for a project, but no matter how much I reverse engineer, I can't find the main source file.
To start with, I have a Blob-sourced player like this:
<video id="player0" playsinline="" src="blob:https://static.crunchyroll.com/3740...db01b2" style="display: flex; width: 100%; height: 100%;"></video>
The first problem is that the video is streamed rather than served directly, so this solution doesn't work in this case:
<a href="blob:https://static.crunchyroll.com/3740...db01b2" download>Download</a>
After that I realized that Crunchyroll has developed even stronger protection than YouTube, where I could get the source video by playing with the range parameter.
Then I tried to pull the content with JavaScript, but I still couldn't get a result.
var xhr = new XMLHttpRequest();
xhr.responseType = 'blob';
xhr.onload = function () {
  var recoveredBlob = xhr.response;
  var reader = new FileReader();
  reader.onload = function () {
    var blobAsDataURL = reader.result;
    window.location = blobAsDataURL;
  };
  reader.readAsDataURL(recoveredBlob);
};
xhr.open('GET', 'blob:https://static.crunchyroll.com/893...2960');
xhr.send();
When I try it on the Crunchyroll page, I get either a cross-origin error or a "file not available" error.
Then I thought of trying to stream it via the VLC player. But when I looked at the Network tab, I saw that the stream is delivered in an extremely convoluted way, not in m3u8 format, so I gave up on that without trying.
Does anyone know what I can do?
Is it possible to make the HTML5 video tag play fragments as soon as, and as many times as, I append them to the SourceBuffer? So that, for example, if I append fragment2, it immediately plays fragment2 even if there was no fragment1, and if I append fragment2 again, it plays fragment2 again.
I've already tried passing MediaSource an initial segment plus fragment2 without fragment1, but what I found is that instead of playing it immediately, it just fills up the corresponding part of the timeline. If I seek to the corresponding place, it starts playing, but if I don't, it won't (unless I also pass fragment1, of course). To say nothing of the fact that even if I append fragment2 to the SourceBuffer again, it will not automatically play it again until I seek.
Here is a way that I'm using to produce a MP4/DASH stream:
$ MP4Box -dash 30000 -dash-profile on-demand -segment-name output-seg output.mp4
$ cat output-seginit.mp4 output-seg2.m4s >test.mp4
Here is a corresponding HTML5/JavaScript code:
<audio id="player" controls></audio>
<script>
var player = document.querySelector("#player");
var mediaSource = new MediaSource();
player.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', function () {
  console.log('sourceopen');
  var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="mp4a.40.2"'); // My video only contains an audio stream
  var xhr = new XMLHttpRequest();
  xhr.responseType = "arraybuffer";
  xhr.onload = function (event) {
    console.log("appendBuffer");
    sourceBuffer.appendBuffer(event.target.response);
  };
  xhr.open('GET', 'test.mp4');
  xhr.send(null);
});
</script>
Is it possible to do what I want? Also, I'd appreciate it if somebody could suggest the proper term for this; I'd call it "live streaming", but I'm not sure.
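One thing worth trying is the SourceBuffer's `'sequence'` mode: in that mode each appended fragment is placed right after the previously appended media on the timeline, regardless of the timestamps baked into the fragment, which is closer to the "play whatever I append next" behavior described above. This is an untested sketch against the stream from the question; the `concatSegments` helper just mirrors the `cat output-seginit.mp4 output-seg2.m4s` step in JS.

```javascript
// Concatenate an init segment and a media fragment, mirroring
// `cat output-seginit.mp4 output-seg2.m4s > test.mp4`.
function concatSegments(init, fragment) {
  const out = new Uint8Array(init.byteLength + fragment.byteLength);
  out.set(new Uint8Array(init), 0);
  out.set(new Uint8Array(fragment), init.byteLength);
  return out.buffer;
}

// Browser-only sketch: 'sequence' mode appends each fragment
// immediately after the previously appended one.
if (typeof document !== "undefined") {
  const player = document.querySelector("#player");
  const mediaSource = new MediaSource();
  player.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener("sourceopen", function () {
    const sb = mediaSource.addSourceBuffer('video/mp4; codecs="mp4a.40.2"');
    sb.mode = "sequence"; // ignore the fragment's internal timestamps
    fetch("test.mp4")
      .then(function (r) { return r.arrayBuffer(); })
      .then(function (buf) { sb.appendBuffer(buf); });
  });
}
```

Re-appending the same fragment in sequence mode should schedule it again at the end of the buffered media instead of overwriting its original timeline position.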
I am using the Web Audio API's createMediaElementSource, which works fine on Firefox (Gecko) and Chrome (Blink) but not Safari (WebKit). This is a big problem for me, since I prefer getting the audio from my HTML5 audio players rather than using XMLHttpRequests, the latter being too slow.
My first attempt was to take the source URL from the audio tag and fetch it with an XMLHttpRequest. As expected it works, but the decoding is very slow, and I can't pause the audio with stop(), because resuming triggers another round of decoding the entire file before anything can be heard.
A stackoverflow user named Kevin Ennis gave me an important advice which is a really great idea:
You could break the audio up into a number of smaller files. Like,
maybe break it up into 4 separate 1MB audio files and load them in
order. Then you can start playback after the first one loads, and
while that's playing, you load the other ones.
My question is, how do I do this technically? I am not aware of any function that checks whether an audio file has finished loading and decoding.
I imagine it would look something like this:
var source = document.getElementsByTagName("audio")[0].src;
var fileExt = source.lastIndexOf('.');
var currentFile = 1;
var totalFiles = 4; // however many parts the file was split into

var loadAudioFile = function (url) {
  var request = new XMLHttpRequest();
  request.open("GET", url, true);
  request.responseType = "arraybuffer";
  request.onload = function () {
    context.decodeAudioData(request.response, function (buffer) {
      convolver.buffer = buffer;
      // this part has finished decoding; queue the next one
      if (currentFile < totalFiles) {
        currentFile += 1;
        loadAudioFile(source.slice(0, fileExt) + "_part" + currentFile + ".mp3");
      }
    });
  };
  request.send();
};
loadAudioFile(source.slice(0, fileExt) + "_part1.mp3");
Will my idea work or would it utterly fail? What would you suggest I do about the long decoding time?
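Kevin Ennis's suggestion can be sketched concretely: the success callback of decodeAudioData is effectively the "this chunk is ready" signal, and the AudioContext clock lets each chunk be scheduled to start exactly where the previous one ends, so later chunks download while the first one plays. The `_partN` naming scheme, the chunk count of 4, and the `chunkUrl` helper are assumptions for illustration; adjust them to however the file is actually split.

```javascript
// Build "song_part2.mp3" from "song.mp3". The naming scheme is an
// assumption; change it to match how you split the files.
function chunkUrl(src, n) {
  const dot = src.lastIndexOf(".");
  return src.slice(0, dot) + "_part" + n + src.slice(dot);
}

// Browser-only sketch: start chunk 1 as soon as it decodes and queue
// each following chunk on the AudioContext clock for gapless playback.
if (typeof window !== "undefined" && window.AudioContext) {
  const context = new AudioContext();
  const src = document.getElementsByTagName("audio")[0].src;
  const totalChunks = 4; // assumption: the file was split into 4 parts
  let startAt = 0;       // context-clock time at which the next chunk starts

  function loadChunk(n) {
    fetch(chunkUrl(src, n))
      .then(function (r) { return r.arrayBuffer(); })
      .then(function (data) { return context.decodeAudioData(data); })
      .then(function (buffer) {
        const node = context.createBufferSource();
        node.buffer = buffer;
        node.connect(context.destination);
        if (n === 1) startAt = context.currentTime;
        node.start(startAt);         // first chunk: now; later chunks: queued
        startAt += buffer.duration;  // next chunk begins where this one ends
        if (n < totalChunks) loadChunk(n + 1); // fetch while playing
      });
  }
  loadChunk(1);
}
```

Scheduling by accumulated duration avoids the audible gap you would get by waiting for an onended event before starting the next source.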
In Google Chrome:
One .wav file is played, looping. Another .wav file is played from time to time as a sound effect.
When the sound effect plays, the volume of the looping sound automatically decreases. The volume gradually increases again over about 15 seconds.
(I guess it's automatic ducking: http://en.wikipedia.org/wiki/Ducking )
I don't want the volume of the loop to decrease when the sound effect plays. How can I prevent this behaviour?
Example: http://www.matthewgatland.com/games/takedown/play/web/audiofail.html
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();

var play = function (buffer, loop) {
  var source = context.createBufferSource();
  source.buffer = buffer;
  if (loop) source.loop = true;
  source.connect(context.destination);
  source.start(0);
};

var load = function (url, callback) {
  var request = new XMLHttpRequest();
  request.open('GET', url, true);
  request.responseType = 'arraybuffer';
  request.onload = function () {
    context.decodeAudioData(request.response, function (buffer) {
      callback(buffer);
    }, null);
  };
  request.send();
};

var musicSound;
var thudSound;
load("res/snd/music0.wav", function (buffer) {
  musicSound = buffer;
});
load("res/snd/thud0.wav", function (buffer) {
  thudSound = buffer;
});
Once the sounds have loaded, call:
play(musicSound, true); //start the music looping
//each time you call this, the music becomes quiet for a few seconds
play(thudSound, false);
You might have to do some sound design before you put this on your website. Whatever editor you are using, try editing the sounds so that their overall levels are closer to the level of the original looping sound. That way there won't be as dramatic a difference in levels triggering the automatic gain reduction: the combination of both sounds is too loud, so the louder of the two pulls down the level of the softer one. If you bring them closer together in level, the overall difference shouldn't be as drastic when, or if, the gain reduction kicks in.
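The level-matching can also be done in code instead of re-exporting the files: routing each source through its own GainNode lets you turn the effect down relative to the music. This is a sketch of that idea; the `dbToGain` helper and the example -6 dB figure are my own additions, not part of the original answer.

```javascript
// Convert a decibel offset to the linear gain value a GainNode expects.
function dbToGain(db) {
  return Math.pow(10, db / 20);
}

// Browser-only sketch: same play() as in the question, but with a
// per-source GainNode so the sound effect can be attenuated in code.
if (typeof window !== "undefined" && window.AudioContext) {
  const context = new AudioContext();
  const play = function (buffer, loop, db) {
    const source = context.createBufferSource();
    const gain = context.createGain();
    gain.gain.value = dbToGain(db); // e.g. -6 to tame a loud effect
    source.buffer = buffer;
    if (loop) source.loop = true;
    source.connect(gain);
    gain.connect(context.destination);
    source.start(0);
  };
  // play(musicSound, true, 0);
  // play(thudSound, false, -6);
}
```

Attenuating the effect before it reaches the destination keeps the summed output quieter, so any automatic gain reduction has less reason to kick in.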
I'm struggling to fetch an HTML5 video using XHR2 and the blob responseType with Chrome on Android 4.2. The code works perfectly on desktop Chrome and Firefox, and on Firefox on Android 4.2 (with desktop FF, I use a webm video instead of the mp4).
// Taking care of prefixes
window.URL = window.URL || window.webkitURL;

// This function downloads the video
var loadVideo = function () {
  var xhr = new XMLHttpRequest();
  xhr.addEventListener('load', addVideoFile, false);
  xhr.open('GET', "videos/myvideo.mp4", true);
  xhr.responseType = 'blob';
  xhr.send();
};

// This function sets the video source
var addVideoFile = function () {
  if (4 == this.readyState && 200 == this.status) {
    var video = document.getElementById('vid'),
        blob = this.response;
    video.src = window.URL.createObjectURL(blob);
    console.log('video ready');
  }
};

loadVideo();
Can anyone explain to me why this does not work with Chrome on Android? If I plug in my phone and use remote debugging, the console displays 'video ready', suggesting the video was downloaded, but it's impossible to play it: the video is just a black screen.
Also, this code works if I use it to fetch images instead of videos. Is there a limitation I'm not aware of that prevents downloading blobs above a certain size? (My video is 1.5 MB.)
Thank you very much for your help!
This is most certainly a bug. If you get something that works on Desktop Chrome but not Android then 99.5% of the time it is an issue we need to fix.
I have replicated your issue http://jsbin.com/uyehun/1 and I have filed the bug too https://code.google.com/p/chromium/issues/detail?id=253465
Per http://caniuse.com/bloburls, on Android 4.0-4.3 you need to use window.webkitURL.createObjectURL() instead of window.URL.createObjectURL().
This will let you generate a blob URL, though I haven't actually been able to get a video element to play such a URL.
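The prefix difference can be handled with a tiny shim that picks whichever implementation the browser exposes, so the same code runs on Android 4.0-4.3 and on modern browsers. A minimal sketch; `pickURLImpl` is a hypothetical helper name:

```javascript
// Return whichever URL implementation the environment exposes;
// Android 4.0-4.3 stock browsers only ship the webkit-prefixed one.
function pickURLImpl(w) {
  return w.URL || w.webkitURL || null;
}

// Browser-only usage: create a blob URL with whichever API exists.
if (typeof window !== "undefined") {
  const impl = pickURLImpl(window);
  if (impl) {
    // const blobUrl = impl.createObjectURL(someBlob);
    // video.src = blobUrl;
  }
}
```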