I am trying to add an audio stream to an HTML audio element with the JavaScript Web Audio API. The code below works perfectly with a source file on the server, but it will NOT play a stream. Starting a stream with an HTML5 audio element at the HTML level works fine, but the rest of the snippet will not work. This is my JavaScript and HTML.
<!DOCTYPE HTML>
<html lang="en">
<head>
    <style>
        div#mp3_player{ width:500px; height:90px; background:#000; padding:5px; margin:50px auto; }
        div#mp3_player > div > audio{ width:500px; background:#000; float:left; }
        div#mp3_player > canvas{ width:500px; height:60px; background:#002D3C; float:left; }
    </style>
    <script>
        // Create a new instance of an audio object and adjust some of its properties
        var audio = new Audio();
        audio.src = 'Five Finger Death Punch - Wrong Side Side of Heaven HD.mp3';
        audio.controls = true;
        audio.loop = true;
        audio.autoplay = true;
        // Establish all variables that your Analyser will use
        var canvas, ctx, source, context, analyser, fbc_array, bars, bar_x, bar_width, bar_height;
        // Initialize the MP3 player after the page loads all of its HTML into the window
        window.addEventListener("load", initMp3Player, false);
        function initMp3Player(){
            document.getElementById('audio_box').appendChild(audio);
            context = new webkitAudioContext(); // AudioContext object instance
            analyser = context.createAnalyser(); // AnalyserNode method
            canvas = document.getElementById('analyser_render');
            ctx = canvas.getContext('2d');
            // Re-route audio playback into the processing graph of the AudioContext
            source = context.createMediaElementSource(audio);
            source.connect(analyser);
            analyser.connect(context.destination);
            frameLooper();
        }
        // frameLooper() animates any style of graphics you wish to the audio frequency
        // Looping at the default frame rate that the browser provides (approx. 60 FPS)
        function frameLooper(){
            window.webkitRequestAnimationFrame(frameLooper);
            fbc_array = new Uint8Array(analyser.frequencyBinCount);
            analyser.getByteFrequencyData(fbc_array);
            ctx.clearRect(0, 0, canvas.width, canvas.height); // Clear the canvas
            ctx.fillStyle = '#00CCFF'; // Color of the bars
            bars = 100;
            for (var i = 0; i < bars; i++) {
                bar_x = i * 3;
                bar_width = 2;
                bar_height = -(fbc_array[i] / 2);
                // fillRect( x, y, width, height ) // Explanation of the parameters below
                ctx.fillRect(bar_x, canvas.height, bar_width, bar_height);
            }
        }
    </script>
</head>
<body>
    <div id="mp3_player">
        <div id="audio_box"></div>
        <canvas id="analyser_render"></canvas>
    </div>
</body>
</html>
My problem is: a file on the server works fine, but a stream will not play. It appears to load, but I get no sound from the stream.
A demo of it working with a source file is at enter link description here.
The stream I am trying to play is http://173.35.234.36:8000/stream
Thanks for any help that may be offered :)
As it turns out, your endpoint, http://173.35.234.36:8000/stream, does not support CORS. When it is requested from a different origin (i.e. a different domain, port, or protocol), the browser withholds the audio data from the Web Audio graph, with the following console warning:
MediaElementAudioSource outputs zeroes due to CORS access restrictions for http://173.35.234.36:8000/stream
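If you control the streaming server, the usual fix has two parts: the server must send an Access-Control-Allow-Origin header for the stream, and the client must request the media in CORS mode. Below is a minimal client-side sketch, assuming the server is configured that way (nothing on the client can compensate for a missing server header):

```js
// Request the stream in CORS mode; this only helps if the
// Icecast/Shoutcast server replies with Access-Control-Allow-Origin.
var audio = new Audio();
audio.crossOrigin = "anonymous";   // must be set before assigning src
audio.src = "http://173.35.234.36:8000/stream";
audio.controls = true;

var context = new (window.AudioContext || window.webkitAudioContext)();
var source = context.createMediaElementSource(audio);
var analyser = context.createAnalyser();
source.connect(analyser);
analyser.connect(context.destination); // the analyser now sees real samples instead of zeroes
```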
Is it possible to display an HTML5 video as part of the canvas?
Basically the same way you draw an Image on the canvas:
context.drawVideo(vid, 0, 0);
Thanks!
var canvas = document.getElementById('canvas');
var ctx = canvas.getContext('2d');
var video = document.getElementById('video');

video.addEventListener('play', function () {
    var $this = this; //cache
    (function loop() {
        if (!$this.paused && !$this.ended) {
            ctx.drawImage($this, 0, 0);
            setTimeout(loop, 1000 / 30); // drawing at 30fps
        }
    })();
}, 0);
I guess the above code is self-explanatory. If not, drop a comment below and I will try to explain the above few lines of code.
Edit :
here's an online example, just for you :)
Demo
var canvas = document.getElementById('canvas');
var ctx = canvas.getContext('2d');
var video = document.getElementById('video');

// set canvas size = video size when known
video.addEventListener('loadedmetadata', function() {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
});

video.addEventListener('play', function() {
    var $this = this; //cache
    (function loop() {
        if (!$this.paused && !$this.ended) {
            ctx.drawImage($this, 0, 0);
            setTimeout(loop, 1000 / 30); // drawing at 30fps
        }
    })();
}, 0);
<div id="theater">
    <video id="video" src="http://upload.wikimedia.org/wikipedia/commons/7/79/Big_Buck_Bunny_small.ogv" controls></video>
    <canvas id="canvas"></canvas>
    <label>
        <br />Try to play me :)</label>
    <br />
</div>
Here's a solution that uses more modern syntax and is less verbose than the ones already provided:
const canvas = document.querySelector("canvas");
const ctx = canvas.getContext("2d");
const video = document.querySelector("video");

video.addEventListener("play", () => {
    function step() {
        ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
        requestAnimationFrame(step);
    }
    requestAnimationFrame(step);
});
Some useful links:
MDN Documentation for window.requestAnimationFrame()
Can I use requestAnimationFrame?
Using canvas to display Videos
Displaying a video is much the same as displaying an image. The minor differences are to do with the load events and the fact that you need to render the video every frame, or you will only see a single frame rather than the animation.
The demo below has some minor differences from the example: a mute function (under the video, click mute/sound on to toggle sound) and some error checking to catch IE9+ and Edge if they don't have the correct drivers.
Keeping answers current.
The previous answer by user372551 is out of date (December 2010) and has a flaw in the rendering technique used. It uses setTimeout with a rate of 33.333...ms, which setTimeout rounds down to 33ms; this will cause frames to be dropped every two seconds, and many more may be dropped if the video frame rate is any higher than 30. Using setTimeout will also introduce video shearing, because setTimeout cannot be synced to the display hardware.
There is currently no reliable method to determine a video's frame rate, so unless you know the frame rate in advance you should display it at the maximum refresh rate available in browsers: 60fps.
The top answer was, for its time (6 years ago), the best solution, as requestAnimationFrame was not widely supported (if at all), but requestAnimationFrame is now standard across the major browsers and should be used instead of setTimeout to reduce or remove dropped frames and to prevent shearing.
The example demo
Loads a video and sets it to loop. The video will not play until you click on it; clicking again will pause. There is a mute/sound-on button under the video. The video is muted by default.
Note for users of IE9+ and Edge: you may not be able to play the WebM video format, as it needs additional drivers. They can be found at tools.google.com: Download IE9+ WebM support.
// This code is from the example document on stackoverflow documentation. See HTML for link to the example.
// This code is almost identical to the example. Mute has been added and a media source. Also added some error handling in case the media load fails and a link to fix IE9+ and Edge support.
// Code by Blindman67.
// Original source now returns 404
// var mediaSource = "http://video.webmfiles.org/big-buck-bunny_trailer.webm";
// New source from wiki commons. Attribution in the leading credits.
var mediaSource = "http://upload.wikimedia.org/wikipedia/commons/7/79/Big_Buck_Bunny_small.ogv";
var muted = true;

var canvas = document.getElementById("myCanvas"); // get the canvas from the page
var ctx = canvas.getContext("2d");
var videoContainer; // object to hold video and associated info
var video = document.createElement("video"); // create a video element
video.src = mediaSource;
// the video will now begin to load.
// As some additional info is needed we will place the video in a
// containing object for convenience
video.autoPlay = false; // ensure that the video does not auto play
video.loop = true; // set the video to loop.
video.muted = muted;
videoContainer = { // we will add properties as needed
    video : video,
    ready : false,
};

// To handle errors. This is not part of the example at the moment. Just fixing for Edge that did not like the ogv format video
video.onerror = function(e){
    document.body.removeChild(canvas);
    document.body.innerHTML += "<h2>There is a problem loading the video</h2><br>";
    document.body.innerHTML += "Users of IE9+ , the browser does not support WebM videos used by this demo";
    document.body.innerHTML += "<br><a href='https://tools.google.com/dlpage/webmmf/'> Download IE9+ WebM support</a> from tools.google.com<br> this includes Edge and Windows 10";
}
video.oncanplay = readyToPlayVideo; // set the event to the play function that
                                    // can be found below

function readyToPlayVideo(event){ // this is a referance to the video
    // the video may not match the canvas size so find a scale to fit
    videoContainer.scale = Math.min(
        canvas.width / this.videoWidth,
        canvas.height / this.videoHeight);
    videoContainer.ready = true;
    // the video can be played so hand it off to the display function
    requestAnimationFrame(updateCanvas);
    // add instruction
    document.getElementById("playPause").textContent = "Click video to play/pause.";
    document.querySelector(".mute").textContent = "Mute";
}

function updateCanvas(){
    ctx.clearRect(0,0,canvas.width,canvas.height);
    // only draw if loaded and ready
    if(videoContainer !== undefined && videoContainer.ready){
        // find the top left of the video on the canvas
        video.muted = muted;
        var scale = videoContainer.scale;
        var vidH = videoContainer.video.videoHeight;
        var vidW = videoContainer.video.videoWidth;
        var top = canvas.height / 2 - (vidH / 2) * scale;
        var left = canvas.width / 2 - (vidW / 2) * scale;
        // now just draw the video the correct size
        ctx.drawImage(videoContainer.video, left, top, vidW * scale, vidH * scale);
        if(videoContainer.video.paused){ // if not playing show the paused screen
            drawPayIcon();
        }
    }
    // all done for display
    // request the next frame in 1/60th of a second
    requestAnimationFrame(updateCanvas);
}

function drawPayIcon(){
    ctx.fillStyle = "black"; // darken display
    ctx.globalAlpha = 0.5;
    ctx.fillRect(0,0,canvas.width,canvas.height);
    ctx.fillStyle = "#DDD"; // colour of play icon
    ctx.globalAlpha = 0.75; // partly transparent
    ctx.beginPath(); // create the path for the icon
    var size = (canvas.height / 2) * 0.5; // the size of the icon
    ctx.moveTo(canvas.width/2 + size/2, canvas.height / 2); // start at the pointy end
    ctx.lineTo(canvas.width/2 - size/2, canvas.height / 2 + size);
    ctx.lineTo(canvas.width/2 - size/2, canvas.height / 2 - size);
    ctx.closePath();
    ctx.fill();
    ctx.globalAlpha = 1; // restore alpha
}

function playPauseClick(){
    if(videoContainer !== undefined && videoContainer.ready){
        if(videoContainer.video.paused){
            videoContainer.video.play();
        }else{
            videoContainer.video.pause();
        }
    }
}

function videoMute(){
    muted = !muted;
    if(muted){
        document.querySelector(".mute").textContent = "Mute";
    }else{
        document.querySelector(".mute").textContent = "Sound on";
    }
}

// register the events
canvas.addEventListener("click", playPauseClick);
document.querySelector(".mute").addEventListener("click", videoMute);
body {
    font : 14px arial;
    text-align : center;
    background : #36A;
}
h2 {
    color : white;
}
canvas {
    border : 10px white solid;
    cursor : pointer;
}
a {
    color : #F93;
}
.mute {
    cursor : pointer;
    display : initial;
}
<h2>Basic Video & canvas example</h2>
<p>Code example from Stackoverflow Documentation HTML5-Canvas<br>
Basic loading and playing a video on the canvas</p>
<canvas id="myCanvas" width = "532" height ="300" ></canvas><br>
<h3><div id = "playPause">Loading content.</div></h3>
<div class="mute"></div><br>
<div style="font-size:small">Attribution in the leading credits.</div><br>
Canvas extras
Using the canvas to render video gives you additional options in regard to displaying and mixing in fx. The following image shows some of the FX you can get using the canvas. Using the 2D API gives a huge range of creative possibilities.
(Image from the answer Fade canvas video from greyscale to color.)
See the video title in the above demo for attribution of the content in the above image.
You need to update the video element's currentTime and then draw the frame on the canvas. Don't trigger play() on the video.
You can also use, for example, this plugin: https://github.com/tstabla/stVideo
I started with the answer from 2018 about requestAnimationFrame. However, it has three problems.
unnecessarily complex
there is a new way to handle this task (since about 2020)
attaching an event listener to the play event invites performance issues
First, the event listener can simply be connected to the function that does your desired processing and schedules the next call to itself. There's no need to wrap it in an anonymous function with another call to requestAnimationFrame.
Second, don't use requestAnimationFrame. That function schedules the next call to your callback as quickly as the browser can handle (generally 60Hz), which results in a significant processing workload. video.requestVideoFrameCallback only calls your callback when the video proceeds to its next frame. This reduces the workload when a video runs at less than 60 FPS and obviates the need for any processing at all while the video isn't playing, significantly improving performance.
Third, an event listener attached to the play event will fire whenever you tell the video to video.play(), which you'll do every time you load a new video into the video element and tell it to start playing, and also when you resume playback after using video.pause(). Thus, the video is drawn on the canvas (plus whatever other processing you're doing) once for each time the video has been told to play(), which quickly accumulates.
If you're sure you'll only be playing one video which you'd like to pause and resume, you can toggle play/pause by changing the playback rate, e.g. video.playbackRate = !video.playbackRate. If you'll be loading multiple videos into this element, it's better to forego the play event listener entirely and insert a manual call to step() when you load your first video to get that started. Note that this is active on the video element, not on specific loaded videos, so you'll need to either set and check a flag to ensure that you only call step() when loading the first video, or cancel any active video frame request before making a new one (shown below).
let animation_handle;

const canvas = document.querySelector("canvas");
const ctx = canvas.getContext("2d");
const video = document.querySelector("video");

video.addEventListener('loadeddata', video_load_callback, false);

function video_load_callback() {
    video.cancelVideoFrameCallback(animation_handle);
    step();
}

function step() { // update the canvas when a video proceeds to next frame
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    animation_handle = video.requestVideoFrameCallback(step);
}
I've created a simple music player, which creates a buffer array for a particular audio URL to play the music.
It works fine in many of my cellphone's browsers, so I guess there is no cross-origin issue with the audio URL; however, Chrome is not playing the audio.
I've also created a Uint8Array for plotting frequency data on the canvas; while many browsers plot the frequency graph successfully, Chrome does not!
Take a look at what I've tried so far!
```
<!DOCTYPE html>
<html>
<head>
    <title>Page Title</title>
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body>
    <center>
        <h1>Music Player</h1>
        <hr>
        <div id="div"></div>
        <canvas></canvas>
        <p>Frequency plot</p>
    </center>
    <script>
        url = "https://dl.dropbox.com/s/5jyylqps64nyoez/Legends%20never%20die.mp3?dl=0";
        const div = document.querySelector("#div");
        const cvs = document.querySelector("canvas");
        cvs.width = window.innerWidth - 20;
        cvs.height = 200;
        const c = cvs.getContext("2d");

        function loadMusic(url){
            div.innerHTML = "Loading music, please wait...";
            const context = new AudioContext();
            const source = context.createBufferSource();
            const analyser = context.createAnalyser();
            let request = new XMLHttpRequest();
            request.open("GET", url, true);
            request.responseType = "arraybuffer";
            request.onload = ()=>{
                div.innerHTML = "Music loaded, please wait, music will be played soon...";
                context.decodeAudioData(request.response, suffer=>{
                    source.buffer = suffer;
                    source.connect(context.destination);
                    source.connect(analyser);
                    analyser.connect(context.destination);
                    source.start();
                    div.innerHTML = "Music is playing... Enjoy!";
                    setInterval(()=>{
                        c.clearRect(0,0,cvs.width,cvs.height);
                        let array = new Uint8Array(analyser.frequencyBinCount);
                        analyser.getByteFrequencyData(array);
                        let m = 0;
                        for(m = 0; m < array.length; m++){
                            let x = (parseInt(window.innerWidth -20)*m)/array.length;
                            c.beginPath();
                            c.moveTo(x,150-((100*array[m])/255));
                            c.lineTo((parseInt(window.innerWidth -20)*(m+1))/array.length,150-((100*array[m+1])/255));
                            c.lineWidth = 1;
                            c.strokeStyle = "black";
                            c.stroke();
                        }
                    },1);
                });
            }
            request.send();
        }
        loadMusic(url);
    </script>
</body>
</html>
```
This is more a couple of observations than a complete solution.
The code given worked for me on Edge, Chrome and Firefox on Windows 10.
On iOS 14 Safari and iOS 14 Chrome it seemed to stop after putting out the loading message.
This MDN reference uses a 'cross-browser' method to create an AudioContext, so I added this line:
var AudioContext = window.AudioContext || window.webkitAudioContext;
before this line:
const context = new AudioContext();
[edit: I have just confirmed at caniuse that the -webkit prefix is needed by Safari]
That seemed to do the trick in as much as the rest of the code was executed. However, there was no sound and it appeared the audio was not playing. The plot also showed just a single horizontal line.
Is this a manifestation of iOS's requirement that there must be some user interaction before audio will actually be played?
I'm pretty sure the audio was loaded, as there was a noticeable pause at that point. I suspect there will have to be a button which, when clicked, actually starts the playing.
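If that is the cause, a minimal sketch of the usual workaround, reusing loadMusic and url from the question's code and deferring the call until a tap (the button id is hypothetical, and the automatic loadMusic(url) call at the bottom of the script would be removed):

```html
<button id="playBtn">Play</button>
<script>
// iOS/Safari only allow audio to start from a user gesture, so the
// AudioContext inside loadMusic() is created within the click handler.
document.querySelector("#playBtn").addEventListener("click", () => {
    loadMusic(url);
}, { once: true });
</script>
```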
Summary
The API I speak about
The context
The problem: Actual result + Expected result
What I already have tried
Clues to help you to help me
Minimal and Testable Example: Prerequisites and Steps to follow + Sources
The API I speak about
https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder
The context
I have a video element that I load and play several times with different src values. Between two different videos, a transition occurs and each video appears with a fade-in. While a video is playing, I draw it in a canvas with the transition and the fade-in.
Everything I draw in the canvas is recorded using the Web API MediaRecorder. Then I download the WEBM file corresponding to this recording.
The problem
Actual result
The resulting WEBM is jerky. When I drew in the canvas, the drawing was fluid. But the WEBM resulting from the recording of the canvas drawing is jerky.
Expected result
The WEBM resulting from the recording of the canvas drawing must be as fluid as the canvas drawing itself. If exactly that result is impossible, then the WEBM must be approximately as fluid as the canvas drawing itself, so that it can't really be called jerky.
What I have already tried
First, I tried passing a timeslice to the MediaRecorder's start method: 1ms, 16ms (corresponding to 60fps), 100ms, then 1000, then 10000, etc., but it didn't work.
Second, I tried calling requestData every 16ms (in a simple JS timeout listener executed every 16ms), but it didn't work either (a sketch of both attempts follows).
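For reference, a minimal sketch of those two attempts, using the rec recorder from the sources below; the 16ms figure is just the one mentioned above:

```js
// Attempt 1: ask MediaRecorder to emit a chunk roughly every 16 ms (~60 fps).
rec.start(16);

// Attempt 2: start normally and pull data manually every 16 ms.
rec.start();
setInterval(() => rec.requestData(), 16);
```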
Clues to help you to help me
Perhaps I'm wrong, but it is possible that there is a hardware limit on my computer (I have an HP Spectre x360 with no dedicated graphics card, just a small integrated graphics chip), or a software limit in my Chromium browser (Version 81.0.4044.138 on Ubuntu 16.04 LTS).
If you can confirm this, it would settle this question. If you can explain how to deal with the problem, even better (an alternative to the MediaRecorder Web API, or something else).
Minimal and Testable Example
Prerequisites and Steps to follow
Have at least 2 WEBM videos (these will be the input videos; remember, we want to output a new video that contains the canvas drawing, which itself contains these two input videos plus the transitions and colour effects).
Have an HTTP server and a file called "index.html" that you will open with, e.g., Chromium v.81. Copy/paste the following sources into it and click the "Start" button (you won't need to click the "Stop" button); it will draw both videos on the canvas with the transitions and colour effects, record the canvas drawing, and a "Download the final output video" link will appear, allowing you to download the output video. You will see that it's jerky.
In the following sources, copy/paste the paths of your videos into the JS array called videos.
Sources
<html>
<head>
    <title>Creating Final Video</title>
</head>
<body>
    <button id="start">Start</button>
    <button id="stop_recording">Stop recording</button>
    <video id="video" width="320" height="240" controls>
        <source type="video/mp4">
    </video>
    <canvas id="canvas" width=3200 height=1608></canvas>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
    <script>
        var videos = []; // Populated by Selenium
        var canvas = document.getElementById("canvas");
        var ctx = canvas.getContext("2d");
        var video = document.querySelector("video");

        startRecording();

        $('#start').click(function() {
            showSomeMedia(0, 0);
        });

        function showSomeMedia(videos_counter) {
            resetVideo();
            if(videos_counter == videos.length) {
                $('#stop_recording').click();
                return;
            } else {
                setVideoSrc(videos_counter);
                setVideoListener();
                videos_counter++;
            }
            video.addEventListener('ended', () => {
                showSomeMedia(videos_counter);
            }, false);
        }

        function resetVideo() {
            var clone = video.cloneNode(true);
            video.remove();
            video = clone;
        }

        function setVideoSrc(videos_counter) {
            video.setAttribute("src", videos[videos_counter]);
            video.load();
            video.play();
        }

        function setVideoListener() {
            var alpha = 0.1;
            video.addEventListener('playing', () => {
                function step() {
                    if(alpha < 1) {
                        ctx.globalAlpha = alpha;
                    }
                    ctx.drawImage(video, 0, 0, canvas.width, canvas.height)
                    requestAnimationFrame(step)
                    if(alpha < 1) {
                        alpha += 0.00001;
                    }
                }
                requestAnimationFrame(step);
            }, false)
        }

        function startRecording() {
            const chunks = [];
            const stream = canvas.captureStream();
            const rec = new MediaRecorder(stream);
            rec.ondataavailable = e => chunks.push(e.data);
            $('#stop_recording').click(function() {
                rec.stop();
            });
            rec.onstop = e => exportVid(new Blob(chunks, {type: 'video/webm'}));
            window.setTimeout(function() {
                rec.requestData();
            }, 1);
            rec.start();
        }

        function exportVid(blob) {
            const vid = document.createElement('video');
            vid.src = URL.createObjectURL(blob);
            vid.controls = true;
            document.body.appendChild(vid);
            const a = document.createElement('a');
            a.download = 'my_final_output_video.webm';
            a.href = vid.src;
            a.textContent = 'Download the final output video';
            document.body.appendChild(a);
        }
    </script>
</body>
</html>
The problem is solved by using my gaming laptop (RoG), which has a good graphics card and can record at a higher frequency than my Spectre x360 development laptop.
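Independently of the hardware, two knobs worth checking are the canvas capture frame rate and the recorder bitrate; a minimal sketch with illustrative values (not a guaranteed fix for the jerkiness):

```js
// Ask the canvas for ~60 fps instead of the browser default, and give
// the encoder a generous bitrate. Both numbers are only illustrative.
const stream = canvas.captureStream(60);
const rec = new MediaRecorder(stream, {
    mimeType: 'video/webm',
    videoBitsPerSecond: 8000000
});
rec.start();
```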
I'm trying to load an image from a video file, draw it on a canvas, and modify the canvas pixels. I'm hosting this on a local web server, and it works fine on any desktop computer in my local network. However, when I try to open the same page on an Android Nexus 7 tablet (also on the local network) I get this error:
Failed to execute 'getImageData' on 'CanvasRenderingContext2D': The
canvas has been tainted by cross-origin data.
Here's my final test code:
<!DOCTYPE html>
<html>
<body>
    <canvas id="mainCanvas" width="1024" height="576" style="border:1px solid #d3d3d3;">
        Your browser does not support the HTML5 canvas tag.
    </canvas>
    <video id="myVideo" width="0" height="0">
        <source src="video.mp4" type='video/mp4' crossOrigin="anonymous">
    </video>
    <script>
        window.onload = function() {
            var objectImage = document.getElementById('myVideo');
            objectImage.crossOrigin = "Anonymous";
            objectImage.onloadeddata = function() {
                var canvas = document.getElementById("mainCanvas");
                var ctx = canvas.getContext("2d");
                canvas.addEventListener('click', function(event) {
                    objectImage.play();
                    setTimeout(function() {
                        ctx.drawImage(objectImage, 0, 0);
                        var width = objectImage.videoWidth;
                        var height = objectImage.videoHeight;
                        var imageData = ctx.getImageData(0, 0, width, height);
                        var pixels = imageData.data;
                        for (var i = 0; i < pixels.length; i = i + 4) pixels[i] = 0;
                        ctx.putImageData(imageData, 0, 0, 0, 0, width, height);
                    }, 3000);
                });
            }
        }
    </script>
</body>
</html>
As you can see, my video file is on the same domain as the rest of the code, and I'm not running this HTML as file:// but as http:// from a web server. I can see the unmodified frame on the screen just before it calls getImageData(), so I assume video playback worked fine.
I also tried drawing image files on the canvas instead of video, and in this case it seems to work fine, and I can call getImageData without a problem.
Does anyone have a working example of getImageData() after drawing a frame from a video file?
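Not a confirmed fix for this case, but one detail worth noting: the crossorigin attribute is defined on the <video> element itself (and is only honoured if it is present before the source is selected), not on the <source> tag. A minimal sketch of that markup:

```html
<video id="myVideo" width="0" height="0" crossorigin="anonymous">
    <source src="video.mp4" type="video/mp4">
</video>
```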