I'm dynamically loading little video elements and drawing their first frames to a bigger canvas (when you roll over them, they play). It usually works (90% of the time), but occasionally and seemingly at random one or other of the videos will draw a black box, meaning the image data sampled from the video is empty.
The way I'm doing this is to use the canplaythrough event on each of the videos to tell whether the video is ready to be sampled, but I'm wondering if there is a better event I should be using?
for instance:
myvid.addEventListener("canplaythrough", function () {
    // do the sampling now
});
But the above occasionally, and seemingly at random, draws a blank.
Any ideas? I've also tried the loadeddata and canplay events, but these were even less reliable.
I think the reason for the blank frame is that even a one-second video consists of many frames, so it may contain a blank frame somewhere in between. The canplaythrough event is the right one to use; you may just need to modify your sampling logic.
myvid.addEventListener("canplaythrough", function () {
    // check the intensity of some pixels of the sampled frame; if they are
    // all blank, wait and sample again shortly,
    // else do the sampling now
});
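Something like this, as a rough sketch of that retry idea (the ctx 2D context, the 160 x 90 target size and the 10 x 10 sample region are my own assumptions, not from your code):

function sampleWhenReady(vid, ctx, x, y) {
    ctx.drawImage(vid, x, y, 160, 90);            // draw the video's current frame, scaled
    var px = ctx.getImageData(x, y, 10, 10).data; // read back a small sample region
    var total = 0;
    for (var i = 0; i < px.length; i += 4) {
        total += px[i] + px[i + 1] + px[i + 2];   // sum RGB intensity, ignore alpha
    }
    if (total === 0) {
        // The frame came back black: wait a moment and sample again.
        setTimeout(function () { sampleWhenReady(vid, ctx, x, y); }, 50);
    }
}

myvid.addEventListener("canplaythrough", function () {
    sampleWhenReady(myvid, ctx, 0, 0); // ctx is the big canvas's 2D context (assumed)
});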
Related
I would like to freeze a frame of a <video /> and modify it in a <canvas />. However, I've found the result of the canvas "drawImage" function is always slightly darker than the original video. This is noticeable when the canvas is placed over top of the original video, which is necessary for my purposes. I don't understand why this is the case. Is there any way to get around it?
Here's an example: https://jsfiddle.net/atd8vbkg/
P.S. I tried mirroring the entire original video to a canvas, so the darkness would be consistent when I overlaid the frame. This worked on desktop; however, the video-to-canvas mirroring in mobile Safari was very slow/low FPS, and therefore not a viable solution.
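For reference, a minimal sketch of the freeze-frame setup described above (the element ids and the pause trigger are placeholders of mine, not taken from the fiddle):

var video = document.getElementById("video");
var canvas = document.getElementById("overlay");
var ctx = canvas.getContext("2d");

video.addEventListener("pause", function () {
    canvas.width = video.videoWidth;    // match the intrinsic video size
    canvas.height = video.videoHeight;  // so no scaling is introduced
    ctx.drawImage(video, 0, 0);         // copy the paused frame onto the canvas
});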
I have some javascript that crunches some numbers and draws to an HTML5 canvas, creating an animation. When the canvas isn't visible, the expensive and continuous number crunching / drawing is pointless.
So, when the user can't see the canvas (e.g. because they switched tabs or minimized the browser or scrolled down or various other possibilities), I'd like to put a hold on the computation.
In javascript, what's a good way to pause a periodic timer when a canvas isn't visible to the user? (And resume the timer when the canvas is visible again.)
In your other-browser-tab-focused scenario, requestAnimationFrame is ideal: the browser pauses its loop when the tab loses focus and resumes it when the tab regains focus.
In your scroll-canvas-offscreen scenario, you can compare the canvas's vertical position (its top, with and without its height) against the window's scroll position (scrollTop, with and without the viewport height).
To get the benefit of both tests, put the scroll check inside the requestAnimationFrame loop.
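A rough sketch of combining the two (the canvas id and the draw() function are assumptions; getBoundingClientRect is used here in place of the scrollTop arithmetic, which amounts to the same test):

var canvas = document.getElementById("myCanvas");

function isInViewport(el) {
    var rect = el.getBoundingClientRect();
    return rect.bottom > 0 && rect.top < window.innerHeight;
}

function loop() {
    if (isInViewport(canvas)) {
        draw(); // the expensive number crunching / drawing
    }
    requestAnimationFrame(loop); // the browser stops firing this while the tab is hidden
}
requestAnimationFrame(loop);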
I am currently working on a JavaScript (pure JS) based game. The game contains 5 large sprite sheets (e.g. 2861 × 768 and 4096 × 4864). When the game starts, all 5 sprite sheets are preloaded into canvas elements. Three of those 5 sprites together make up one animation, where each sprite contains 75 frames. When one sprite ends its animation, I hide it and display the next sprite. When the second sprite finishes animating, I hide it and display the third/last one.
When the second or third sprite is about to be displayed, a small delay of 0.5-1 s occurs while the image is being decoded.
This is not something that happens only the first time; it happens every time. The animation repeats every 5 minutes, and the small delay occurs each time.
The reason I'm using canvas elements for preloading is that I thought WebKit would throw away decoded images after they go unused for a while, and that the canvas elements would prevent WebKit from deleting them from memory. But that does not work.
I've tried almost every optimization I'm aware of. I have even refactored all my CSS by removing descendant selectors etc.
The renderer I'm using to draw those animations is one I built myself, and it works very well in Firefox, so it should not be the problem.
EDIT [2016/03/04]:
I made a canvas-based mode and the result is even worse: it lags a lot, and the delay remains the same. The problem occurs only in NW; it does not happen in Chrome or in Firefox.
Canvas mode: lags. Default (HTML) mode: works perfectly.
Codepen: My renderer http://codepen.io/anon/pen/JXPWXX
Note: If I hide those other elements with opacity:0.2 rather than opacity:0, the problem does not happen. But I cannot hide them like that, since they remain visible and they shouldn't be. With opacity:0.01 they are not visible and the problem does not happen in Chrome, but it still persists in NW.
In NW, when I switch from opacity:0.2 to opacity:1, an image decode is triggered. The same thing does not happen in the Chrome browser.
I am using the following version:
nw.js v0.12.3
io.js v1.2.0
Chromium 41.0.2272.76
commit hash: 591068b-b48a69e-27b6800-459755a-2bdc251-1764a45
The three sprite sheet images are 14.4 MB, 14.9 MB and 15.5 MB in size. Each sprite contains 75 frames.
Why could this be happening and how can I prevent it?
Try switching to Google Chrome directly, since the new NW version will probably not be released until 19.04.2016. After that, NW will hopefully keep up with every Chromium release.
You should not have the same problems in Chrome.
Given that making WebKit think the image is still displayed makes the problem disappear (as your opacity experiment shows), I'd move it almost completely out of the visible area, with only a single transparent row overlapping the viewport (using overflow: hidden). A sketch of this is below.
Note that an unpacked 4000 × 4000 sprite sheet will use 64 megabytes of RAM (4 bytes per pixel for RGBA), so perhaps it would be better to make sure the next image gets "warmed up" a bit ahead of time, rather than keeping all of them unpacked all the time.
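A rough sketch of the first idea, assuming the sprite sits inside a wrapper with position: relative and overflow: hidden (the element id is a placeholder):

var sprite = document.getElementById("spriteSheet2"); // inside the overflow:hidden wrapper

function parkAlmostOffscreen(el) {
    el.style.position = "absolute";
    el.style.top = (1 - el.offsetHeight) + "px"; // leave a single pixel row inside the wrapper
    el.style.opacity = "1";                      // fully "visible", so WebKit keeps it decoded
}

function bringOnscreen(el) {
    el.style.top = "0";
}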
I'd recommend using idata = ctx.getImageData(0, 0, canvas.width, canvas.height) to retrieve the data array from the canvases, then ctx.putImageData(idata, 0, 0) to switch between sprites, rather than hiding the canvases.
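Something along these lines (the sprites array of preloaded canvases and the displayCtx context are assumptions):

// Pull the pixels out of each preloaded sprite canvas once, up front.
var spriteData = sprites.map(function (spriteCanvas) {
    var sctx = spriteCanvas.getContext("2d");
    return sctx.getImageData(0, 0, spriteCanvas.width, spriteCanvas.height);
});

// Switching sprites then paints raw pixels and never triggers an image decode.
function showSprite(displayCtx, index) {
    displayCtx.putImageData(spriteData[index], 0, 0);
}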
So I'm trying to stream video in segments without using Media Source Extensions (because not all browsers support MSE). I'm trying to do this by loading two video elements and playing the next one at the right moment, but there is a very small delay when switching. I tried to keep checking the currentTime of the video and, a tiny fraction before the end, play the next video element, but this doesn't really work well (the audio overlaps, or there is a delay).
Mind you that the videos are preloaded and loaded from Blob storage, so loading shouldn't delay playback.
How can I make this play smoothly (or is there another solution that doesn't use Flash) without Media Source Extensions?
Your best bet is to use two video elements positioned one on top of the other. The first one plays the current part (or chunk) of the video. The second video element is loaded with the blob that contains the next chunk and stays paused; it should also be hidden (you can set display: none or z-index: -99999). Then, when the first video element ends (the ended event is dispatched), call the play() method of the second video element, show it, and hide the first one. Rinse and repeat.
This is what the LifemirrorPlayer does.
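A minimal sketch of the swap (the element ids and the nextChunkBlobUrl variable are assumptions):

var current = document.getElementById("videoA"); // playing the current chunk
var next = document.getElementById("videoB");    // standby element for the next chunk

next.src = nextChunkBlobUrl;   // e.g. a URL.createObjectURL() of the next chunk's blob
next.load();
next.style.display = "none";   // keep the standby element hidden

current.addEventListener("ended", function () {
    next.style.display = "";       // reveal the preloaded element
    next.play();
    current.style.display = "none";
    // From here the roles swap: load the following chunk into `current`
    // and wait for `next` to end. Rinse and repeat.
});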
If the chunks of the stream are perfectly encoded and cut, this technique works. Often they aren't. Common problems are:
Synchronizing the audio stream is probably the hardest part. After twenty or thirty chunks have played, the audio usually loses sync with the video. This is very annoying for the viewer and hard to detect and fix (I don't actually know of a solution).
The video element never fires ended. This is usually caused by an encoder that has put too much (or too little) content into a chunk. It can also be caused by improperly cut chunks, or by a buggy encoder or decoder.
Flickering when the two video elements are swapped. This depends on the browser, but most browsers handle it very well and the swap is quick and smooth.
I am having problems determining when a clip has fully downloaded in Flowplayer. In the project I am working on, it is important that there are no buffering pauses during playback, so I must be sure that the clip is fully loaded/downloaded.
The onLoad event fires when the player is loaded (not the clip), so that is no good.
Any idea if there is such an event, or how my application can know when the clip has fully downloaded?
Thank you
There's only onBufferFull, which fires when the clip buffer is filled completely.
You might want to try adding a second clip to your playlist after the main video and then attach an onNetStreamEvent or onMetaData handler to that clip, so you will know when the second clip receives network data and therefore begins loading (which implies the first clip has finished downloading).
http://flowplayer.org/documentation/events/clip.html
To make sure there are no buffering pauses, you might also consider using a higher buffer value, say something around the entire clip length. There's a clip property called bufferLength for that:
http://flowplayer.org/documentation/configuration/clips.html
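Roughly like this, based on the Flowplayer 3 clip configuration from the pages above (the container id, swf path, clip URL and buffer value are placeholders):

flowplayer("player", "flowplayer-3.2.18.swf", {
    clip: {
        url: "myclip.mp4",
        bufferLength: 300, // seconds to buffer; set it near the full clip length
        onBufferFull: function () {
            // the clip buffer is completely filled, so playback should not pause anymore
            console.log("buffer full");
        }
    }
});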