Before you say it can't be done, please take a look at my train of thought and entertain me.
I have read on Stack Overflow that it can't be done, and how to implement this using ffmpeg and other tools on the server side, which is great and simple enough to comprehend. I've even used an extension to Video.js I found on GitHub that makes this one step easier. But nonetheless, what if I don't have a copy of the <video src=...> and I really don't care to get one?
I do not want to use a server to do this. Okay, with that out of the way: I understand, thanks to a post from Paul Irish, that video playback is not a shared aspect of WebKit ports (the engine that powers a large share of browsers, minus Chrome Canary, which now uses Blink, a WebKit fork). This kind of explains why certain browsers only support certain video containers.
So for the sake of simplicity: I want to make this functionality available only in Chrome and only for MPEG-4 AVC video containers. Why can't this be done if somehow I can actually view each frame of the video while it is played back?
Additional note
So generating video thumbnails is possible by drawing frames to a canvas, but this will only be part of a final solution to my problem. I'm looking to do this each and every time a video is viewed, not store images on my server after a first playback is completed by a user. What I would like to eventually work up to is generating a thumbnail as the video is downloaded, which can be viewed while a user drags a scrollbar to fast-forward/rewind to a point in the video. So this will need to be done as frames of video become available, not once they have been rendered by the browser for the user to view.
One can actually feed a video into a canvas, as shown in this HTML5 Doctor article. Basically, the line that does the magic is:
canvasContext.drawImage(videoElement, 0, 0, width, height);
Then you can run a timer that periodically retrieves frames from the canvas. There are two options here:
get the raw pixel data (canvasContext.getImageData())
get a base64-encoded image (canvas.toDataURL())
As for saving, send the data to the server, reconstruct an image from it there, and save it to disk. I also suggest you size your canvas and video to the size you want your screenshots to be, since the video-to-canvas transfer automatically handles scaling.
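A minimal sketch of that loop, assuming a <video id="v"> and a <canvas id="c"> already on the page (the ids and the 25 fps rate are illustrative):

var video = document.getElementById('v');
var canvas = document.getElementById('c');
var ctx = canvas.getContext('2d');

function grabFrame() {
  if (video.paused || video.ended) return;
  // copy the current video frame onto the canvas
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  // option 1: raw pixel data (an RGBA Uint8ClampedArray)
  // note: this throws if the video is cross-origin (tainted canvas)
  var pixels = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
  // option 2: a base64 data URL you could POST to a server
  var dataUrl = canvas.toDataURL('image/png');
  setTimeout(grabFrame, 40); // ~25 frames per second
}

video.addEventListener('play', grabFrame);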
Of course, this is limited by the video formats supported by the browser, as well as by support for canvas and video themselves.
Generating thumbnails during first render? You'd run into problems with that, since:
You can't generate all the frames unless the whole video is actually rendered in the video element.
Suppose you have generated thumbnails during the first run and want to reuse them for later runs. Base64 data is very long, roughly a third larger than the binary image file, and a raw pixel data array is width x height x 4 bytes (a single 1280x720 frame is about 3.7 MB). The most viable storage candidate is localStorage, which is just 5-10 MB depending on the browser, so even a handful of frames exhausts it.
There is no way to cache the generated images in the browser cache (there could be a cache hack using data URLs that I don't know about).
I suggest you do it on the server instead. It's too much burden and hassle to do on the client side.
Related
I have a problem rendering images with large file sizes. When I upload a lot of large images and display them, the page becomes very laggy.
My question is:
Is the ideal approach for the backend to provide a URL to a very small image file?
Or can the frontend do it using the Canvas API?
Could you provide an explanation, please? Thank you.
If you have a bunch of thumbnails to display, the source images for those thumbnails should definitely not be the original, full-size images. If those images are large, it will take a long time for the client to download them, no matter how you render them on the client.
For an image that will be shown to the client, you should have two versions on the server:
Thumbnail version - low resolution, small file size, easy to download and render many at once
Full-size version - downloaded when the user wants to zoom in on one of them
It could be that the full-size version should not necessarily be the original image. For example, if the original image is 20MB (yes, they can exist), you wouldn't want to burden clients with that. So you'd want to perform resizes and optimizations on the server for both the thumbnail version and the "full" version - enough that there isn't much of a delay between when the client tries to zoom in and when the full-size image finishes loading.
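A minimal sketch of the client half of that pattern, assuming hypothetical URL conventions /thumbs/<id>.jpg and /full/<id>.jpg on your server:

document.querySelectorAll('img.thumb').forEach(function (thumb) {
  thumb.addEventListener('click', function () {
    var full = new Image();
    full.src = thumb.src.replace('/thumbs/', '/full/'); // start the download
    full.onload = function () {
      thumb.src = full.src; // swap in the full-size version once loaded
    };
  });
});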
My recommendation is that you convert the images to a more performant type like .webp or .jpeg and give the <img> element exact width and height attributes.
React can also lazy-load components with React.lazy(), and images themselves can be lazy-loaded natively with the loading="lazy" attribute, which can boost your web performance significantly.
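For example, a minimal sketch of one lazy-loaded thumbnail, assuming a hypothetical /thumbs/42.webp URL and a #gallery container:

var img = document.createElement('img');
img.src = '/thumbs/42.webp';
img.width = 320;      // exact dimensions prevent layout shift while loading
img.height = 180;
img.loading = 'lazy'; // native lazy loading: fetch is deferred until near the viewport
document.getElementById('gallery').appendChild(img);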
Handling large images is too much work for a frontend client. You should receive these images at a smaller size so you can show them as they are.
Imagine someone trying to use your application on an old computer, or even an old smartphone; you shouldn't rely on the client's processing power to handle all this heavy work.
I hope my answer helps you!
JPEGs are what you should be using; look into a preload() function as well.
Background:
I'm working on a video project with 50+ short videos (10 min, 720p) that I want to present online. My current architecture is to place 16 video tags in a 4x4 grid, randomly set their sources on load using JavaScript, and on click zoom a video to cover the full screen until it is clicked again.
The problem:
Each video in 720p WebM is around 80 MB. With 16 videos that is 1.3 GB in total, or 130 MB per minute, or about 2 MB per second, which is a ridiculous amount of data, I think; maybe I'm wrong. The reason each video is so big (80 MB) is to support the zoom-to-fullscreen feature.
My idea for a solution:
Encode each video in two resolutions: use the low resolution for the grid layout and the higher resolution for the click-to-zoom.
My question: How do I make this smooth? Can I preload the high-resolution video in the background on click, at the position of the low-resolution video, and make the shift with a CSS transform? Or is there a better way to do this?
Secondary question: How do I host this online? Can I put the videos on Vimeo, maybe? Right now I'm using WordPress.com hosting.
The normal way to achieve something like this is to encode the video in an adaptive bitrate format. The two primary formats for that are HLS and MPEG-DASH. Most online encoding platforms can provide those as outputs. Normally you would encode 5-6 different qualities (this helps with users on wifi, where bandwidth might constantly be changing), but you could easily encode just two different qualities.
Normally the players would be able to select the right quality automatically, but you can manage that yourself if you want.
If you are going to use HLS, you can use hls.js and its Quality Switch API. For MPEG-DASH, a good player to use would be Shaka Player and then set it like this:
player.configure({ enableAdaptation: false }); // Shaka Player 1.x config name
player.selectVideoTrack(trackId);
If you want to switch specifically on fullscreen, just listen for the fullscreen events on the players.
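For example, with hls.js a minimal sketch might look like this (the manifest URL, the element id, and the assumption that level 0 is the low rendition and level 1 the high one are all illustrative):

var video = document.getElementById('grid-video');
var hls = new Hls();
hls.loadSource('/videos/clip1.m3u8'); // hypothetical manifest URL
hls.attachMedia(video);
hls.currentLevel = 0; // lock playback to the low-resolution rendition

document.addEventListener('fullscreenchange', function () {
  // high-resolution rendition in fullscreen, low otherwise
  hls.currentLevel = document.fullscreenElement ? 1 : 0;
});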
I am building a small app that lets users add CSS3 filters, like grayscale, to a video and download the result. The video won't be longer than 6 seconds. So I first load the video into a canvas and then apply the filter the user chooses. Now I want the user to be able to download the filtered video. canvas.toDataURL() is only meant for images. Is there any high-level canvas API to achieve this?
Thank you
Not that I know of. I think this is something that should be done server-side. Either send the raw video to the server and tell it what filters were applied, so you can re-create the effect there, or use the solution proposed in "capturing html5 canvas output as video or swf or png sequence?" (hint: it's also server-side).
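If you do go the PNG-sequence route, the capture side could be a sketch like this (the /upload-frame endpoint is hypothetical; the server would stitch the uploaded frames back into a video, e.g. with ffmpeg):

function uploadFrame(canvas, frameIndex) {
  // serialize the current canvas contents and ship them to the server
  fetch('/upload-frame', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ index: frameIndex, dataUrl: canvas.toDataURL('image/png') })
  });
}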
Question
I want to write efficient code for playing/manipulating images in a web browser using JavaScript. I think using a video file may cut down on HTTP requests. How can I access a png/jpg/bytecode version of a single frame from a video?
Background
Currently, I have a sequence of ~1000 images, which vary ever so slightly, that need to be quickly accessible on my page. Loading the images via HTTP requests takes forever (obviously), and as my app grows, this number will likely grow from 1,000 to 5,000 to 10,000...
Ajax requests for individual images will not work, because I need each image to load immediately (and don't have time to wait for a new HTTP request).
My idea was to pre-process a video file on the server that shows the image progression - one image per frame - to speed up the download and the browser's performance. I feel like this video could download to the client quickly, judging by the speed of watching videos online. I'm getting stuck on how to get the picture content for a frame out of the video.
HTML5?
Note: I haven't looked into HTML5 yet, but I'd be willing to consider it if it might help.
You can draw video frames to an HTML5 canvas.
I created this fiddle by combining this and this.
The key point is getting the current video frame and drawing it onto the canvas element.
var delay = 20; // milliseconds between redraws (~50 fps)

// cvideo is the <video> element; cctx is the canvas's 2d drawing context
function draw(cvideo, cctx, canvas_width, canvas_height) {
  if (cvideo.paused || cvideo.ended) return false;
  cctx.drawImage(cvideo, 0, 0, canvas_width, canvas_height);
  // setTimeout passes the extra arguments back into draw on the next tick
  setTimeout(draw, delay, cvideo, cctx, canvas_width, canvas_height);
}
After getting it into the canvas you can do almost anything with the image.
Instead of using a video file you could use image sprites - basically combine multiple images into a single big image, of which you always show only the appropriate region (assuming all images have the same dimensions, this is easy). This would greatly reduce the number of HTTP requests and speed up loading in turn.
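A minimal sketch of that idea, assuming a hypothetical vertical sprite strip frames.png with every frame stacked top to bottom, each 320x240:

var FRAME_W = 320, FRAME_H = 240;
var viewer = document.getElementById('viewer'); // the "window" element
viewer.style.width = FRAME_W + 'px';
viewer.style.height = FRAME_H + 'px';
viewer.style.background = 'url(frames.png) no-repeat';

function showFrame(i) {
  // shift the strip so that frame i lines up with the viewer window
  viewer.style.backgroundPosition = '0px ' + (-i * FRAME_H) + 'px';
}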
Can any of you help me show pictures as fast as Facebook does?
Facebook is incredibly fast at displaying pictures; I think they are somehow preloaded.
Often when you view galleries on other sites, it is a pain in the a**, because it is so slow every time you change pictures.
I think you need JavaScript to do it!?
Depending on your implementation, you could do this with some Ajax and hidden DOM elements.
Suppose you have a gallery with a slideshow. You could insert a hidden DOM element containing the next picture of the slideshow on each load. This would cause the image to be loaded. If you then use JS to insert that same image tag later, the browser relies on its cache rather than fetching the image from the server, since it already has that photo.
This is kind of a broad question, but I think this approach would work. You would probably be better off not reinventing the wheel and seeing what image-prefetch libraries based on jQuery or whatever are available to you.
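A minimal sketch of that idea - simply creating an Image object is enough to warm the browser cache, no visible DOM insertion required (the URLs and the #slide element are illustrative):

var gallery = ['pic1.jpg', 'pic2.jpg', 'pic3.jpg']; // hypothetical image URLs
var current = 0;

function prefetchNext() {
  if (gallery[current + 1]) new Image().src = gallery[current + 1]; // fetched and cached
}

function showNext() {
  current++;
  document.getElementById('slide').src = gallery[current]; // served from cache
  prefetchNext();
}

prefetchNext(); // warm the cache for the first transition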
Facebook compresses images to extremes. Try it yourself: take an image you are having trouble with and upload it to Facebook, then check the size of the result, and you will know why. (Once I did a small test by uploading a 17,429-byte image, and it came back at 18,757 bytes - a full 7% increase over the original size, so very small images can even grow.)
At such compressed sizes, you can implement some sort of prefetching of the next image for display. On top of that, I think they have extremely good infrastructure.
Facebook uses BigPipe; there is an open implementation in the works called openpipe.
BigPipe pushes content to the browser as soon as the server finishes processing each part of the page, so the user perceives it as faster.
It basically loads "pagelets" as they become ready for the user. On the browser side the implementation is JavaScript-based, and you must push the info to the client with your preferred server-side language.
First of all, Facebook heavily compresses images. Many other websites don't. Facebook also has a faster network than most other websites.
With the small image sizes, the client can prefetch the next image.
"Preloaded" would mean loading when the page is loaded, which is what already happens with an <img> tag. No, it's simply because the file sizes are smaller.
If you want images to be viewed more quickly on your site, first make sure the images are decently compressed and aren't any bigger than they have to be. The number of times I have seen websites use an extremely large image scaled down to fit in an element five times smaller is just ridiculous.
You can check out these sites, which have many implementations of and links about how to preload/prefetch images (CSS, JavaScript, Ajax):
http://perishablepress.com/press/2009/12/28/3-ways-preload-images-css-javascript-ajax/
Since your question was tagged with 'jquery', here is one just for that:
http://engineeredweb.com/blog/09/12/preloading-images-jquery-and-javascript