I'm working on a project where several videos are added to the HTML dynamically by JavaScript when they're needed to play, sort of like a remotely-controlled presentation. The order the videos will need to play in is unknown at the time of page load, and each video is under four seconds. However, the videos are large and need to appear seamless when moving from one to the next, but since they are so big and, in some cases, loaded over the Internet, loading usually isn't seamless.
The way I'm adding videos right now is something like this:
$("body").append("<video class='onscreen' autoplay><source src='"+c.video+"'></source></video>");
Where c.video is a relative path to the video source.
Is there any way to preload videos directly from JavaScript beforehand, without adding them to the HTML? If not, what would be the best way to add them to the HTML without the video visibly playing, while still allowing it to be shown dynamically later?
I've seen several similar questions on StackOverflow (like this one), but they're all concerned with adding the video and only playing it later, once loaded. I'm sure it's possible to adapt the code presented in those, but I'm not quite sure how to go about doing it, especially as I don't understand some of the concepts (please feel free to expand on your answer a bit; I don't just want code snippets!).
If it affects things at all, this will, for the foreseeable future, only have to run on Firefox, and the videos will likely all be OGV (though there may be exceptions).
Related
My webpage uses a lot of (1000+) GIFs and displays different GIFs using 'if' and 'getElementById'. Would it run quicker if I dropped the if(map(x,y)!=0){... that changes the src and just used a GIF which was transparent?
I've Googled to heck, but it's a hard question to word; if the same or a similar question has been posted, please point me there.
Many Thanks
First, you mentioned getElementById, which is part of JavaScript, a client-side scripting language, so that by itself is not a big reason to worry.
You said you have 1000+ .gif images.
Use the concept of sprites in this case.
This will allow you to download all your images in one go.
It will also reduce some of the download burden on your server.
To work with the sprite technique, visit this link; I believe this is what you need.
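For the map(x,y) check from the question, a minimal sketch of the sprite idea might look like the following, assuming a hypothetical sprites.png laid out as a horizontal strip of equally sized frames (the file name, frame width and element id are all placeholders):

// Sketch only: shift the background of a div instead of swapping an <img> src,
// so the browser downloads a single sprites.png rather than 1000+ GIFs.
var FRAME_WIDTH = 32; // assumption: every frame in the strip is 32px wide

function showFrame(cell, frameIndex) {
    cell.style.backgroundImage = 'url(sprites.png)';
    // a negative offset slides the strip so the wanted frame shows through
    cell.style.backgroundPosition = (-frameIndex * FRAME_WIDTH) + 'px 0px';
}

// usage inside the existing if(map(x,y) != 0) branch, with a placeholder id:
showFrame(document.getElementById('cell-3-4'), map(3, 4));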
Create an object pool with your images and load them during the initialization step. Load them in batches, and extend the pool with new images if required.
While the images are loading, show a spinner. The first load will take a while, but after that the browser will serve them from its cache.
If you have an opportunity to combine several little images into one (a sprite), do it! It is faster to load one big image than a lot of little ones.
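A rough sketch of that kind of pool, with the URL list and the hideSpinner callback as placeholders:

// Rough sketch of an image pool: preload a batch up front, extend it later.
var imagePool = {
    images: {},
    load: function (urls, onDone) {
        var remaining = urls.length;
        urls.forEach(function (url) {
            var img = new Image();
            img.onload = img.onerror = function () {
                remaining--;
                if (remaining === 0 && onDone) onDone(); // e.g. hide the spinner here
            };
            img.src = url;
            imagePool.images[url] = img; // keep a reference so it stays cached
        });
    }
};

// initial batch while the spinner is showing; further batches can be added later
imagePool.load(['tiles/grass.gif', 'tiles/water.gif'], hideSpinner);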
I am trying to create a feature where a user can change (back and forth) between multiple videos while maintaining a single consistent audio track. Think of being able to watch a concert from multiple angles while listening to a single audio track. The trouble I am having with this feature is that there cannot be any lag between video changes, or the audio will no longer sync with the videos (especially true after multiple changes).
I have tried two methods, both using HTML5 only (I would prefer not to use Flash, although I will eventually have a fallback), and neither has worked seamlessly, although depending on the browser and hardware it can come very close.
Basic Methods:
Method 1: Preload all the videos and change the video src path on each click using JavaScript.
Method 2: Again preload the videos, but use multiple <video> tags and switch between them with JavaScript on each click.
Is there any way to get either of these two methods to work seamlessly, without a gap? Should I be using a sleight-of-hand trick, like playing both videos concurrently for a second before revealing the second and stopping the first? Can this just not be done with HTML5 players? Can it be done with Flash?
I have seen this type of question a couple of times with both video and audio with no clear solution, but they were a couple of months old and I was hoping there is now a solution. Thanks for the help.
Worth adding that it is possible with the MediaSource API proposed by Google. This API allows you to feed arbitrary binary data to a single video element, so if you have your video split into chunks you can fetch those chunks via XHR and append them to your video element, and they'll be played without gaps.
Currently it's implemented only in Chrome, and you need to enable "Enable Media Source API on <video> elements" in chrome://flags to use it. Also, only the WebM container is currently supported.
Here is an article on HTML5Rocks that demonstrates how the API works: "Stream" video using the MediaSource API.
Another useful article that talks about chunked playlist: Segmenting WebM Video and the MediaSource API.
I hope this implementation gets adopted and gets wider media container support.
UPDATE JUN 2014: Browser support is slowly getting better (thanks @Hugh Guiney for the tip):
Chrome Stable
FF 25+ has a flag media.mediasource.enabled [MDN]
IE 11+ on Windows 8.1 [MSDN]
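A stripped-down sketch of how the API is wired up, assuming the video has already been split into WebM chunks (the chunk URLs and the codec string are placeholders, and older builds behind the flag may expose the constructor as WebKitMediaSource):

var video = document.querySelector('video');
var mediaSource = new MediaSource(); // may be WebKitMediaSource in older builds
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var buffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8, vorbis"');
    var chunks = ['clip-000.webm', 'clip-001.webm']; // placeholder chunk URLs
    var i = 0;

    function appendNext() {
        if (i >= chunks.length) { mediaSource.endOfStream(); return; }
        var xhr = new XMLHttpRequest();
        xhr.open('GET', chunks[i++], true);
        xhr.responseType = 'arraybuffer';
        xhr.onload = function () {
            buffer.appendBuffer(new Uint8Array(xhr.response)); // queue the bytes
        };
        xhr.send();
    }

    buffer.addEventListener('updateend', appendNext); // fetch the next chunk once appended
    appendNext();
});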
Did you find a better way to do that?
I implemented a double-buffered playback using two video tags.
One is used for the current playback, and the second for preloading the next video.
When the video ends I "swap" the tags:
function visualSwap() {
    // the preloaded element takes over the audio...
    video.volume = video2.volume;
    video2.volume = 0;
    // ...and the screen: the old element is collapsed to zero width
    video.style.width = '300px';
    video2.style.width = '0px';
}
It has some non-deterministic behavior, so I am not 100% satisfied, but it's worth trying...
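For what it's worth, a sketch of what the preload half of that double-buffering might look like, assuming the two elements have the ids video and video2 and that the next clip's URL is already known:

var front = document.getElementById('video');   // currently visible
var back = document.getElementById('video2');   // hidden, used for preloading

function preloadNext(nextSrc) {
    back.src = nextSrc;
    back.preload = 'auto';
    back.load(); // start buffering while the front video is still playing
}

function onEnded() {
    back.style.width = '300px'; // reveal the preloaded element
    front.style.width = '0px';  // hide the one that just finished
    back.play();
    var t = front; front = back; back = t; // swap roles for the next cycle
}

front.addEventListener('ended', onEnded);
back.addEventListener('ended', onEnded); // the two elements take turns ending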
Changing the src attribute is fast, but not gapless. I'm trying to find the best method for a media player I'm creating, and preloading the next track and switching the src on "ended" leaves a gap of about 10-20 ms, which may sound tiny, but it's enough to be noticeable, especially with music.
I've just tested using a second audio element which fires off as soon as the first audio element ends via the event 'ended' and that incurred the same tiny gap.
It looks like (without using elaborate hacks) there isn't a simple(ish) way of achieving gapless playback, at least right now.
It is possible. You can check this out: http://evelyn-interactive.searchingforabby.com/ It's all done in HTML5. They preload all the videos at the beginning and start them at the same time. I didn't have time yet to check exactly how they're doing it, but maybe it helps if you look at their scripts via Firebug.
After many attempts, I did end up using something similar to Method 2. I found this site http://switchcam.com and basically copied their approach. I pre-buffered as each video's start time approached and then auto-played as its starting point hit. I had all the current videos playing simultaneously (in a little div, as a UI bonus) and users could toggle between them, switching the "main screen view". Since all the videos were playing at once, you could choose the audio, and the gap didn't end up being an issue.
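The switching itself can then be as simple as changing which element is audible and shown full size; a rough sketch (the .angle class, the class names and the sync mechanism are all made up here):

// Rough sketch: all <video class="angle"> elements are assumed to already be
// playing in sync; switching only changes which one is audible and enlarged.
function switchTo(chosen) {
    var angles = document.querySelectorAll('video.angle');
    for (var i = 0; i < angles.length; i++) {
        var v = angles[i];
        v.muted = (v !== chosen); // one audible track at a time
        v.className = (v === chosen) ? 'angle main' : 'angle thumb'; // CSS handles the sizes
    }
}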
Unfortunately, I never ended up solving my problem exactly and I am not sure my solution has the best performance, but it works okay with a fast connection.
Thanks for everyone's help!
I'm working on an HTML5 game. It is already online, but it's currently small and everything is okay.
Thing is, as it grows, it's going to be loading many, many images, music, sound effects and more. After 15 minutes of playing the game, at least 100 different resources might have been loaded already. Since it's an HTML5 App, it never refreshes the page during the game, so they all stack in the background.
I've noticed that every resource I load remains there once I remove the <img>, the <link> to the CSS and so on (on WebKit at least, using the Web Inspector). I'm guessing it's still in memory, just not being used, right?
This would end up consuming a lot of RAM eventually and lead to degraded performance, especially on iOS and Android mobiles (which I already notice slightly on the current version), whose resources are more limited than desktop computers'.
My question is: Is it possible to fully unload a Resource, freeing space in the RAM, through JavaScript? Without having to refresh the whole page to "clean it".
Worst scenario: would using frames help, by deleting a frame, to free that frame's resources?
Thank you!
Your description implies you have fully removed all references to the resources. The behavior you are seeing, then, is simply that the garbage collector has not yet been invoked to reclaim the space, which is common in JavaScript implementations until it is "necessary". Setting the reference to null or calling delete will usually do no better.
As a common case, you can typically call CollectGarbage() (an IE/JScript-specific function) during scene loads/unloads to force the collection process. This is typically the best solution when the data is loaded per game "stage", as that is a moment that is not time-critical. You usually do not want the collector to run during gameplay unless the game is not very real-time.
Frames are usually a difficult solution if you want to keep certain resources around for common game controls. You need to consider whether you are refreshing entire resources or just certain resources.
All you can do is rely on JavaScript's built-in garbage collection mechanism.
This kicks in whenever there is no longer any reference to your image.
So, assuming you hold a reference to each image, if you use:
img.destroy() (where your framework provides such a method)
or
img.parentNode.removeChild(img)
and also drop your own reference, the image becomes eligible for collection.
Worth checking out: http://www.ibm.com/developerworks/web/library/wa-memleak/
Also: Need help using this function to destroy an item on canvas using javascript
EDIT
Here is some code that allows you to load an image into a var.
<script>
var heavyImage = new Image();
heavyImage.src = "heavyimagefile.jpg";
// ......
heavyImage = null; // drops the reference so the memory can be reclaimed
</script>
This is better than using jQuery's .load() because it gives you more control over image references, and the images can be removed from memory once the reference is gone (set to null).
Taken from: http://www.techrepublic.com/article/preloading-and-the-javascript-image-object/5214317
Hope it helps!
There are 2 better ways to load images besides a normal <img> tag, which Google brilliantly discusses here:
http://www.youtube.com/watch?v=7pCh62wr6m0&list=UU_x5XG1OV2P6uZZ5FSM9Ttw&index=74
1. Loading the images through an HTML5 <canvas>, which is much faster. I would really watch that video and implement these methods for more speed. I would imagine garbage collection with canvas works better because it breaks away from the DOM.
2. Embedded data URLs, where the src attribute of an image tag is the actual binary data of the image (yeah, it's a giant string). It starts like this: src="data:image/jpeg;base64,/9j/MASSIVE-STRING ... " After using this, you would of course want to remove the node as discussed in the other answers. (I don't know how to generate this base64 string; try Google or the video.)
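One way to produce that base64 string at runtime is to draw an already-loaded image onto a canvas and read it back; a sketch (note that images from another domain will taint the canvas and make toDataURL throw):

// Sketch: turn a loaded image into a data: URL via a throwaway canvas.
function toDataUrl(img) {
    var canvas = document.createElement('canvas');
    canvas.width = img.naturalWidth;
    canvas.height = img.naturalHeight;
    canvas.getContext('2d').drawImage(img, 0, 0);
    return canvas.toDataURL('image/jpeg'); // "data:image/jpeg;base64,/9j/..."
}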
You said: "Worst scenario: Would using frames help, by deleting a frame, to free those frames' resources?"
It is fine to use frames. Yes, deleting a frame can free up that frame's resources.
All right, so I've done my tests by loading 3 different HTML files into an <article> tag. Each HTML file had many huge images, roughly 15 per "page".
So I used the jQuery .load() function to insert each one into the tag. I also had an extra HTML file that only had an <h1>, to see what happened when a page with no images replaced the previous page.
Well, it turns out the RAM usage grows as you start scrolling, and shoots up when going through a particularly big image (big as in dimensions as well as file size). But once you leave that behind and lighter images come onto the screen, the RAM consumption actually goes down. And whenever I replaced the content of the page using JS, the RAM consumption dropped significantly when it had gotten too high. Virtual memory remained high throughout and rarely went down.
So I guess the browser is quite smart about handling resources. It does not seem to unload them if you just leave the page sitting there for a while, but as soon as you start loading other pages or scrolling, it starts loading and freeing them as needed.
I guess I don't have anything to worry about after all...
Thanks everyone! =)
I am pretty sure that I know the answer to this already, but I am interested to see if anyone has other ideas. We are working on a major redesign of a website that will include mega menus. One of my top priorities in a redesign is to reduce the page download time as much as possible. All my images, CSS and JavaScript are cached, and that's good. However, the part that I am trying to work through is the HTML for the menu, and whether there is a way to cache that locally, within reason.
As a side note, I like to do things with as pure CSS as possible (for SEO), and so that would mean outputting the mega menus directly into the HTML page. But at the same time, I know that if it takes a number of seconds for the page to download the HTML content at the top, we are probably going to be running some customers off there too. Maybe the best approach then would be to have JavaScript output the menus, but then you run into the few customers who don't have JavaScript enabled.
Right now the pages are about 30K for the menus, and I anticipate that doubling and maybe more when we do the redesign.
Do you have some thoughts for this issue? What would you see as the best way to tackle this?
Thank you!!
JMax
Honestly, 30K for an element is nothing in this day and age, with high-speed connections common and browsers effectively caching as necessary. People don't leave because of a second or two. It's when you have Flash movies that preload or crazy auto-starting videos that people get annoyed in a hurry.
I've got a similar application with a menu that's likely double that right now...let me add, it's not by choice, it's something I inherited and have to maintain for the time being. The menu is output simply in an unordered list and then I use Superfish and CSS to do the styling as necessary. There's an initial hit, but after that, caching kicks in and we're good to go. Even as crazy as it is, the load isn't prohibitive. Navigating it, however, is a mess. I'd strongly recommend against confusing the heck out of your user with so many choices, especially on a mega menu that can be a UI hurdle for disabled and older users. When you boil it down, the whole basis behind the "Web 2.0 movement" (I hate that term) is to minimize or cloak complexity.
If you're REALLY concerned about performance, start with what you're loading. Limit your JavaScript by combining files, especially those small jQuery plugin files that tend to stack up; HTTP requests can severely impact a site, since they monopolize the initial load. Similarly, combine small CSS files and optimize the rest via an online tool. To reduce image loads, create sprites for your graphics so you're loading one file instead of many. Here's a tutorial on sprites, and a simple Google search will give you dozens of sites that will build the sprite and CSS automatically. Load anything you can from a CDN, such as jQuery, Prototype, etc. (hopefully only one framework per site, because two or more is unnecessary).
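For the CDN point, the usual pattern is to request the library from the CDN and fall back to a local copy if it fails; a sketch (the version number and local path are placeholders):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>
    // if the CDN copy failed to load, fall back to a locally hosted file
    window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>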
If you're still out of hand, look at your graphics one more time. Could you take advantage of pure CSS or image repeating via CSS to reduce loads further? Have you optimized all the graphics? Could you tweak the design to take advantage of those tricks?
After all that, if you simply can't change the menu to be more friendly, start investigating options. However, I suspect you'll find better gains in the first couple of steps than you would from taking extreme measures on the menu.
You could either set HTTP caching headers for a JavaScript file that generates the menu, or use Ajax to insert a pre-generated HTML menu from another file (again with a long cache expiry set).
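The Ajax variant can be as small as this, assuming the menu markup lives in a separate menu.html served with far-future cache headers (the selector and file name are placeholders):

// sketch: pull a pre-generated, cacheable menu fragment into the page
$(function () {
    $('#main-nav').load('/menu.html');
});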
Both those solutions require JavaScript though. I can't think of another way to remove the menus from the HTTP traffic apart from an IFRAME (yuck).
30k is massive for plain HTML though - do you REALLY need such a huge navigation structure?
For example, I want the page to play an audio file while at the same time have some bullets slide into view at just the right moment that said bullet is talked about in the audio file. A similar effect would also be used for closed captioning. When I say reliable I mean specifically that the timing will be consistent across many common platforms (browser/OS/CPU/etc) as well as consistent in different sessions on the same platform (they hit refresh, it works again just as it did before, etc).
NOTE: It's OK if the answer is 'NO', but please include at least a little quip about why that is.
Check out this animation, which synchronizes a 3D SVG effect to an audio file.
The technique is explained in a blog post at http://mrdoob.com/blog/page/3. Look for the one entitled "svg tag+audio tag = 3D waveform". The key is to create a table of volume values corresponding to the audio file.
You'll obviously have some work to do in studying this example and the Javascript it uses to adapt it to your scenario. And it will probably only work in browsers that support HTML5.
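For the bullets-in-sync-with-audio case specifically, a common pattern is a cue table keyed to audio.currentTime, checked on the timeupdate event. A bare-bones sketch (the element ids and cue times are made up); note that timeupdate only fires a few times per second, so this is accurate to a couple of hundred milliseconds at best, which is part of why frame-exact sync is hard to guarantee:

// Sketch: reveal bullets once the audio passes their cue times.
// The bullets are assumed to start with visibility:hidden in the CSS.
var audio = document.getElementById('narration'); // placeholder id
var cues = [{ t: 2.5, id: 'bullet1' }, { t: 6.0, id: 'bullet2' }]; // made-up cues

audio.addEventListener('timeupdate', function () {
    cues.forEach(function (cue) {
        if (audio.currentTime >= cue.t) {
            document.getElementById(cue.id).style.visibility = 'visible';
        }
    });
});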
Given the current situation and HTML5 support, I would solve this using Flash.