Preload multiple (798) audio files in Chrome - javascript

I'm making a game and I want to load 798 sound files, but there is a problem only in Chrome; Firefox is fine. Sample code: https://jsfiddle.net/76zb42ag/, see the console (press F12).
Sometimes the script loads only 100, 500, or 700 files, and sometimes it's fine. If I reduce the number of files to e.g. 300, it always works. How can I solve this problem? I need a callback, or any other ideas? The game will be offline (node-webkit).
JavaScript code:
var total = 0;
// sample file to download: http://www.sample-videos.com/audio/mp3/crowd-cheering.mp3
// sounds.length = 798 files
var sounds = [
    (...limit character, see https://jsfiddle.net/76zb42ag/...)
];

for (var i in sounds) {
    load(sounds[i]);
}

function load(file) {
    var snd = new Audio();
    snd.addEventListener('canplaythrough', loadedAudio, false);
    snd.src = file;
}

function loadedAudio() {
    total++;
    console.log(total);
    if (total == sounds.length) {
        console.log("COMPLETE");
    }
}

This isn't really a code problem. It's a general architecture problem.
Depending not only on the number but also on the size of the samples, it's unlikely you can get them all loaded at once. Even if you can, the page will run very poorly because of the high memory use and will likely crash the tab after a while.
Since it's offline, you could probably get away with not pre-loading them at all, because reading from disk is going to be nearly instantaneous.
If that turns out not to be suitable (say you need five sounds at once and loading them on demand is too slow), you'll need to architect your game so that you can determine which sounds a given game state needs, load just those, and drop references to the ones you no longer need so they can be garbage collected.
This is exactly what all games do when they show you a loading screen, and for the same reasons.
If you want to avoid "loading screens", you can get clever by working out what is coming up and loading it just ahead of time.
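For example, a minimal sketch of per-state loading might look like this (stateSounds, loadState and the file names are hypothetical, just to illustrate loading only what the current state needs and dropping references to the rest):
// Hypothetical map from game state to the sounds it actually needs.
var stateSounds = {
    menu:   ["sounds/click.mp3", "sounds/hover.mp3"],
    level1: ["sounds/jump.mp3", "sounds/coin.mp3", "sounds/music1.mp3"]
};

var loaded = {}; // Audio objects currently held, keyed by file name

function loadState(state, onReady) {
    var files = stateSounds[state] || [];
    var remaining = files.length;
    if (!remaining) return onReady();

    // Drop references to sounds other states used so they can be GC'd.
    Object.keys(loaded).forEach(function(file) {
        if (files.indexOf(file) === -1) delete loaded[file];
    });

    files.forEach(function(file) {
        if (loaded[file]) {
            if (--remaining === 0) onReady();
            return;
        }
        var snd = new Audio();
        snd.addEventListener('canplaythrough', function handler() {
            snd.removeEventListener('canplaythrough', handler, false);
            if (--remaining === 0) onReady(); // everything for this state is ready
        }, false);
        snd.src = file;
        loaded[file] = snd;
    });
}

// Usage: show a short loading screen, then start the level.
loadState('level1', function() { /* startLevel1(); */ });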

Related

chrome.runtime.reload blocking the extension

So, I'm developing a webpack plugin for hot-reloading Chrome extensions.
The biggest problem is that if I call chrome.runtime.reload() a certain number of times, Chrome will block the extension:
This extension reloaded itself too frequently.
In an old discussion it was said that if you call reload more than 5 times in a 10-second time frame, your extension will be blocked.
The thing is, I've throttled the reloading a lot (max one call every 5 seconds or so) and this is still happening. I've searched the docs a lot but didn't find anything related to this, so I'm kind of in the dark.
So is there really a threshold for this, or can you only call the runtime reload a fixed number of times before it blocks the extension?
UPDATE:
To deal with this problem, I've requested a new feature from the Chromium team: let us disable the "Fast Reload" blocking for unpacked extensions. If anyone has the same problem, please star that feature request :)
When the threshold has been reached (i.e. reloaded 5 times in quick succession), you have to wait at least 10 seconds before the counter resets and the extension can safely be reloaded.
Source (trimmed code to emphasize the relevant logic):
std::pair<base::TimeTicks, int>& reload_info =
    last_reload_time_[extension_id];
base::TimeTicks now = base::TimeTicks::Now();
if (reload_info.first.is_null() ||
    (now - reload_info.first).InMilliseconds() > kFastReloadTime) {
  reload_info.second = 0;
} else {
  reload_info.second++;
}
// ....
reload_info.first = now;

ExtensionService* service =
    ExtensionSystem::Get(browser_context_)->extension_service();
if (reload_info.second >= kFastReloadCount) {
  // ....
  base::ThreadTaskRunnerHandle::Get()->PostTask(
      FROM_HERE, base::BindOnce(&ExtensionService::TerminateExtension,
                                service->AsWeakPtr(), extension_id));
  extensions::WarningSet warnings;
  warnings.insert(
      extensions::Warning::CreateReloadTooFrequentWarning(
          extension_id));
With kFastReloadTime and kFastReloadCount defined here:
// If an extension reloads itself within this many milliseconds of reloading
// itself, the reload is considered suspiciously fast.
const int kFastReloadTime = 10000;

// After this many suspiciously fast consecutive reloads, an extension will get
// disabled.
const int kFastReloadCount = 5;
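Given those constants, one workaround on the extension side (just a sketch, not an official API or guarantee; safeReload, the "storage" permission and the 500 ms margin are my assumptions) is to persist the time of the previous reload and make sure every call to chrome.runtime.reload() lands outside the 10-second window, so the counter resets each time instead of incrementing:
// Requires the "storage" permission. In-memory state is lost when the
// extension reloads itself, so the last reload time is kept in
// chrome.storage.local.
var FAST_RELOAD_TIME = 10000; // mirrors Chromium's kFastReloadTime

function safeReload() {
    chrome.storage.local.get('lastReload', function(items) {
        var elapsed = Date.now() - (items.lastReload || 0);
        // Wait until we are safely past the 10-second window (500 ms margin).
        var wait = Math.max(0, FAST_RELOAD_TIME + 500 - elapsed);

        setTimeout(function() {
            chrome.storage.local.set({ lastReload: Date.now() }, function() {
                chrome.runtime.reload();
            });
        }, wait);
    });
}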

JavaScript: Like-Counter with Memory

I am looking to create a Like Counter with persistent memory!
Right now, my project is stored on a USB drive and I'm not thinking of uploading my semi-finished site to the Internet just yet. I'm carrying it around, plugging it in and working.
A feature of the site is a Heart Counter and a Like Counter, each with its respective icon.
I have a little sideline JavaScript file with a dozen functions to handle the click events and such, including the number counts of the counters.
But because the values of the counters are only kept in temporary memory, if you reload the page the counter resets to its default, zero. A huge headache...
Reading from .txt
I thought of using the experimental FileReader object to handle the problem, but I soon found that it needs a user-supplied file to operate (from my examination).
Here's my attempt:
if (heartCount || likeCount >= 1) {
    var reader = new FileReader();
    var readerResults = reader.readAsText(heartsAndLikes.txt);
    //return readerResults
    alert(readerResults);
}
When loaded, the page runs through standard operations, except for the above.
This, in my opinion, would have been the ideal solution...
Reading from Cookies
Cookies don't seem like an option, as they reside on a per-computer basis.
They are stored on the computer's SSD, not with the JavaScript file... sad...
HTML5 Web Storage
Using the new Web Storage would probably be a big help. But again, it is per-computer, no matter how beautiful the system is...
localStorage.heartCount = 0 // Originally...

function heartButtonClicked() {
    if (localStorage.heartCount) {
        localStorage.heartCount = Number(localStorage.heartCount) + 1
    }
    document.getElementById('heartCountDisplay').innerHTML = localStorage.heartCount
} // Function is tied to the heartCountButton directly via the 'onclick' method
However, I am questioning whether web storage can be carried around on a USB drive...
Summarised ideas
Currently, I am leaning towards reading and editing files, as that fits my situation best. But...
Which would you use? Would you introduce a different method altogether?
Please, tell me about it! :)
if (typeof(Storage) !== "undefined") { // make sure local storage is available
    if (!localStorage.heartCount) { // if heartCount is not set then set it to zero
        localStorage.heartCount = 0;
    }
} else {
    alert('Local storage is not available');
}

function heartButtonClicked() {
    if (localStorage.heartCount) { // if heartCount exists then increment it by one
        localStorage.heartCount++;
    }
    // display the result
    document.getElementById('heartCountDisplay').innerHTML = localStorage.heartCount
}
This will only work on a per-computer basis and will not persist on your thumb drive. The only way I can think of to persist the data on your drive is to manually download a JSON or text file.
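If you do go that route, a rough sketch of the download side could look like this (exportCounts and the file name heartsAndLikes.json are made up for illustration; you would copy the downloaded file onto the USB drive and read it back, e.g. with a file input and FileReader, on the next machine):
function exportCounts() {
    var data = JSON.stringify({
        heartCount: localStorage.heartCount || 0,
        likeCount: localStorage.likeCount || 0
    });
    // Build a JSON file in memory and trigger a normal browser download.
    var blob = new Blob([data], { type: 'application/json' });
    var link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = 'heartsAndLikes.json';
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
    URL.revokeObjectURL(link.href);
}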

Get frame numbers in HTML5 Video

I am trying to capture each frame number of the video, but it looks like there is no way of achieving it, so I started my own clock to match the frame numbers of the video. But they never match, and the difference keeps increasing as the video progresses.
Please have a look at my bin. http://jsbin.com/dopuvo/4/edit
I have added the frame number to each frame of the video in Adobe After Effects, so I have more accurate information about the difference. The video is running at 29.97 fps and the requestAnimationFrame counter is also set to increase at the same rate, but I am not sure where this difference is coming from.
Sometimes they match and sometimes they don't. I also tried doing it offline, but I get the same results. Any help?
I found something on GitHub for this: https://github.com/allensarkisyan/VideoFrame
I have implemented it in this fiddle: https://jsfiddle.net/k0y8tp2v/
var currentFrame = $('#currentFrame');
var video = VideoFrame({
    id: 'video',
    frameRate: 25,
    callback: function(frame) {
        currentFrame.html(frame);
    }
});

$('#play-pause').click(function() {
    if (video.video.paused) {
        video.video.play();
        video.listen('frame');
        $(this).html('Pause');
    } else {
        video.video.pause();
        video.stopListen();
        $(this).html('Play');
    }
});
EDIT: updated fiddle to new video so it works again.
EDIT: As pointed out, the video is 25fps, so I updated it, and while I was there removed reliance on jQuery.
Non-jQuery version:
https://jsfiddle.net/k0y8tp2v/1/
var currentFrame = document.getElementById('currentFrame');
var video = VideoFrame({
    id: 'video',
    frameRate: 25,
    callback: function(frame) {
        currentFrame.innerHTML = frame;
    }
});

document.getElementById('play-pause').addEventListener('click', function(e) {
    if (video.video.paused) {
        video.video.play();
        video.listen('frame');
        e.target.innerHTML = 'Pause';
    } else {
        video.video.pause();
        video.stopListen();
        e.target.innerHTML = 'Play';
    }
});
The problem is that setTimeout is not really predictable, so you can't be sure that exactly one new frame has been displayed every time your function runs. You need to check the currentTime of the video every time you update your frame display and multiply that by the frame rate.
Here's a working example: http://jsbin.com/xarekice/1/edit It's off by one frame, but it looks like you may have two frames at the beginning marked "000000".
A few things about the video element that you may want to be aware of:
As you seem to have discovered, there's no reliable way to determine the frame rate, so you have to discover it elsewhere and hard-code it. There are some interesting things going on with video metrics, but they're non-standard, not widely supported and, in my experience, completely ineffective at determining the actual frame rate.
The currentTime is not always exactly representative of what's on the screen. Sometimes it's ahead a little bit, especially when seeking (which is why in my JSBin, I don't update the frame while you're seeking).
I believe currentTime updates on a separate thread from the actual video draw, so it kind of works like its own clock that just keeps going. It's where the video wants to be, not necessarily where it is. So you can get close, but you need to round the results of the frame calculation, and once in a while, you may be off by one frame.
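Putting that together, a minimal sketch of the currentTime-based approach (the frame rate still has to be known and hard-coded, and the ids 'video' and 'currentFrame' are assumed to match the snippets above) would be:
var frameRate = 29.97; // must be known in advance; there is no reliable API for it
var video = document.getElementById('video');
var display = document.getElementById('currentFrame');

function updateFrameDisplay() {
    // Round, because currentTime tends to run slightly ahead of the drawn frame.
    display.innerHTML = Math.round(video.currentTime * frameRate);
    if (!video.paused && !video.ended) {
        requestAnimationFrame(updateFrameDisplay);
    }
}

video.addEventListener('play', function() {
    requestAnimationFrame(updateFrameDisplay);
});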
Starting in M83, Chromium has a requestVideoFrameCallback() API, which might solve your issue.
You can use the mediaTime to get a consistent timestamp, as outlined in this Github issue. From there, you could try something like this:
var frameCounter = (time, metadata) => {
    // frameRate has to be known ahead of time (e.g. hard-coded), as above.
    let count = metadata.mediaTime * frameRate;
    console.log("Got frame: " + Math.round(count));
    // Capture code here.
    video.requestVideoFrameCallback(frameCounter);
};

video.requestVideoFrameCallback(frameCounter);
This will only fire on new frames, but you may occasionally miss one (which you can detect from a discontinuity in the metadata.presentedFrames count). You might also be slightly late in capturing the frame (usually 16ms, or one call to window.requestAnimationFrame() later than when the video frame is available).
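To spot those missed frames, you can compare metadata.presentedFrames between callbacks (a small variant of the callback above; lastPresented is just a local variable I've added):
var lastPresented = 0;

var frameCounter = (time, metadata) => {
    // A jump of more than 1 in presentedFrames means at least one frame
    // was composited without our callback running for it.
    if (lastPresented && metadata.presentedFrames - lastPresented > 1) {
        console.log("Missed " + (metadata.presentedFrames - lastPresented - 1) + " frame(s)");
    }
    lastPresented = metadata.presentedFrames;

    video.requestVideoFrameCallback(frameCounter);
};

video.requestVideoFrameCallback(frameCounter);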
If you're interested in a high-level overview of the API, here's a blog post, or you can take a look at the API's official GitHub.

Making DiveIntoPython3 work in IE8 (fixing a JavaScript performance issue)

I am trying to fix the performance problem with Dive Into Python 3 on IE8. Visit this page in IE8 and, after a few moments, you will see the following popup:
(Screenshot of the IE8 slow-script popup: http://dl.getdropbox.com/u/87045/permalinks/dip3-ie8-perf.png)
I traced the culprit down to this line in j/dip3.js:
... find("tr:nth-child(" + (i+1) + ") td:nth-child(2)");
If I disable it (and return from the function immediately), the "Stop executing this script?" dialog does not appear, as the page now loads fairly fast.
I am no JavaScript/jQuery expert, so I ask you, fellow developers: why is this query making IE slow? Is there a fix for it?
Edit: you can download the entire webpage (980K) for local viewing/editing.
This seems to need a bit of rewriting.
nth-child is a slow operation. You should implement the current functionality by generating classes or ids shared by the TDs in the table and the elements from the refs collection (dip3.js, line 183), and then:
refs.each(function(i) {
    var a = $(this);
    var li = a.parents("pre").next("table").find("td." + a.attr('class'));
    li.add(a).hover(function() { a.css(hip); li.css(hip); },
                    function() { a.css(unhip); li.css(unhip); });
});
This popup message is misleading - it doesn't actually mean that IE is running slowly, but that the number of executed script statements has exceeded a certain threshold. Even if the script executes very quickly, you'll still see this message if you go over the limit. The only way to get rid of it is to reduce the number of statements executed or edit the registry.
http://support.microsoft.com/kb/175500
I find Microsoft's implementation of this very annoying. It makes assumptions about the speed of your computer.

The definitive best way to preload images using JavaScript/jQuery?

I'm fully aware that this question has been asked and answered everywhere, both on SO and off. However, every time there seems to be a different answer, e.g. this, this and that.
I don't care whether it's using jQuery or not - what's important is that it works, and is cross-browser.
So, what is the best way to preload images?
Unfortunately, that depends on your purpose.
If you plan to use the images for purposes of style, your best bet is to use sprites.
http://www.alistapart.com/articles/sprites2
However, if you plan to use the images in <img> tags, then you'll want to pre-load them with
function preload(sources) {
    var images = [];
    for (var i = 0, length = sources.length; i < length; ++i) {
        images[i] = new Image();
        images[i].src = sources[i];
    }
}
(modified source taken from What is the best way to preload multiple images in JavaScript?)
Using new Image() does not involve the expense of DOM methods, but a new request for the specified image will be added to the queue. As the image is, at this point, not actually added to the page, there is no re-rendering involved. I would recommend, however, adding this to the end of your page (as all of your scripts should be, when possible) to prevent it from holding up more critical elements.
Edit: Edited to reflect comment quite correctly pointing out that separate Image objects are required to work properly. Thanks, and my bad for not checking it more closely.
Edit2: edited to make the reusability more obvious
Edit 3 (3 years later):
Due to changes in how browsers handle non-visible images (display:none or, as in this answer, never appended to the document), a new approach to pre-loading is preferred.
You can use an Ajax request to force early retrieval of images. Using jQuery, for example:
jQuery.get(source);
Or in the context of our previous example, you could do:
function preload(sources) {
    jQuery.each(sources, function(i, source) { jQuery.get(source); });
}
Note that this doesn't apply to the case of sprites which are fine as-is. This is just for things like photo galleries or sliders/carousels with images where the images aren't loading because they are not visible initially.
Also note that this method does not work for IE (ajax is normally not used to retrieve image data).
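If you need to cover IE as well, one option (just a sketch combining the two techniques from this answer, not a canonical recipe; the user-agent check is a rough assumption) is to fall back to the Image-object approach where the Ajax trick doesn't apply:
function preload(sources) {
    // Rough IE sniff, purely for illustration - adjust to your own detection needs.
    var isIE = /MSIE|Trident/.test(navigator.userAgent);

    jQuery.each(sources, function(i, source) {
        if (isIE) {
            new Image().src = source;  // classic Image-object preload
        } else {
            jQuery.get(source);        // force early retrieval via Ajax
        }
    });
}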
Spriting
As others have mentioned, spriting works quite well for a variety of reasons; however, it's not as good as it's made out to be.
On the upside, you end up making only one HTTP request for your images. YMMV though.
On the down side you are loading everything in one HTTP request. Since most current browsers are limited to 2 concurrent connections the image request can block other requests. Hence YMMV and something like your menu background might not render for a bit.
Multiple images share the same color palette so there is some saving but this is not always the case and even so it's negligible.
Compression is improved because there is more shared data between images.
Dealing with irregular shapes is tricky, though, and combining new images into the existing sprite is another annoyance.
Low jack approach using <img> tags
If you are looking for the most definitive solution, then you should go with the low-jack approach, which I still prefer. Create <img> tags for the images at the end of your document, set their width and height to 1x1 pixel, and additionally put them in a hidden div. If they are at the end of the page, they will be loaded after the other content.
As of January 2013 none of the methods described here worked for me, so here's what did instead, tested and working with Chrome 25 and Firefox 18. Uses jQuery and this plugin to work around the load event quirks:
function preload(sources, callback) {
    if (sources.length) {
        var preloaderDiv = $('<div style="display: none;"></div>').prependTo(document.body);

        $.each(sources, function(i, source) {
            $("<img/>").attr("src", source).appendTo(preloaderDiv);

            if (i == (sources.length - 1)) {
                $(preloaderDiv).imagesLoaded(function() {
                    $(this).remove();
                    if (callback) callback();
                });
            }
        });
    } else {
        if (callback) callback();
    }
}
Usage:
preload(['/img/a.png', '/img/b.png', '/img/c.png'], function() {
    console.log("done");
});
Note that you'll get mixed results if the cache is disabled, which it is by default on Chrome when the developer tools are open, so keep that in mind.
In my opinion, using multipart XMLHttpRequest, introduced by some libraries, will be the preferred solution in the coming years. However, IE < 8 still doesn't support data: URIs (even IE8 has limited support, allowing up to 32 KB). Here is an implementation of parallel image preloading - http://code.google.com/p/core-framework/wiki/ImagePreloading - it's bundled in a framework, but it's still worth a look.
This was from a long time ago, so I don't know how many people are still interested in preloading an image.
My solution was even simpler: I just used CSS.
#hidden_preload {
    height: 1px;
    left: -20000px;
    position: absolute;
    top: -20000px;
    width: 1px;
}
Here is my simple solution, with a fade-in on the image after it is loaded:
function preloadImage(_imgUrl, _container) {
    var image = new Image();
    image.src = _imgUrl;
    image.onload = function() {
        $(_container).fadeTo(500, 1);
    };
}
For my use case I had a carousel with full-screen images that I wanted to preload. However, since the images display in order and could take a few seconds each to load, it's important that I load them in order, sequentially.
For this I used the async library's waterfall() method (https://github.com/caolan/async#waterfall)
// Preload all images in the carousel in order.
image_preload_array = [];
$('div.carousel-image').each(function() {
    var url = $(this).data('image-url');
    image_preload_array.push(function(callback) {
        var $img = $('<img/>');
        $img.load(function() {
            callback(null);
        })[0].src = url;
    });
});

async.waterfall(image_preload_array);
This works by creating an array of functions; each function is passed a callback() parameter, which it needs to execute in order to call the next function in the array. The first parameter of callback() is an error, which will end the sequence if a non-null value is provided, so we pass null each time.
See this:
http://www.mattfarina.com/2007/02/01/preloading_images_with_jquery
Related question on SO:
jquery hidden preload
