How to improve the performance of my JS game - javascript

I started learning JavaScript a couple of days ago, finished the Codecademy material, and thought I would try to make a simple game.
So I came up with a memory game where you have to find pairs of images.
It is all working and I have a score system in place, but a few people have said that the delay after the two cards have been chosen, before another choice is allowed, is hindering them, and I can't figure out how to improve that performance.
Here is the bit of code I think is causing the delay. Is there a better way to produce the same result? Sorry, I am new to all this.
function check() {
    clearInterval(tid);
    if (people[secondchocie] === people[firstchocie]) {
        cntr++;
        if (cntr === numOfMatches) {
            stop();
            score = checkScore(amountGoes);
            $('#gameFinished').append('<p>Well done, you managed to complete the game your score is <span>' + score + '</span></p>');
        }
        turns = 0;
        return;
    } else {
        document.images[firstchocie + numOfImages].src = backcard;
        document.images[secondchocie + numOfImages].src = backcard;
        turns = 0;
        return;
    }
}

I can't create comments, so I'll put this in an answer.
Although I agree with lukas.pukenis ...
Changing images can take some time if they aren't preloaded. To test this, try to get them into the browser cache by adding them somewhere else in the page (e.g. with an IMG tag) before starting the game.
Then you'll be sure they are in the cache.
edit:
I recently used this:
var cache = [];
function preLoadImages(arrImg) {
    var args_len = arrImg.length;
    for (var i = args_len; i--;) {
        var cacheImage = document.createElement('img');
        cacheImage.src = arrImg[i];
        cache.push(cacheImage);
    }
}
preLoadImages(['images/img1.png', 'images/img2.png', 'images/img3.png']);
You can add all the images you need to the JavaScript array.
If you're a quick study :) you can do the following:
If your page is generated by PHP, you could let PHP read the entire images directory and write the filenames into the page as JavaScript code.
Or you could create an AJAX request which returns all paths to the images and passes them to the preload function in a callback; a sketch of that approach is below.
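A minimal sketch of the AJAX route with jQuery, assuming a hypothetical endpoint (/api/imageList.php here) that returns the image paths as a JSON array:
$.getJSON('/api/imageList.php', function (paths) {
    // "paths" is expected to be something like ["images/img1.png", "images/img2.png", ...]
    preLoadImages(paths); // reuse the preload function above
});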


Vimeo video, progress and disable fast forward

I am working on a site used for mandatory instruction. We have to make sure the student watches the video and doesn't fast forward. I would also like to remember the progress the student made in the video in case they need to leave then return later to complete watching.
I have used this JS to remove the ability to fast forward. I'm just not sure how to get the code to remember the progress, then start at that point if the student returns later.
var iframe = document.querySelector("iframe");
var player = new Vimeo.Player(iframe);
var timeWatched = 0;
player.on("timeupdate", function(data) {
    if (data.seconds - 1 < timeWatched && data.seconds > timeWatched) {
        timeWatched = data.seconds;
    }
});
player.on("seeked", function(data) {
    if (timeWatched < data.seconds) {
        player.setCurrentTime(timeWatched);
    }
});
Thanks for any help on this.
You can store the current time in a database for future use and then pass it to the JS whenever the user views the video.
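For example, a minimal client-side sketch building on the code above, assuming the server renders the stored position into the page as savedProgress and exposes a hypothetical /api/saveProgress.php endpoint:
var savedProgress = 0; // assume this value is filled in from your database by the server

// Resume from the stored position once the player is ready.
player.ready().then(function () {
    timeWatched = savedProgress;
    if (savedProgress > 0) {
        player.setCurrentTime(savedProgress);
    }
});

// Periodically persist the furthest point the student has legitimately watched.
setInterval(function () {
    fetch('/api/saveProgress.php', {        // hypothetical endpoint
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ seconds: timeWatched })
    });
}, 10000);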

Why/How is my code causing a memory leak?

I have written the following JavaScript code within a Spotfire TextArea. I include the application and tag for completeness, but I don't believe my issue is Spotfire-specific. Essentially, I have a timer which runs every 5 minutes and clicks a link (clickLink('Foo');) to trigger execution of some Python code elsewhere in the application. The application also contains a timestamp of the last full update, which occurs every 30 minutes in the same manner (clickLink('Foo');):
function reportTimestamp() {
    var timeNow = new Date();
    var hoursNow = timeNow.getHours();
    var minutesNow = timeNow.getMinutes();
    var secondsNow = timeNow.getSeconds();
    return hoursNow + ":" + minutesNow + ":" + secondsNow;
};
function timeBasedReaction(timestampAge) {
    if (timestampAge >= 1800) {
        clickLink('Foo');
        clickLink('Bar');
    } else if (timestampAge >= 300) {
        clickLink('Foo');
    };
};
/*
function timeBasedReaction_B(timestampAge) {
    if (timestampAge >= 300) {
        clickLink('Foo');
        if (timestampAge >= 1800) {
            clickLink('Bar');
        };
    };
};
*/
function clickLink(linkName) {
    var clickTarget = document.getElementById(linkName).children[0];
    clickTarget.click();
};
function checkTimestampAge() {
    console.log(reportTimestamp());
    var myTimeStamp = document.getElementById('Timestamp').children[0];
    var timeStampMS = new Date(myTimeStamp.textContent).getTime();
    var currentDateMS = new Date().getTime();
    var timestampAgeSeconds = (currentDateMS - timeStampMS) / 1000;
    timeBasedReaction(timestampAgeSeconds);
};
function pageInitialization() {
    checkTimestampAge();
    var myTimer = null;
    var timerInterval = 300000;
    myTimer = setInterval(function(){ checkTimestampAge() }, timerInterval);
}
pageInitialization();
For reasons unclear to me, running this code in the application or in a web browser starts off fine, but eventually leads to very large memory allocation.
I've tried to read
4 Types of Memory Leaks in JavaScript and How to Get Rid Of Them,
JS setInterval/setTimeout Tutorial, and
An interesting kind of JavaScript memory leak, and it's a start, but I don't know enough to really understand what I'm doing wrong and how to correct it.
Thanks, and sorry for the huge block of text.
This causes a memory leak because of how Spotfire handles Javascript which has been associated with/loaded into a TextArea.
Both in the desktop client and in the Webplayer instance, when the page is loaded, all portions of that page are loaded, including the TextArea and the JavaScript associated with it. My previous understanding in the comments above:
"the code is intended to run when the page loads, and it was my understanding that it would stop/be cleared if the page was re-loaded or someone navigated away from it"
was incorrect. One of the script's actions was to update/redraw the HTML location in the TextArea. This, in turn, reloads the TextArea but does not clear the existing Javascript code. However, it's not really accessible anymore, either, since var myTimer = null actually creates a new myTimer rather than nulling-out the existing one. In this way, instances of myTimer increase geometrically as instances of function timeBasedReaction run and continually update the underlying TextArea and load in more of the same Javascript.
To anyone who has a similar issue and comes here: it's been over 3 months and I haven't figured out how to solve this once and for all. If I do, I'll try to come back with another update.
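For what it's worth, here is a minimal sketch of the kind of guard worth trying, assuming the re-executed script still sees the same window object (the __reportTimer name is made up and untested in Spotfire):
function pageInitialization() {
    // If a previous run of this script already started a timer, clear it
    // before creating a new one, so intervals don't stack up on each reload.
    if (window.__reportTimer) {
        clearInterval(window.__reportTimer);
    }
    checkTimestampAge();
    window.__reportTimer = setInterval(checkTimestampAge, 300000);
}
pageInitialization();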

How to structure my code to return a callback?

So I've been stuck on this for quite a while. I asked a similar question here: How exactly does done() work and how can I loop executions inside done()?
but I guess my problem has changed a bit.
So the thing is, I'm loading a lot of streams and it's taking a while to process them all. To make up for that, I want to at least load the streams that have already been processed onto my webpage, and continue processing the stream of tweets at the same time.
loadTweets: function(username) {
    $.ajax({
        url: '/api/1.0/tweetsForUsername.php?username=' + username
    }).done(function (data) {
        var json = jQuery.parseJSON(data);
        var jsonTweets = json['tweets'];
        $.Mustache.load('/mustaches.php', function() {
            for (var i = 0; i < jsonTweets.length; i++) {
                var tweet = jsonTweets[i];
                var optional_id = '_user_tweets';
                $('#all-user-tweets').mustache('tweets_tweet', { tweet: tweet, optional_id: optional_id });
                configureTweetSentiment(tweet);
                configureTweetView(tweet);
            }
        });
    });
}
This is pretty much the structure of my code right now. I guess the problem is the for loop, because nothing will display until the for loop is done. So I have two questions.
How can I get the stream of tweets to display on my website as they're processed?
How can I make sure the Mustache.load() is only executed once while doing this?
The problem is that UI manipulation and JS operations all run in the same thread. To solve this you should use a timer (setTimeout/setInterval) so that the JS operations are queued behind the pending UI operations. You can also pass a small time interval (around 4 ms) so that browsers with a slower JS engine can also perform smoothly.
...
var i = 0;
var timer = setInterval(function() {
    var tweet = jsonTweets[i++];
    var optional_id = '_user_tweets';
    $('#all-user-tweets').mustache('tweets_tweet', {
        tweet: tweet,
        optional_id: optional_id
    });
    configureTweetSentiment(tweet);
    configureTweetView(tweet);
    if (i === jsonTweets.length) {
        clearInterval(timer);
    }
}, 4); //Interval between loading tweets
...
NOTE
The solution is based on the following assumptions -
You are manipulating the dom with the configureTweetSentiment and the configureTweetView methods.
Ideally, the solution provided above would not be the best one. Instead, you should build all the HTML first in JavaScript and append the final HTML string to a div at the end. You would see a drastic change in performance (seriously!); see the sketch after this list.
You don't want to use web workers because they are not supported in old browsers. If that's not the case, and you are not manipulating the DOM in the configure methods, then web workers are the way to go for data-intensive operations.
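A minimal sketch of that build-first idea, where renderTweetHtml is a hypothetical helper standing in for whatever produces a single tweet's markup as a string:
// Build all the markup in memory, then touch the DOM only once.
var parts = [];
for (var i = 0; i < jsonTweets.length; i++) {
    parts.push(renderTweetHtml(jsonTweets[i], '_user_tweets')); // hypothetical helper
}
$('#all-user-tweets').append(parts.join(''));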

setInterval, animating multiple images, stopping setInterval

I need to create some animation using DHTML/JavaScript and I am struggling to get anything I do to work. My requirements are that it must use the setInterval function, a user-defined function, and 8 JPG files. There must also be a way of stopping the animation.
If someone could get me pointed in the right direction I would be very happy. I have not been able to find suitable information on how to do this so far and I am still fairly new to Javascript. Thanks.
Sorry for not posting code earlier. It was such a mess I didn't want to embarrass myself. Here's what I have. It's not working.
var slideShow = ['images/pic0.jpg','images/pic1.jpg','images/pic2.jpg','images/pic3.jpg','images/pic4.jpg','images/pic5.jpg','images/pic6.jpg','images/pic7.jpg'];
picO = new Array();
for (i = 0; i < slideShow.length; i++) {
    picO[i] = new Image();
    picO[i].src = slideShow[i];
}
var curPic = -1;
function changeImage() {
    curPic = (++curPic > slideShow.length - 1) ? 0 : curPic;
    imgO.src = picO[curPic].src;
    setInterval(changeImage, 100);
}
window.onload = function() {
    imgO = document.getElementById("imgAnim");
    changeImage();
}
var t = setTimeout(function(){ alert("Welcome to my animated page") }, 3000)
The other thing it needs to do is pop up an alert box 3 seconds after the animation starts. It's doing that but it's popping up EVERY 3 seconds, which is not what I want.
Thanks for your help so far. I've done pretty well with my Javascript work lately but this one is just something I'm not that familiar with.
On using setInterval and stopping the animation:
var animation = setInterval(yourAnimation,500); // run yourAnimation every 500ms
clearInterval(animation); // stop animation
And for the animation itself:
var slideShow = ["img1.jpg","img2.jpg", ... ,"img8.jpg"];
var counter = 0;
function yourAnimation() {someImage.src = slideShow[++counter%slideShow.length];}
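Putting that together with your code, here is a minimal sketch (assuming an element with id imgAnim, and a made-up stopBtn element for whatever control should stop the animation):
var slideShow = ['images/pic0.jpg', 'images/pic1.jpg', 'images/pic2.jpg', 'images/pic3.jpg',
                 'images/pic4.jpg', 'images/pic5.jpg', 'images/pic6.jpg', 'images/pic7.jpg'];
var curPic = -1;
var imgO, animTimer;

function changeImage() {
    curPic = (curPic + 1) % slideShow.length;
    imgO.src = slideShow[curPic];
}

window.onload = function () {
    imgO = document.getElementById("imgAnim");
    animTimer = setInterval(changeImage, 100);   // start the animation once, outside changeImage
    setTimeout(function () {                     // fires once, 3 seconds after the animation starts
        alert("Welcome to my animated page");
    }, 3000);
    // "stopBtn" is a made-up id for whatever control stops the animation.
    document.getElementById("stopBtn").onclick = function () {
        clearInterval(animTimer);
    };
};
Your image preloading loop (the picO array) can stay as it is; the important change is that setInterval is called once, outside changeImage, and its id is kept so clearInterval can stop it.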

How to stop intense Javascript loop from freezing the browser

I'm using Javascript to parse an XML file with about 3,500 elements. I'm using a jQuery "each" function, but I could use any form of loop.
The problem is that the browser freezes for a few seconds while the loop executes. What's the best way to stop freezing the browser without slowing the code down too much?
$(xmlDoc).find("Object").each(function() {
//Processing here
});
I would ditch the "each" function in favour of a for loop since it is faster. I would also add some waits using the "setTimeout" but only every so often and only if needed. You don't want to wait for 5ms each time because then processing 3500 records would take approx 17.5 seconds.
Below is an example using a for loop that processes 100 records (you can tweak that) at 5 ms intervals which gives a 175 ms overhead.
var xmlElements = $(xmlDoc).find('Object');
var length = xmlElements.length;
var index = 0;
var process = function() {
    for (; index < length; index++) {
        var toProcess = xmlElements[index];
        // Perform xml processing
        if (index + 1 < length && (index + 1) % 100 == 0) {
            index++;                // move past the record we just processed
            setTimeout(process, 5); // yield to the browser, then continue from here
            return;
        }
    }
};
process();
I would also benchmark the different parts of the xml processing to see if there is a bottleneck somewhere that may be fixed. You can benchmark in firefox using firebug's profiler and by writing out to the console like this:
// start benchmark
var t = new Date();
// some xml processing
console.log("Time to process: " + new Date() - t + "ms");
Hope this helps.
Set a timeOut between processing to prevent the loop cycle from eating up all the browser resources. In total it would only take a few seconds to process and loop through everything, not unreasonable for 3,500 elements.
var xmlElements = $(xmlDoc).find('Object');
var processing = function() {
    var element = xmlElements.shift();
    //process element;
    if (xmlElements.length > 0) {
        setTimeout(processing, 5);
    }
}
processing();
I'd consider converting the 3,500 elements from XML to JSON server-side, or even better, uploading the data to the server already converted, so that it's native to JS from the get-go.
This would minimize your load and probably make the file size smaller too; a quick sketch is below.
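A minimal sketch of the JSON route with jQuery, assuming a hypothetical /objects.json endpoint that serves the converted data:
$.getJSON('/objects.json', function (objects) {   // hypothetical endpoint
    for (var i = 0; i < objects.length; i++) {
        // process objects[i] directly; no XML parsing needed on the client
    }
});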
You can call setTimeout() with a duration of zero and it will yield as desired.
Long loops without freezing the browser are possible with the Turboid framework. With it, you can write code like:
loop(function(){
// Do something...
}, number_of_iterations, number_of_milliseconds);
More details in this turboid.net article: Real loops in Javascript
Javascript is single-threaded, so aside from setTimeout, there's not much you can do. If using Google Gears is an option for your site, they provide the ability to run javascript in a true background thread.
You could use the HTML5 workers API, but that will only work on Firefox 3.1 and Safari 4 betas atm.
I had the same problem; it happened when the user refreshed the page repeatedly. The cause was two nested for loops that ran more than 52,000 times. The problem was harsher in Firefox 24 than in Chrome 29, since Firefox would crash sooner (around 2000 ms sooner than Chrome). What I did, and it worked, was to use "for" loops instead of each, and then refactor the code so that the whole loop array was divided into 4 separate calls, merging the results into one at the end. This solution has proven to work.
Something like this:
var entitiesToLoop = ["..."]; // Mainly a big array

var loopForSubset = function (startIndex, endIndex) {
    for (var i = startIndex; i < endIndex; i++) {
        //Do your stuff as usual here
    }
};

loopForSubset(0, firstInterval);
loopForSubset(firstInterval, secondInterval);
...
The other solution that also worked for me was the same approach implemented with the HTML5 Worker API. Use the same concept in workers; they keep your browser from freezing because they run in the background, off your main thread. If applying this with the Worker API alone does not work, place each instance of loopForSubset in a different worker and merge the results in the main caller of the workers.
This might not be perfect, but it has worked. I can help with more real code chunks if someone thinks this might suit them; a minimal worker sketch is below.
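A minimal sketch of the worker idea, assuming the heavy loop does not need to touch the DOM (worker.js, renderResults and processEntity are made-up names; entitiesToLoop is the big array from above):
// main page: hand the big array to a background thread, render when it comes back
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    renderResults(e.data);        // only the DOM work happens on the main thread
};
worker.postMessage(entitiesToLoop);

// worker.js: the CPU-heavy loop runs here, off the main thread
self.onmessage = function (e) {
    var results = [];
    for (var i = 0; i < e.data.length; i++) {
        results.push(processEntity(e.data[i]));
    }
    self.postMessage(results);
};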
You could try shortening the code to something like this:
$(xmlDoc).find("Object").each(function (index, element) {
    // each() gives the callback its own scope, so "element" is safely captured here
    setTimeout(function () {
        //your stuff with element goes here
    }, 0);
});
This won't harm you much ;)
As a modification of @tj111's answer, here is the full usable code:
//add pop and shift functions to the jQuery library. Put this somewhere in your code.
//the pop function is not used here, but you can use it in other parts of your code.
(function( $ ) {
    $.fn.pop = function() {
        var top = this.get(-1);
        this.splice(this.length - 1, 1);
        return top;
    };
    $.fn.shift = function() {
        var bottom = this.get(0);
        this.splice(0, 1);
        return bottom;
    };
})( jQuery );
//the core of the code:
var $divs = $('body').find('div');
var s = $divs.length;
var mIndex = 0;
var process = function() {
    var $current = $divs.first(); //the element to work on in this iteration
    //here your own code.
    //progress bar:
    mIndex++;
    // e.g.: progressBar(mIndex/s*100.,$pb0);
    //start new iteration.
    $divs.shift(); //remove the element we just processed
    if ($divs.length > 0) {
        setTimeout(process, 5);
    } else {
        //when calculations are finished.
        console.log('finished');
    }
}
process();
