Storing execution state in JavaScript? Can you resume later?

Is there any way to do this? I want to create a JavaScript application that can "resume" execution from the last checkpoint, e.g.:
//code.js
var abc = 13;
checkpoint("myLocalStorage");
alert(abc);
The checkpoint function would store all information about the current execution so that, in a future run, execution can be resumed right where it left off, like this:
//resume.js
resume("myLocalStorage");
This would be very helpful for executing long scripts or scripts with huge loops. I'm not talking about some tiny script that preloads images or does a little animation; I'm talking about using JavaScript as a real number-crunching tool, where execution can take a long time and demands serious computing power. In that context you can see how useful execution checkpointing could be!
I suppose such a thing doesn't exist for JavaScript yet, but if anyone has come close to something that remotely resembles it, I would still be very grateful.

In order to make something "suspend-able" in JavaScript, you need to formulate things a little differently than you would in a normal program.
Step 1
Decide how much of the problem you are able to do in one pass, for lack of a better word.
Step 2
Store the state in some kind of object. Keep no intermediate values, just exactly what is needed to make the next pass.
Step 3
Write the code so it can run with a window.setTimeout() function. This makes testing much easier than reloading the page.
In this case, I have a program that converts my whole name to lower case, one step at a time. The only data I need to save is my name, and an index of where along the calculations I am.
Example 1: Uses setTimeout()
<html>
<head>
<title>Test Thingy</title>
</head>
<body>
<script>
var data = {
    name: ["Jeremy", "J", "Starcher"],
    idx: 0
};

function doPass() {
    // If at the end of the list
    if (data.idx >= data.name.length) {
        alert("All iterations done: " + data.name.join(" "));
        return;
    }
    // Do our calculation here
    var s = data.name[data.idx];
    s = s.toLowerCase();
    data.name[data.idx] = s;
    data.idx++;
    window.setTimeout(doPass);
}
doPass();
</script>
</body>
</html>
Example 2: Uses localStorage. Hit 'reload' 4 times to test
<html>
<head>
<title>Test Thingy</title>
</head>
<body>
<script>
var data = localStorage.getItem("data");
if (data) {
    data = JSON.parse(data);
} else {
    data = {
        name: ["Jeremy", "J", "Starcher"],
        idx: 0
    };
}

function doPass() {
    // If at the end of the list
    if (data.idx >= data.name.length) {
        alert("All iterations done: " + data.name.join(" "));
        return;
    }
    // Do our calculation here
    var s = data.name[data.idx];
    alert(s);
    s = s.toLowerCase();
    data.name[data.idx] = s;
    data.idx++;
    localStorage.setItem("data", JSON.stringify(data));
}
doPass();
</script>
</body>
</html>

JavaScript is not designed to be a "real number crunching tool where execution can take a long time and demands huge computing power". For persisting state between runs, the best you'll get is HTML5 web storage: http://www.w3schools.com/html/html5_webstorage.asp
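For what it's worth, here is a minimal sketch of the checkpoint()/resume() idea from the question built on web storage. It can only persist state you register explicitly (localStorage cannot capture the call stack, local variables, or closures), so the shape of the state object below is just illustrative:
function checkpoint(key, state) {
    // Persist only serializable data; functions and closures are lost
    localStorage.setItem(key, JSON.stringify(state));
}

function resume(key) {
    var saved = localStorage.getItem(key);
    return saved ? JSON.parse(saved) : null; // null means "start from scratch"
}

// Usage: restore the previous state if any, do a bit of work, checkpoint again
var state = resume("myLocalStorage") || { abc: 13, step: 0 };
state.step++;
checkpoint("myLocalStorage", state);
alert(state.abc);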

The latest browsers support generators (yield), which are worth looking into:
https://developer.mozilla.org/en-US/docs/JavaScript/New_in_JavaScript/1.7
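Purely as a hedged sketch (not from the linked page): with the modern function* generator syntax that later standardized JavaScript 1.7's yield, a long computation can suspend itself and be resumed exactly where it stopped, within a single page session:
function* crunchNumbers(limit) {
    var total = 0;
    for (var i = 1; i <= limit; i++) {
        total += i;
        if (i % 1000 === 0) {
            yield i; // suspend here; the caller decides when to resume
        }
    }
    return total;
}

var job = crunchNumbers(100000);

function step() {
    var r = job.next(); // resumes right after the last yield
    if (!r.done) {
        setTimeout(step); // give the browser a breather, then continue
    } else {
        alert("Total: " + r.value);
    }
}
step();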

Related

How can I prevent reloading of javascript in camel?

I am a beginner with Camel.
I used JavaScript to implement validation logic in a Camel XML route.
Initially it takes some time to load the JavaScript when the first event (a file with some records) comes in. That is fine: only the first record is slow because of the script loading time, and the rest of the records are processed normally.
The problem is when the next event (a file) comes in.
Camel loads the JavaScript again, so every file pays the loading time and the overall performance is degraded.
I want to modify the logic so that Camel loads the script only once.
How can I solve this problem?
<unmarshal id="_FileParsing">
    <bindy classType="com.openmzn.ktds.dao.volte.input.VoLTEBody"
        locale="korea" type="Fixed"/>
</unmarshal>
<to id="_validateParsing" uri="language:javascript:classpath:spring/rules/volte/volte.js"/>
<multicast id="_FileDistributor" parallelProcessing="false">
    <toD id="_ProcessNRat" uri="direct:NRAT"/>
    <toD id="_ProcessDrop" uri="direct:DROP"/>
</multicast>
Javascript File
var bodyList = exchange.in.getBody(ArrayList.class);
if (!CollectionUtils.isEmpty(bodyList)) {
    for (total_count = 0; total_count < bodyList.size(); total_count++) {
        uBody = bodyList[total_count];
        enriched = enrich(uBody);
        result = validate(enriched);
        resultList.add(result);
        ...
    }
}

function enrich(uBody) {
    ...
}

function validate(enriched) {
    ...
}
You can turn on cacheScript=true so the compiled script is cached and reused; see the docs:
https://github.com/apache/camel/blob/master/docs/components/modules/ROOT/pages/language-component.adoc
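Applied to the route in the question, that is just a matter of appending the option to the endpoint URI (a sketch, reusing the id and script path from the question):
<to id="_validateParsing" uri="language:javascript:classpath:spring/rules/volte/volte.js?cacheScript=true"/>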

How to structure my code to return a callback?

So I've been stuck on this for quite a while. I asked a similar question here: How exactly does done() work and how can I loop executions inside done()?
but I guess my problem has changed a bit.
So the thing is, I'm loading a lot of streams and it's taking a while to process them all. To make up for that, I want to at least display the streams that have already been processed on my web page while continuing to process the stream of tweets at the same time.
loadTweets: function(username) {
    $.ajax({
        url: '/api/1.0/tweetsForUsername.php?username=' + username
    }).done(function (data) {
        var json = jQuery.parseJSON(data);
        var jsonTweets = json['tweets'];
        $.Mustache.load('/mustaches.php', function() {
            for (var i = 0; i < jsonTweets.length; i++) {
                var tweet = jsonTweets[i];
                var optional_id = '_user_tweets';
                $('#all-user-tweets').mustache('tweets_tweet', { tweet: tweet, optional_id: optional_id });
                configureTweetSentiment(tweet);
                configureTweetView(tweet);
            }
        });
    });
}
This is pretty much the structure of my code right now. I guess the problem is the for loop, because nothing displays until the loop is done. So I have two questions.
How can I get the stream of tweets to display on my website as they're processed?
How can I make sure the Mustache.load() is only executed once while doing this?
The problem is that UI updates and JavaScript operations all run in the same thread. To solve this, use a timer so that the JavaScript work is queued up behind the pending UI operations. You can also pass a small interval (around 4 ms) so that browsers with a slower JS engine can keep up smoothly.
...
var i = 0;
var timer = setInterval(function() {
    var tweet = jsonTweets[i++];
    var optional_id = '_user_tweets';
    $('#all-user-tweets').mustache('tweets_tweet', {
        tweet: tweet,
        optional_id: optional_id
    });
    configureTweetSentiment(tweet);
    configureTweetView(tweet);
    if (i === jsonTweets.length) {
        clearInterval(timer);
    }
}, 4); // Interval between loading tweets
...
NOTE
The solution is based on the following assumptions:
You are manipulating the DOM in the configureTweetSentiment and configureTweetView methods.
Ideally, the solution above would not be the best one. Instead you should build all the HTML in JavaScript first and append the final HTML string to the container in one go at the end; you would see a drastic change in performance (seriously!). See the sketch after this list.
You don't want to use web workers because they are not supported in old browsers. If that's not the case, and you are not manipulating the DOM in the configure methods, then web workers are the way to go for data-intensive operations.
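A minimal sketch of that "build first, append once" idea, assuming a hypothetical renderTweet() helper that returns the HTML string for one tweet; the point is the single DOM write at the end:
var parts = [];
for (var i = 0; i < jsonTweets.length; i++) {
    parts.push(renderTweet(jsonTweets[i])); // hypothetical helper returning an HTML string
}
$('#all-user-tweets').append(parts.join('')); // one DOM write instead of one per tweet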

How can I update the DOM while running intense Javascript?

I'm writing JavaScript which is counting up to a certain number for a project. The number could be around 100,000 and it will take roughly 10-15 seconds to complete processing.
I want the script to run as soon as the user calls the page and when the script completes it does a redirect.
Is it possible to pause for even 10ms to update the DOM while it is running to give feedback such as "Still working"?
I would like to avoid the use of jQuery and web-workers are not an option in this situation.
I realise this isn't a real world application!
EDIT: Added some of the code as a sample:
In the head
function myCounter(target) {
    var t = target;
    var count = 0;
    while (t != count) {
        count++;
    }
    // the loop has finished counting up to the target
    window.location.replace("http://example.com"); // redirect
}
In the body
<script>myCounter(100000);</script>
In most browsers JavaScript and the UI run in the same thread. You can give the thread back to the browser by using setTimeout (or setInterval).
var limit = 100000; // the target from the question
var myNumber = 0;

updateNumber();

function updateNumber() {
    // do a chunk of the heavy work for great good, then give the UI a chance to update
    for (var i = 0; i < 1000 && myNumber < limit; i++) {
        myNumber++;
    }
    document.getElementById('updateDiv').innerHTML = 'still working, up to ' + myNumber;
    if (myNumber < limit) {
        setTimeout(updateNumber, 20);
    }
}
For a lot more details on the general process, this answer is worth a read: https://stackoverflow.com/a/4575011/194940

Javascript: Suppress "Stop running this script?", or how to use setTimeout?

I'm building a js library that reads binary files, including zip files.
Because there's no direct native support for arrays of binary data, when the zip files get large, there's a lot of copying that has to go on (See my other question for details).
This results in a "Stop Running this script?" alert. Now, I know this can happen if there's an infinite loop, but in my case, it's not an infinite loop. It's not a bug in the program. It just takes a loooooong time.
How can I suppress this?
This message is enabled for security reasons; otherwise you could block the user's browser with a simple never-ending loop. I don't think there is any way to deactivate it.
Think about splitting your processing into several parts and scheduling them via setTimeout. That should suppress the message, because the script is then not running the whole time.
You could divide the process into increments, then use setTimeout to add a small delay.
In IE (and maybe Firefox too), the message is based on the number of statements executed, not the running time. If you can split some of the processing into a separate function, and defer it with setTimeout, I believe that it won't count toward the limit.
...answering my own question, so I could post the code I used.
The main issue was that I was reading the entire contents of a file with a readToEnd() method, which actually reads one byte at a time. With a large file, that took a looooong time. The solution was to read asynchronously, in batches.
This is the code I used:
readToEndAsync: function(callback) {
    _state = "";
    var slarge = "";
    var s = "";
    var txtrdr = this;

    var readBatchAsync = function() {
        var c = 0;
        var ch = txtrdr.readChar();
        while (ch != null) {
            s += ch;
            c++;
            if (c > 1024) {
                slarge += s;
                s = "";
                break;
            }
            ch = txtrdr.readChar();
        }
        if (ch != null) {
            setTimeout(readBatchAsync, 2);
        } else {
            callback(slarge + s);
        }
    };

    // kickoff
    readBatchAsync();
    return null;
},
And to call it:
textReader.readToEndAsync(function(out){
say("The callback is complete");
// the content is in "out"
});
I believe this feature is specific to Firefox and/or other browsers; it has nothing to do with the JavaScript language itself.
As far as I know, you (the programmer) have no way of turning it off in your visitors' browsers.

How to stop intense Javascript loop from freezing the browser

I'm using Javascript to parse an XML file with about 3,500 elements. I'm using a jQuery "each" function, but I could use any form of loop.
The problem is that the browser freezes for a few seconds while the loop executes. What's the best way to stop freezing the browser without slowing the code down too much?
$(xmlDoc).find("Object").each(function() {
//Processing here
});
I would ditch the "each" function in favour of a for loop, since it is faster. I would also add some waits using setTimeout, but only every so often and only if needed. You don't want to wait 5 ms each time, because then processing 3,500 records would take approximately 17.5 seconds.
Below is an example using a for loop that processes 100 records at a time (you can tweak that) at 5 ms intervals, which adds a 175 ms overhead.
var xmlElements = $(xmlDoc).find('Object');
var length = xmlElements.length;
var index = 0;

var process = function() {
    for (; index < length; index++) {
        var toProcess = xmlElements[index];
        // Perform xml processing
        if ((index + 1) % 100 == 0 && index + 1 < length) {
            // Another 100 records done: yield to the browser, then resume where we left off
            index++;
            setTimeout(process, 5);
            return;
        }
    }
};
process();
I would also benchmark the different parts of the XML processing to see if there is a bottleneck somewhere that can be fixed. You can benchmark in Firefox using Firebug's profiler and by writing out to the console like this:
// start benchmark
var t = new Date();
// some xml processing
console.log("Time to process: " + (new Date() - t) + "ms");
Hope this helps.
Set a timeout between batches of processing to prevent the loop from eating up all the browser's resources. In total it would only take a few seconds to process and loop through everything, which is not unreasonable for 3,500 elements.
var xmlElements = $(xmlDoc).find('Object').get(); // .get() gives a plain array, so shift() works
var processing = function() {
    var element = xmlElements.shift();
    // process element;
    if (xmlElements.length > 0) {
        setTimeout(processing, 5);
    }
};
processing();
I'd consider converting the 3,500 elements from XML to JSON server-side, or even better, uploading them to the server already converted, so that the data is native to JS from the get-go.
This would minimize your load and probably make the file size smaller too.
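As a rough sketch of what the client side could look like once the data is served as JSON (the /objects.json endpoint is hypothetical):
$.getJSON('/objects.json', function (objects) { // hypothetical JSON mirror of the XML file
    for (var i = 0; i < objects.length; i++) {
        // Processing here -- plain objects, no XML traversal needed
        var item = objects[i];
    }
});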
You can call setTimeout() with a duration of zero and it will yield as desired.
Long loops without freezing the browser is possible with the Turboid framework. With it, you can write code like:
loop(function(){
// Do something...
}, number_of_iterations, number_of_milliseconds);
More details in this turboid.net article: Real loops in Javascript
Javascript is single-threaded, so aside from setTimeout, there's not much you can do. If using Google Gears is an option for your site, they provide the ability to run javascript in a true background thread.
You could use the HTML5 workers API, but that will only work on Firefox 3.1 and Safari 4 betas atm.
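For what it's worth, a minimal sketch of the workers approach (the worker.js file name and doHeavyProcessing() helper are hypothetical; workers have no DOM access, so the data has to be handed over as a string or plain objects):
// main page
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    // e.data holds the processed results; update the page here
};
worker.postMessage(xmlString); // e.g. the raw XML as text

// worker.js
self.onmessage = function (e) {
    // Heavy per-element processing happens off the UI thread
    var results = doHeavyProcessing(e.data); // hypothetical helper
    self.postMessage(results);
};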
I had the same problem, which happened when the user refreshed the page repeatedly. The cause was two nested for loops that ran more than 52,000 times. The problem was harsher in Firefox 24 than in Chrome 29, since Firefox would crash sooner (around 2,000 ms sooner than Chrome). What I did, and it worked, was to use "for" loops instead of each, and then refactor the code so that the whole loop was divided into 4 separate calls whose results were merged back into one. This solution has proven to work.
Something like this:
var entitiesToLoop = ["..."]; // Mainly a big array

var loopForSubset = function (startIndex, endIndex) {
    for (var i = startIndex; i < endIndex; i++) {
        // Do your stuff as usual here
    }
};

loopForSubset(0, firstInterval);
loopForSubset(firstInterval, secondInterval);
...
The other solution that also worked for me was the same approach implemented with the HTML5 Workers API. Use the same concept in workers: they keep your browser from freezing because they run off the main thread. If simply applying this with the Workers API does not work, place each instance of loopForSubset in a different worker and merge the results in the main caller of the workers.
I mean, this might not be perfect, but it has worked. I can help with more real code chunks if someone still thinks this might suit them.
You could try deferring each element's processing with setTimeout, something like:
$(xmlDoc).find("Object").each(function(index, element) {
    setTimeout(function() {
        // your stuff with the element goes here
    }, 0);
});
This won't harm you much ;)
As a modification of tj111's answer, here is the full usable code:
//add pop and shift functions to the jQuery library. Put this somewhere in your code.
//The pop function is not used here, but you can use it in other parts of your code.
(function( $ ) {
    $.fn.pop = function() {
        var top = this.get(-1);
        this.splice(this.length - 1, 1);
        return top;
    };
    $.fn.shift = function() {
        var bottom = this.get(0);
        this.splice(0, 1);
        return bottom;
    };
})( jQuery );
//the core of the code:
var $divs = $('body').find('div'); //.each();
var s = $divs.length;
var mIndex = 0;

var process = function() {
    var div = $divs.shift(); // take the next element and remove it from the collection
    //here your own code.
    //progress bar:
    mIndex++;
    // e.g.: progressBar(mIndex/s*100.,$pb0);
    //start new iteration.
    if ($divs.length > 0) {
        setTimeout(process, 5);
    } else {
        //when calculations are finished.
        console.log('finished');
    }
};
process();
