Is there a way to "outsource" yield statements in JavaScript? - javascript

To give you an idea what I'm trying to do:
As someone who teaches sorting algorithms, I'd like to enable students to easily visualize how their sorting algorithms work, and I found generators to be really useful for that, since I can interrupt the execution at any point. The following code, which a student could write, can be turned into an animation by my library:
function* bubbleSort(arr) {
  let done = false;
  while (!done) {
    done = true;
    for (let i = 1; i < arr.length; i++) {
      yield { type: "comparison", indexes: [i - 1, i] };
      if (arr[i - 1] > arr[i]) {
        yield { type: "swap", indexes: [i - 1, i] };
        swap(arr, i - 1, i);
        done = false;
      }
    }
  }
  return "sorted";
}
This works quite well, but it would be a lot better if I could write functions compare(i,j) and swap(i,j) that handle the yield internally (and, in the case of compare, also return a boolean value). So I'd like to be able to express the above as:
function* bubbleSort(arr) {
  let done = false;
  while (!done) {
    done = true;
    for (let i = 1; i < arr.length; i++) {
      if (compare(i - 1, i)) {
        swap(arr, i - 1, i);
        done = false;
      }
    }
  }
  return "sorted";
}
Is there a way to do this?

You could just do
if (yield* compare(i - 1, i))
which will forward the yields inside compare to the outside, and use compare's return value as the result of the yield* expression.
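For illustration, here is a sketch of how that yield* delegation could look in full. The generator-based compare and swap helpers are an assumed design (the question only names them), not part of any existing library:

```javascript
// compare and swap are themselves generators: each yields its animation
// event, and compare additionally returns the boolean result. yield*
// forwards their yields to the outer generator and evaluates to their
// return value.
function* compare(arr, i, j) {
  yield { type: "comparison", indexes: [i, j] };
  return arr[i] > arr[j];
}

function* swap(arr, i, j) {
  yield { type: "swap", indexes: [i, j] };
  [arr[i], arr[j]] = [arr[j], arr[i]];
}

function* bubbleSort(arr) {
  let done = false;
  while (!done) {
    done = true;
    for (let i = 1; i < arr.length; i++) {
      if (yield* compare(arr, i - 1, i)) {
        yield* swap(arr, i - 1, i);
        done = false;
      }
    }
  }
  return "sorted";
}
```

Iterating the generator then produces one animation event per step, e.g. `[...bubbleSort([3, 1, 2])]` yields the comparison and swap events in order while sorting the array in place.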

Related

Calling functions inside generator functions, but "Generator is already running"

I am coding a project for a simple football game in JavaScript. When a player hikes the ball, I am attempting to run a series of functions in order to validate a legal snap. To do this, I am using a generator function, so that I can organize all the functions that run, as the order in which they run is important. Essentially, I run the generator function once using the snap() function, and then at the conclusion of each Check() function, I either return validateSnap.next() if it is a legal snap, or a fail function to exit out of the generator and handle an illegal snap. Here is a simplified version of my code below:
function* snapProtocol() {
  yield check1();
  yield check2();
  yield check3();
  yield check4();
  yield check5();
  yield play();
}

let validateSnap = snapProtocol();

function snap() {
  validateSnap.next();
}

function check1() {
  let meetsCriteria = true;
  if (meetsCriteria) {
    validateSnap.next();
  } else {
    handleError();
  }
}
I am receiving a "Generator is already running" error. I presume this is because the check1 function has not finished, but when I add a callback function, I get the same error. Why is this occurring? Is there a simpler method to accomplish this? Previously, I would run each check function and have it return either true or false: true for a legal snap, to move on to the next check function, or false to stop execution. This required me to declare a bunch of variables and to have an if statement after every function, so I was looking for a cleaner approach.
You are trying to call next() from inside a check function that the generator is still executing. You cannot call next() again until that function has returned; in short, you cannot call next() from inside a function that the generator itself invoked.
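The error is easy to reproduce in isolation (the names here are hypothetical, chosen to mirror the question):

```javascript
// The generator calls check(), which tries to re-enter the generator
// via next() while it is still mid-execution of its own next() call.
function* protocol() {
  yield check();
}

let it;

function check() {
  it.next(); // throws TypeError: Generator is already running
}

it = protocol();
// Calling it.next() here would raise the error from inside check().
```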
Maybe this example can be helpful.
function* snapProtocol() {
  yield check1();
  yield check1();
  yield check1();
  yield check1();
  yield check1();
}

let validateSnap = snapProtocol();

function snap() {
  let result = null;
  do {
    result = validateSnap.next();
    if (result.done) {
      // all criteria were met and play() can be called
      break;
    }
    if (result.value) {
      // meets the criteria, so check the next one
      continue;
    } else {
      // doesn't meet the criteria; handle the failure here
      break;
    }
  } while (!result.done);
}

function check1() {
  let meetsCriteria = true;
  if (meetsCriteria) {
    return true;
  } else {
    return false;
  }
}

Generators + async/await, yielding asynchronously [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I have 2 very related questions. One practical, relating to a real problem I have, and one more theoretical. Let me start with the latter:
Q1: Does async/await + generators make sense?
In general, it doesn't make too much sense to have yield inside of callbacks or promises. The problem (among others) is that there could be races between multiple yields, it's impossible to tell when the generator ends, etc.
function* this_is_nonsense() {
  setTimeout(() => { yield 'a'; }, 500);
  setTimeout(() => { yield 'b'; }, 500);
  return 'c';
}
When using async/await, these problems seem to mostly go away:
function timeout(ns) {
  return new Promise((resolve) => setTimeout(resolve, ns));
}

async function* this_could_maybe_make_sense() {
  // now stuff is sequenced instead of racing
  await timeout(500);
  yield 'a';
  await timeout(500);
  yield 'b';
  return 'c';
}
I assume nothing like this is currently possible (please correct me if wrong!). So my first question is, is there a conceptual issue with using generators with async/await? If not, is it on any kind of roadmap? It seems like implementation would require more than just mashing together the two features.
Q2: How to write lazy wiki crawler
This is a contrived version of a real code problem I have.
Imagine writing a function that does traversal of Wikipedia, starting from some arbitrary page and repeatedly following links (doesn't matter how - can be depth first, breadth first or something else). Assume we can asynchronously crawl Wikipedia pages to get linked pages. Something like:
async function traverseWikipedia(startLink) {
  const visited = {};
  const result = [];
  async function visit(link) {
    if (visited[link]) { return; }
    visited[link] = true;
    result.push(link);
    const neighboringLinks = await getNeighboringWikipediaLinks(link);
    for (let i = 0; i < neighboringLinks.length; i++) {
      await visit(neighboringLinks[i]);
    }
    // or to parallelize:
    /*
    await Promise.all(
      neighboringLinks.map(
        async (neighbor) => await visit(neighbor)
      )
    );
    */
  }
  await visit(startLink);
  return result;
}
This is fine... except that sometimes I only want the first 10 links, and this is going to crawl over tons of links. Sometimes I'm searching for the first 10 that contain the substring 'Francis'. I want a lazy version of this.
Conceptually, instead of a return type of Promise<Array<T>>, I want Generator<T>, or at least Generator<Promise<T>>. And I want the consumer to be able to do arbitrary logic, including asynchronous stopping condition. Here's a fake, totally wrong (i assume) version:
function* async traverseWikipedia(startLink) { // not real syntax
  const visited = {};
  function* async visit(link) {
    if (visited[link]) { return; }
    visited[link] = true;
    yield link;
    const neighboringLinks = await getNeighboringWikipediaLinks(link);
    for (let i = 0; i < neighboringLinks.length; i++) {
      yield* await visit(neighboringLinks[i]);
    }
  }
  yield* await visit(startLink);
}
This is the same nonsense from Q1, combining async/await + generators as if it did what I want. But if I did have it, I could consume like this:
function find10LinksWithFrancis() {
  const results = [];
  for (const link of traverseWikipedia()) {
    if (link.indexOf('Francis') > -1) {
      results.push(link);
    }
    if (results.length === 10) {
      break;
    }
  }
  return results;
}
But I could also search for different number of results and strings, have asynchronous stopping conditions, etc. For example, it could show 10 results, and when the user presses a button, keeps crawling and shows the next 10.
I'm okay with just using promises/callbacks, without async/await, as long as I still get the same API for the consumer.
The caller of the generator can do asynchronous things. So I think it's possible to implement this, where the generator function yields the promise of neighbors, and it's up to the caller to call gen.next with the resulting actual neighbors. Something like:
function* traverseWikipedia(startLink) {
  const visited = {};
  function* visit(link) {
    if (visited[link]) { return; }
    visited[link] = true;
    yield { value: link };
    const neighboringLinks = yield { fetch: getNeighboringWikipediaLinks(link) };
    for (let i = 0; i < neighboringLinks.length; i++) {
      yield* visit(neighboringLinks[i]);
    }
  }
  yield* visit(startLink);
}
But that's not very nice to use... you have to distinguish an actual result from the generator asking you to give it back an answer, it's less type-safe, etc. I just want something that looks like Generator&lt;Promise&lt;T&gt;&gt; and a relatively straightforward way to use it. Is this possible?
Edited to add: it occurred to me you could implement this if the body of the generator function could access the Generator it returned somehow... not sure how to do that though
Edited again to add example of consumer, and note: I would be okay with the consumer being a generator also, if that helps. I'm thinking it might be possible with some kind of engine layer underneath, reminiscent of redux-saga. Not sure if anyone has written such a thing
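The "engine layer underneath" could look something like the following driver, which interprets the { value } / { fetch } protocol sketched in the question. This is only a sketch; the fake generator in the usage below stands in for the real crawler, and getNeighboringWikipediaLinks is assumed from the question:

```javascript
// Drives a generator that yields either { value: result } or
// { fetch: promise }. Fetched results are fed back into the generator
// through next(), so the generator body can stay synchronous-looking.
async function collect(gen, limit, predicate) {
  const results = [];
  let input; // value to feed back into the generator
  while (results.length < limit) {
    const step = gen.next(input);
    if (step.done) break;
    input = undefined;
    if ("fetch" in step.value) {
      input = await step.value.fetch; // resolve the promise the generator asked for
    } else if (predicate(step.value.value)) {
      results.push(step.value.value);
    }
  }
  return results;
}
```

A consumer can then stop lazily, e.g. `collect(traverseWikipedia(start), 10, (l) => l.includes('Francis'))` would return the first 10 matching links without crawling further.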

Animating a recursive backtracking algorithm in javascript

I'm trying to create a live demo of a backtracking algorithm (with simple forward checking) in javascript. I've gotten the algorithm down pat in its recursive form, but now I'm stuck trying to animate it using javascript's setTimeout or setInterval, which I'm assuming would require me to convert the recursive solution to an iterative one. Here's the function (rewritten to be a little more general):
function solve(model) {
  if (model.isSolved()) return true;
  var chosen = chooseVariable(model); // could be random or least constrained
  var domain = model.getDomain(chosen);
  var i, assn;
  for (i = 0; i < domain.length; i++) {
    assn = domain[i];
    model.set(chosen, assn);
    if (solve(model)) return true;
    else model.undo();
  }
  return false;
}
As you can see, I've made it so that the model can undo its own actions, rather than having a separate action stack or cloning the model at each level of recursion. Is there a way to convert the function above into one that could be used with setTimeout or setInterval? Would I have to significantly change the model or add another stack to keep track of the chosen variable and attempted assignments? Do I need a closure with mutating variables? I'm mostly looking for pseudocode to point me in the right direction.
I'm assuming this would require me to convert the recursive solution to an iterative one.
No, quite the other way round: your solution is still iterative in some parts (the for loop).
You will have to make the steps asynchronous, so that each step takes a callback which is fired when its animation is done and you can continue. Since you will want to animate every single iteration step, you will have to make them asynchronous with a recursive-like callback, i.e. continuation-passing style.
Here's how:
function solve(model, callback) {
  if (model.isSolved())
    return callback(true);
  var chosen = chooseVariable(model); // could be random or least constrained
  var domain = model.getDomain(chosen);
  var i = 0, assn;
  (function nextStep() {
    if (i < domain.length) {
      assn = domain[i];
      model.set(chosen, assn);
      solve(model, function(solved) {
        if (solved)
          callback(true);
        else {
          model.undo();
          i++;
          nextStep();
        }
      });
    } else
      callback(false);
  })();
}
Now you can simply make this recursive variant asynchronous by introducing setTimeout where you need it (usually after displaying the model state):
function solve(model, callback) {
  if (model.isSolved())
    return callback(true);
  var chosen = chooseVariable(model); // could be random or least constrained
  var domain = model.getDomain(chosen);
  var i = 0, assn;
  (function nextStep() {
    if (i < domain.length) {
      assn = domain[i];
      model.set(chosen, assn);
      solve(model, function(solved) {
        if (solved)
          callback(true);
        else {
          model.undo();
          i++;
          setTimeout(nextStep, 100);
        }
      });
    } else
      setTimeout(callback, 100, false);
  })();
}
You could program it asynchronously using, for example, deferreds. jQuery provides an implementation of deferreds, and you could have a look at this example, which uses timeouts:
http://api.jquery.com/deferred.promise/#example-0
Of course you need only one timeout which always resolves (succeeds).
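The same continuation-passing idea can also be sketched with plain promises and async/await, which flattens the explicit callbacks. The model API (isSolved, getDomain, set, undo) and chooseVariable are assumptions taken from the question's code:

```javascript
// Promise that resolves after ms milliseconds: the "sleep" between steps.
function delay(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Async variant of the backtracking solver: same control flow as the
// callback version, but each step pauses so the browser can repaint.
async function solve(model) {
  if (model.isSolved()) return true;
  const chosen = chooseVariable(model);
  for (const assn of model.getDomain(chosen)) {
    model.set(chosen, assn);
    await delay(100); // animation step: display the model state here
    if (await solve(model)) return true;
    model.undo();
  }
  return false;
}
```

Calling `solve(model).then(...)` then resolves with true or false once the animated search finishes.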

Are generators really intrusive

As I understand the current spec for JavaScript generators, you have to mark functions containing yields explicitly.
I wonder what the rationale behind this is.
If this is true, it would force people to write:
let thirdfunc = function*() {
  let value = 5;
  let other = yield 6;
  return value;
};
let secondfunc = function*() {
  yield* thirdfunc();
};
let firstfunc = function*() {
  yield* secondfunc();
};
let gen = function*() {
  // some code
  // more code
  yield* firstfunc();
  // and code
};
let it = gen();
let res = it.next();
while (!res.done) {
  res = it.next();
}
Which means generators would spread like cancer through a codebase, while in the end, only the yielding and the handling of the iterator are really of interest to the developer.
I would find it much more practical to just define where I want to handle the iteration:
let thirdfunc = function() {
  let value = 5;
  let other = yield 6; // Change 1: incorporate yield
  return value;
};
let secondfunc = function() {
  thirdfunc();
};
let firstfunc = function() {
  secondfunc();
};
let gen = function*() { // Change 2: at this level I want to deal with descendant yields
  // some code
  // more code
  firstfunc();
  // and code
};
// Change 3: Handle iterator
let it = gen();
let res = it.next();
while (!res.done) {
  res = it.next();
}
If the browser then has to turn everything between the yield call and the generator handler (firstfunc, secondfunc, thirdfunc) into promise / future form, that should work automagically and not be the business of Javascript developers.
Or are there really good arguments for not doing this?
I described the rationale for this aspect of the design at http://calculist.org/blog/2011/12/14/why-coroutines-wont-work-on-the-web/ -- in short, full coroutines (which is what you describe) interfere with the spirit of JavaScript's run-to-completion model, making it harder to predict when your code can be "pre-empted" in a similar sense to multithreaded languages like Java, C#, and C++. The blog post goes into more detail and some other reasons as well.

How to refactor the javascript code when sleep like functionality needed?

I see there are lots of threads here on SO asking for a JavaScript sleep function, and I know it can be done only using setTimeout and setInterval.
I do some userscripting with Greasemonkey and have written a script that loads a lot of pages and calculates something from them. It works, but I don't want to request the pages too fast.
var html0 = syncGet(url0); // custom function for a synchronous ajax call
// fill the something array
for (var i = 0; i < something.length; i++) {
  // calculate url1, url2 using the array and the i variable
  // do something with lots of local variables
  var html1 = syncGet(url1);
  // I would put a sleep here.
  // do something with the results
  var html2 = syncGet(url2);
  // I would put a sleep here.
  // do something with the results
  // get url3 from the page loaded from url2
  var html3 = syncGet(url3);
  // I would put a sleep here.
  // do something with the results
}
// use the result of the for loop and lots of code will follow...
// use the result of the for loop and lots of code will follow...
The actual code is a bit more complex and longer than this.
I'm crying for the nonexistent sleep function (and understand why it is not possible). How can I refactor this to use the setTimeout and setInterval functions and keep it readable (and working) too?
For example this:
var urls = ["your", "u", "r", "l's"];
var htmls = new Array(urls.length);
var time = 1000;
for (var i = 0; i < urls.length; i++) {
  (function(i) {
    setTimeout(function() {
      htmls[i] = syncGet(urls[i]);
      if (i == urls.length - 1) {
        // continue here
      }
    }, time * i);
  })(i);
}
I had a similar problem where a big loop was blocking the whole browser in some older browsers. I solved it using:
function handlenext(idx, length) {
  idx++;
  // do your stuff here based on idx
  if (idx < length) {
    setTimeout(function() { handlenext(idx, length); }, 1);
  } else {
    initSuccessEnd();
  }
}
var ln = something.length;
if (ln > 0) {
  handlenext(0, ln);
} else {
  initSuccessEnd();
}
Here initSuccessEnd is a callback function that is called when everything has finished.
After some research, I think Mozilla's new iterator-generator stuff could be the most appropriate. (It has been supported since FF2.)
function doSomething() {
  //.....
  var html1 = syncGet(url1);
  yield true;
  var html2 = syncGet(url2);
  yield true;
  var html3 = syncGet(url3);
  yield true;
  //......
  yield false;
}

function iteratorRunner(iterator, timeout) {
  if (iterator.next()) {
    setTimeout(function() { iteratorRunner(iterator, timeout); }, timeout);
  } else {
    iterator.close();
  }
}

var iterator = doSomething();   // returns an iterator immediately
iteratorRunner(iterator, 1000); // runs the iterator and sleeps 1 second on every yield
I hope greasemonkey will handle that...
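With standard ES6 generators (rather than the legacy Mozilla iterator API above), next() returns a { value, done } object and there is no close() method, so the runner changes slightly. A sketch, with the syncGet calls replaced by a log so it is self-contained:

```javascript
// Each yield marks a point where the loop should pause; the log stands
// in for the syncGet(url) work done between the pauses.
function* doSomething(log) {
  log.push("url1"); // var html1 = syncGet(url1);
  yield;
  log.push("url2"); // var html2 = syncGet(url2);
  yield;
  log.push("url3"); // var html3 = syncGet(url3);
}

// Advances the generator one step per timeout tick until it is done.
function iteratorRunner(iterator, timeout) {
  if (!iterator.next().done) {
    setTimeout(function() { iteratorRunner(iterator, timeout); }, timeout);
  }
}
```

Usage: `iteratorRunner(doSomething([]), 1000);` runs the body, sleeping one second at every yield.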
