How does app.use work in Koa?
When I set some generator inside app.use, everything works perfectly.
How can I do the same elsewhere?
When I just execute the generator manually:
var getRelationsList = function *() {
var res = yield db.relations.find({});
console.log({'inside: ': res});
}
console.log({'outside: ': getRelationsList().next()});
getRelationsList().next();
I'm getting just { 'outside: ': { value: [Function], done: false } }
What I expect is:
{ 'outside: ': { value: {object_with_results}, done: false } }
{ 'inside: ': {object_with_results} }
EDIT
I changed my code like this:
var getRelationsList = function *() {
var res = yield db.relations.find({});
console.log({'inside: ': res});
}
console.log({'outside ': co(getRelationsList)});
Now the inside console.log shows me the right results, but the outside console.log shows me just an empty object.
Generators are a powerful tool for organizing asynchronous code, but they don't magically wait for asynchronous code to run.
What's Happening
Let's walk through your code so you can see what is happening:
getRelationsList is a generator function that, when called, returns a new generator. At this point, no code in your generator function has run (although if you were passing params they would be set). You then call .next on your generator to start execution of the generator function. It will execute up until it hits the first yield statement and return an object with the yielded value and the completion status of the generator.
It seems you understand most of that so far, but generators do not magically transform the yielded out values. When you yield out db.relations.find({}), you'll get the return value of the find function which I'm assuming is a Promise or some type of thenable:
so your 'outside' value is { value:Promise, done:false }
The reason your inside console.log never ran is that you're actually creating a new generator each time you call getRelationsList(), so when you call getRelationsList().next() again after the outside console.log you're creating a new generator and calling next, so it only executes up to the first yield, just like the call on the previous line.
In order to finish execution you must call next twice on the same instance of your generator: once to execute up to the yield and once to continue execution to the end of the function.
var gen = getRelationsList()
gen.next() // { value:Promise, done:false }
gen.next() // { value:undefined, done:true } (will also console.log inside)
You'll notice, however, if you run this, the inside console.log will be undefined. That's because the value of a yield statement is equal to the value passed to the following .next() call.
For example:
var gen2 = getRelationsList()
gen2.next() // { value:Promise, done:false }
gen2.next(100) // { value:undefined, done:true }
Outputs
{ inside:100 }
Because we passed 100 to the second .next() call and that became the value of the yield db.relations.find({}) statement which was then assigned to res.
Here's a link demoing all of this: http://jsfiddle.net/qj1aszub/2/
The Solution
The creators of koa use a little library called co which basically takes yielded out promises and waits for them to complete before passing the resolved value back into the generator function (using the .next() function) so that you can write your asynchronous code in a synchronous style.
co will return a promise, on which you must call .then to get the value returned from the generator function.
var co = require('co');
var getRelationsList = function *() {
var res = yield db.relations.find({});
console.log({'inside: ': res});
return res
}
co(getRelationsList).then(function(res) {
console.log({'outside: ': res })
}).catch(function(err){
console.log('something went wrong')
});
co also allows you to yield out other generator functions and wait for their completion, so you don't have to wrap things with co at every level and deal with promises until you're at some sort of 'top level':
co(function *() {
var list = yield getRelationsList()
, processed = yield processRelations(list)
, response = yield request.post('/some/api', { data:processed })
return response.data
}).then(function(data) {
console.log('got:', data)
})
Your problem is that you call the getRelationsList() function multiple times, which is incorrect.
Change your code to the following:
var g = getRelationsList();
console.log('outside: ', g.next());
g.next(); // You get your console.log('inside: ...') here
Generators must be acted upon by outside code.
Under the hood koa uses the co library to 'run' the generator.
Here is how you might achieve what you're wanting outside of koa:
var co = require('co');
var getRelationsList = function *() {
var res = yield db.relations.find({});
console.log({'inside: ': res});
}
co(getRelationsList).catch(function(err){});
I did a short screencast on JavaScript generators that should help you understand what's going on:
http://knowthen.com/episode-2-understanding-javascript-generators/
++ EDIT
If you're using generators to program in more of a synchronous style (eliminating callbacks), then all your work needs to be done in the generator and you should use a library like co to execute the generator.
Here is a more detailed example of how you would interact with a generator manually. This should help you understand the results you're getting.
function * myGenerator () {
var a = yield 'some value';
return a;
}
var iterator = myGenerator();
// above line just instantiates the generator
console.log(iterator);
// empty object returned
// {}
var res1 = iterator.next();
// calling next() start the generator to either the
// first yield statement or to return.
console.log(res1);
// res1 is an object with 2 attributes
// { value: 'some value', done: false }
// value is whatever value was to the right of the first
// yield statement
// done is an indication that the generator hasn't run
// to completion... ie there is more to do
var toReturn = 'Yield returned: ' + res1.value;
var res2 = iterator.next(toReturn);
// calling next(toReturn) passes the value of
// the variable toReturn as the return of the yield
// so it's returned to the variable a in the generator
console.log(res2);
// res2 is an object with 2 attributes
// { value: 'Yield returned: some value', done: true }
// no further yield statements so the 'value' is whatever
// is returned by the generator.
// since the generator was run to completion
// done is returned as true
Related
I have a scenario where I have an array of URLs that needs to be evaluated sequentially using the request npm module. In detail, the array will be iterated with forEach, and it should fetch the data for the current URL and only then move on to the next URL. I am using yield generators, but it's not working. Please help, guys!
var Promise = require("bluebird");
var request = Promise.promisify(require("request"), {multiArgs: true});
Promise.promisifyAll(request, {multiArgs: true})
var url_arr = ["https://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&key=AIzaSyDe3MyoHI6aSbYKdHOXloz9QepAMfes9XE", "https://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&key=AIzaSyDe3MyoHI6aSbYKdHOXloz9QepAMfes9XE"];
function* fetch(obj) {
console.log("2")
var myobj = yield request.getAsync(obj).promise();
console.log("3", myobj)
return myobj;
}
url_arr.forEach(function (obj){
console.log("1", obj)
var output = fetch(obj);
console.log("4, ", output);
});
In the above snippet, only 1 and 4 are printed in the console. 2 and 3 are never evaluated, I think.
Remove the return statement from the generator function (yield suspends the generator) and call .next() to get the value of the yielded object:
function* _fetch(obj) {
var myobj = yield obj;
}
[1,2,3].forEach(function (obj) {
var {value:output} = _fetch(obj).next();
console.log("output:", output);
});
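If the goal is to actually fetch each URL only after the previous one has finished, one option (a sketch only, reusing the bluebird-promisified request from the question and the co runner mentioned in the other answers; with {multiArgs: true}, each request resolves to [response, body]) is to do the whole loop inside a single generator and let co drive it:
var co = require("co");
co(function* () {
    for (var i = 0; i < url_arr.length; i++) {
        // waits for this request to finish before moving on to the next URL
        var res = yield request.getAsync(url_arr[i]);
        console.log("fetched", url_arr[i], "status:", res[0].statusCode);
    }
});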
I'm using MySQL (mysql-co) and ASQ(asynquence) in a simple project to get a better understanding of ES6 generators and yield functions, and I'm stumped on an odd behavior.
Short explanation of asynquence
asynquence (https://github.com/getify/asynquence) provides an easy way for me to run generators in sequence. It can also do pseudo-parallel execution but that's not what I need for now. The structure of function *x(token) is from there. token holds a connection object at [0]. yield token passes control on to the next generator function in sequence.
Code Sample 1 (works)
function *test1(token) {
var conn = token.messages[0];
var values = {id:1, dev:1, description:'This is it!'};
yield conn.query("INSERT INTO version SET ?", values);
yield token;
}
This works fine. The row described above gets inserted. I didn't know the MySQL driver allowed such a simple looking insert function but it does.
Code Sample 2 (doesn't work)
function *test1(token) {
var conn = token.messages[0];
var values = {id:1, dev:1, description:'This is it!'};
yield subtest1(conn, values);
yield token;
}
function *subtest1(conn, values) {
yield conn.query("INSERT INTO version SET ?", values);
}
This doesn't work. The actual code in question for subtest1 is in a model class, so I would prefer not to have it merged in with the controller.
I've tried a bunch of different things, with and without yield on the subtest function.
What's going on?
subtest1(conn, values) is a generator. yielding a generator object does not execute its body. That is, the yielded generator remains suspended, and it would require a call to the next() method for the first yield to be reached. There are no explicit or implicit calls to next() in Code Sample 2, and this is the reason conn.query(...) isn't executed.
How about yield* subtest1(conn, values)? From the linked page:
The yield* expression iterates over the operand and yields each value
returned by it.
It will still execute subtest lazily.
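Applied to the question's code, the delegating version is a small change in test1 (a sketch; subtest1 stays a generator exactly as written above):
function *test1(token) {
    var conn = token.messages[0];
    var values = {id:1, dev:1, description:'This is it!'};
    yield* subtest1(conn, values); // delegate: subtest1's yields become test1's yields
    yield token;
}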
An alternative solution is to turn subtest into a regular function and return the result of conn.query(...) (assuming you only need to perform one query):
function subtest1(conn, values) {
return conn.query("INSERT INTO version SET ?", values);
}
yield subtest1(conn, values) calls subtest1(conn, values), which returns an iterator object, and then yields that from test1 as the value for that iteration of test1. It doesn't iterate the values returned by the subtest1 iterator.
You could have test1 iterate the values from subtest1's iterator by adding a * after the yield:
yield* subtest1(conn, values);
but looking at your code, I don't think you want to. If you want to split the conn.query line out into a function, it would look like this:
function *test1(token) {
var conn = token.messages[0];
var values = {id:1, dev:1, description:'This is it!'};
yield subtest1(conn, values);
yield token;
}
function subtest1(conn, values) {
return conn.query("INSERT INTO version SET ?", values);
}
This simpler version may help with the difference between yield and yield*:
function* foo() {
console.log("f");
yield bar();
console.log("f done");
}
function* bar() {
console.log("b1");
let i = 0;
while (i < 3) {
console.log("b2");
yield i;
++i;
}
console.log("b done");
}
for (let v of foo()) {
console.log(v);
}
The output of that is: (live copy on Babel's REPL)
f
{}
f done
Note how we don't see any of the "b" logging at all; that's because the {} you see after f is the iterator object returned by calling bar.
If we just add * on the yield bar(); line, turning it into yield* bar(), the output changes: (live copy)
f
b1
b2
0
b2
1
b2
2
b done
f done
If the operand to yield* is an iterator, yield* iterates it, yielding each of its values. It's basically:
for (value of bar()) {
yield value;
}
I am trying to learn how to use generators and yield, so I tried the following but it doesn't seem to be working.
I am using the following function, which contains 2 async calls:
var client = require('mongodb').MongoClient;
$db = function*(collection, obj){
var documents;
yield client.connect('mongodb://localhost/test', function*(err, db){
var c = db.collection(collection);
yield c.find(obj).toArray(function(err, docs){
documents = docs;
db.close();
});
});
return documents.length;
};
Then to make the original call, I am doing this:
var qs = require("querystring");
var query = qs.parse("keywords[]=abc&keywords[]=123");
var total = $db("ads", {"details.keywords": {$in: query["keywords[]"]}});
console.log(total);
When I get my output back in the console, I get this:
{}
I was expecting a number such as 200. What is it that I am doing wrong?
TL;DR
For the short answer, you're looking for a helper like co.
var co = require("co");
co(myGen( )).then(function (result) { });
But Why?
There is nothing inherently asynchronous about ES6 iterators, or the generators which define them.
function * allIntegers ( ) {
var i = 1;
while (true) {
yield i;
i += 1;
}
}
var ints = allIntegers();
ints.next().value; // 1
ints.next().value; // 2
ints.next().value; // 3
The .next( ) method, though, actually lets you send data back in to the iterator.
function * exampleGen ( ) {
var a = yield undefined;
var b = yield a + 1;
return b;
}
var exampleIter = exampleGen();
exampleIter.next().value; // undefined
exampleIter.next(12).value; // 13 (I passed 12 back in, which is assigned to a)
exampleIter.next("Hi").value; // "Hi" is assigned to b, and then returned
It might be confusing to think about, but when you yield, it's like a return statement; the left-hand side hasn't been assigned the value yet... ...and more importantly, if you had written var y = (yield x) + 1;, the parentheses are resolved before the rest of the expression... ...so you return, and the + 1 is put on hold until a value comes back.
Then when it arrives (passed in, via the .next( )), the rest of the expression is evaluated (and then assigned to the left hand side).
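Here's a tiny sketch of that last point:
function * plusOne ( ) {
  var y = (yield 10) + 1; // pauses inside the parentheses; the + 1 waits for .next()
  return y;
}
var it = plusOne();
it.next();  // { value: 10, done: false }
it.next(5); // { value: 6, done: true } -- 5 replaces (yield 10), then the + 1 runs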
The object that's returned from each call has two properties { value: ..., done: false }
value is what you've returned/yielded and done is whether or not it's hit the actual return statement at the end of the function (including implicit returns).
This is the part that can then be used to make this async magic happen.
function * asyncGen ( id ) {
var key = yield getKeyPromise( id );
var values = yield getValuesPromise( key );
return values;
}
var asyncProcess = asyncGen( 123 );
var getKey = asyncProcess.next( ).value;
getKey.then(function (key) {
return asyncProcess.next( key ).value;
}).then(function (values) {
doStuff(values);
});
There's no magic.
Instead of returning a value, I'm returning a promise.
When the promise completes, I'm pushing the result back in, using .next( result ), which gets me another promise.
When that promise resolves, I push that back in, using .next( newResult ), et cetera, until I'm done.
Can we do better?
We know now that we're just waiting for promises to resolve, then calling .next on the iterator with the result.
Do we have to know, ahead of time what the iterator looks like, to know when we're done?
Not really.
function coroutine (iterator) {
return new Promise(function (resolve, reject) {
function turnIterator (value) {
var result = iterator.next( value );
if (result.done) {
resolve(result.value);
} else {
result.value.then(turnIterator);
}
}
turnIterator();
});
}
coroutine( myGen() ).then(function (result) { });
This isn't complete and perfect. co covers extra bases (making sure all yields get treated like promises, so you don't blow up by passing a non-promise value... ...or allowing arrays of promises to be yielded, which become one promise that resolves to the array of results for that yield... ...or try/catch around the promise handling, to throw the error back into the iterator... yes, try/catch works perfectly with yield statements done this way, thanks to a .throw(err) method on the iterator).
These things aren't hard to implement, but they make the example muddier than it needs to be.
This is exactly why co or some other "coroutine" or "spawn" method is perfect for this stuff.
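For example, error handling inside a co-run generator reads like synchronous code (a sketch, reusing the hypothetical getKeyPromise / getValuesPromise / doStuff names from above):
var id = 123;
co(function * ( ) {
  try {
    var key = yield getKeyPromise( id );      // a rejected promise is thrown back in here, via .throw(err)
    var values = yield getValuesPromise( key );
    doStuff( values );
  } catch (err) {
    console.log( "lookup failed:", err );
  }
});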
The guys behind the Express server built KoaJS, using Co as a library, and Koa's middleware system just takes generators in its .use method and does the right thing.
But Wait, there's more!
As of ES7, it's very likely that the spec will add language for this exact use-case.
async function doAsyncProcess (id) {
var key = await getKeyPromise(id);
var values = await getValuesPromise(key);
return values;
}
doAsyncProcess(123).then(values => doStuff(values));
The async and await keywords are used together, to achieve the same functionality as the coroutine-wrapped promise-yielding generator, without all of the external boilerplate (and with engine-level optimizations, eventually).
You can try this today, if you're using a transpiler like BabelJS.
I hope this helps.
Yield and generators have nothing to do with asynchrony, their primary purpose is to produce iterable sequences of values, just like this:
function * gen() {
var i = 0;
while (i < 10) {
yield i++;
}
}
for (var i of gen()) {
console.log(i);
}
Just calling a function with a star (a generator function) merely creates a generator object (that is why you see {} in the console), which can be interacted with using the next function.
That said, you can use generator functions as an analogue of asynchronous functions, but you need a special runner, like co.
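For example, a minimal sketch of such a runner in use (assuming a hypothetical findDocuments helper that returns a promise, instead of the callback-based mongodb calls in the question):
var co = require('co');
co(function* () {
  // co sees a yielded promise, waits for it, and resumes the generator with the result
  var docs = yield findDocuments({});
  console.log(docs.length);
});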
var client = require('mongodb').MongoClient;
$db = function*(collection, obj){
var documents;
yield client.connect('mongodb://localhost/test', function*(err, db){
var c = db.collection(collection);
yield c.find(obj).toArray(function(err, docs){
documents = docs;
db.close();
});
});
return documents.length;
};
var qs = require("querystring");
var query = qs.parse("keywords[]=abc&keywords[]=123");
var total = $db("ads", {"details.keywords": {$in: query["keywords[]"]}});
console.log(total);
As is, total is the iterator for the $db generator function. You would retrieve its yield values via total.next().value. However, the mongodb library is callback based and as such, its functions do not return values, so yield will return null.
You mentioned you were using Promises elsewhere; I would suggest taking a look at bluebird, in particular its promisify functionality. Promisification inverts the callback model so that the arguments to the callback are now used to resolve the promisified function. Even better, promisifyAll will convert an entire callback based API.
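A small illustration of promisify (a sketch using fs.readFile, whose callback takes (err, data)):
var Promise = require('bluebird');
var fs = require('fs');
// callback style: fs.readFile(path, function (err, data) { ... })
var readFileAsync = Promise.promisify(fs.readFile);
// promise style: err now rejects the promise, data resolves it
readFileAsync('config.json').then(function (data) {
  console.log(data.toString());
});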
Finally, bluebird provides coroutine functionality as well; however its coroutines must return promises. So, your code may be rewritten as follows:
var mongo = require('mongodb');
var Promise = require('bluebird');
//here we convert the mongodb callback-based API to a promise-based API
Promise.promisifyAll(mongo);
$db = Promise.coroutine(function*(collection, obj){
//existing functions are converted to promise-based versions which have
//the same name with 'Async' appended to them
return yield mongo.MongoClient.connectAsync('mongodb://localhost/test')
.then(function(db){
return db.collectionAsync(collection);})
.then(function(collection) {
return collection.countAsync();});
});
var qs = require("querystring");
var query = qs.parse("keywords[]=abc&keywords[]=123");
$db('ads',{"details.keywords": {$in: query["keywords[]"]}})
.then(console.log)
I am trying to use jQuery's AJAX deferreds to return a JSON string that can be parsed and used but I have coded myself into a corner and may have screwed up the logic. I expected the results of the AJAX call to be returned in the .done() callback and they are. I thought once done that I could return the result for use in the remainder of the function. I know that I'm missing something really obvious and simple, I just cannot put a finger on what it is.
Here is the initial coding of the function, still very much in test mode. The JSON is correctly returned in the .done() function but I cannot assign it outside of the function for use.
checkUserRoles = function(){
var userRole, foo, roles, i$, len$, i;
userRole = roleChecker();
foo = userRole.done(function(data){
var bar;
bar = foo.responseText; // correctly returns the JSON data
console.log(bar);
return bar; // is undefined, this is the problem
});
if (userRole !== false) {
userRole = jsonDecode(userRole);
} else {
userRole = "";
}
roles = userRole.split(',');
$("if-user-role").hide();
for (i$ = 0, len$ = roles.length; i$ < len$; ++i$) {
i = roles[i$];
$("if-user-role[data-role~=" + i + "]").show();
}
return true;
};
this.roleChecker = function(){
var retVal, allCookies, i$, len$, thecookie, hashedCookie, theHash, userHash, post;
retVal = "";
allCookies = document.cookie.split(';');
for (i$ = 0, len$ = allCookies.length; i$ < len$; ++i$) {
thecookie = allCookies[i$];
if (thecookie.indexOf('userHash') >= 0) {
hashedCookie = thecookie;
theHash = hashedCookie.split('=');
userHash = theHash[1];
}
}
post = $.ajax({
url: '/services/check_hash.php',
data: {
hash: userHash
},
type: "POST"
});
return post;
};
The code that you see here is the output from compiling LiveScript, which we use extensively. I don't think the LiveScript is having an effect on the final result; I just had to do a lot to get what I expected would be the proper JavaScript / jQuery output.
NOTE: because this is more or less the first pass at the code, foo doesn't get passed along to the subsequent if statement, as userRole was originally hard-coded for the initial testing prior to trying to make the function more dynamic.
How do I return foo.responseText or bar for use in the subsequent procedure? Do I need to put the procedure, beginning with the if conditional in the .done() function?
You're looking for .then and not .done.
What .done does is perform an action and return the same promise. .then, on the other hand, returns a new promise which is resolved with the return value of the callback provided to it (assuming the $.ajax call resolved correctly).
You of course then need to place everything you subsequently do in the chain:
userRole.then(function(data){
var bar;
bar = foo.responseText; // correctly returns the JSON data
console.log(bar);
return bar;
}).then(function(role){
if (role !== false) {
role = jsonDecode(role);
} else {
role = "";
}
//...
return true;
});
You should also return that promise so you can hook onto it later.
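In other words (a sketch), checkUserRoles returns the chain and callers attach to it:
checkUserRoles = function () {
  var userRole = roleChecker();
  return userRole.then(function (data) {
    // ...decode the role data and show/hide elements here...
    return true;
  });
};
checkUserRoles().then(function (ok) {
  // runs only after the AJAX call and the role processing have finished
});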
It looks like you are using deferred objects synchronously, which (as you mentioned in your title) is not the intended purpose. The code that you process the user data with after .done() will execute immediately after registering that handler, so your data won't be ready yet.
When you register a .then() on a deferred promise, you're telling your program to run a piece of code after the deferred object has either resolved or rejected. The program will not wait until that deferred object has resolved or rejected, it will continue processing code (which is the beauty of the deferred object system!)
Example:
var checkUserRoles = function () {
var userRole = roleChecker();
// go off and find the user data
// but *don't wait for it before continuing code execution*
userRole.then(function(data){
// at this point the AJAX request has finished, and we have the data
console.log(data);
// put code to process the user data here
});
// code here is executed immediately after registering the .then handler
// so the user data has not loaded yet
};
var roleChecker = function () {
var defer = $.Deferred();
var post = defer.promise();
// simulate an AJAX Request, returns after 2 seconds
setTimeout(function () {
defer.resolve('user data here!');
}, 2000);
return post;
};
checkUserRoles();
I've read over several examples of code using JavaScript generators such as this one. The simplest generator-using block I can come up with is something like:
function read(path) {
return function (done) {
fs.readFile(path, "file", done);
}
}
co(function *() {
console.log( yield read("file") );
})();
This does indeed print out the contents of file, but my hangup is where done is called. Seemingly, yield is syntactic sugar for wrapping what it returns to in a callback and assigning the result value appropriately (and at least in the case of co, throwing the error argument to the callback). Is my understanding of the syntax correct?
What does done look like when yield is used?
Seemingly, yield is syntactic sugar for wrapping what it returns to in a callback and assigning the result value appropriately (and at least in the case of co, throwing the error argument to the callback)
No, yield is not syntactic sugar. It's the core syntax element of generators. When the generator is instantiated, you can run it (by calling .next() on it), and that will return the value that was returned or yielded. When the generator has yielded, you can continue it later by calling .next() again. The argument to next will be the value that the yield expression returns inside the generator.
Only in case of co, those async callback things (and other things) are handled "appropriately" for what you would consider natural in an async control flow library.
What does done look like when yield is used?
The thread function example from the article that you read gives you a good impression of this:
function thread(fn) {
var gen = fn();
function next(err, res) {
var ret = gen.next(res);
if (ret.done) return;
ret.value(next);
}
next();
}
In your code, yield does yield the value of the expression read("file") from the generator when it is run. This becomes ret.value, the result of gen.next(). To this, the next function is passed - a callback that will continue the generator with the result that was passed to it. In your generator code, it looks as if the yield expression returned this value.
An "unrolled" version of what happens could be written like this:
function* fn() {
console.log( yield function (done) {
fs.readFile("filepath", "file", done);
} );
}
var gen = fn();
var ret1 = gen.next();
var callasync = ret1.value;
callasync(function next(err, res) {
var ret2 = gen.next(res); // this now does log the value
ret2.done; // true now
});
I posted a detailed explanation of how generators work here.
In a simplified form, your code might look like this without co (untested):
function workAsync(fileName)
{
// async logic
var worker = (function* () {
function read(path) {
return function (done) {
fs.readFile(path, "file", done);
}
}
console.log(yield read(fileName));
})();
// driver
function nextStep(err, result) {
try {
var item = err?
worker.throw(err):
worker.next(result);
if (item.done)
return;
item.value(nextStep);
}
catch(ex) {
console.log(ex.message);
return;
}
}
// first step
nextStep();
}
workAsync("file");
The driver part of workAsync asynchronously iterates through the generator object, by calling nextStep().