request-promise + co API trigger in protractor test - javascript

I want to call module.api.create from my Protractor test. Referring to this solution ("Chain multiple Node http request"), I am using request-promise + co like this:
//api/module1.js
var co = require('co');
var rp = require('request-promise');
exports.create = co(function* def() {
  var response, token;
  urlLogin.body.username = username;
  response = yield rp(urlLogin);
  // extract token and run other APIs
  ...
}).catch(err => console.log(err));
And
//api/api.js
var module1 = require('./module1');

exports.module1 = function () {
  return module1;
};
In my Spec/Test I am adding
api = require('../../api/api');
api.module1.create;
The issue I am facing is that even without calling the api.module1.create; line, the require line api = require('../../api/api'); calls create automatically every time the test is executed.

From the co README:
co#4.0.0 has been released, which now relies on promises. It is a stepping stone towards the async/await proposal. The primary API change is how co() is invoked. Before, co returned a "thunk", which you then called with a callback and optional arguments. Now, co() returns a promise.
I believe you're looking for co.wrap, which returns a function that executes the generator and returns a promise (this function may also be known as a thunk). Calling co directly executes the generator eagerly and returns a promise for its result.
const co = require('co')
co(function* () {
  // this will run
  console.log('hello from plain co!')
})

co.wrap(function* () {
  // this won't run because we never call the returned function
  console.log('hello from wrapped co 1!')
})

const wrappedfn = co.wrap(function* () {
  // this runs because we call the returned function
  console.log('hello from wrapped co 2!')
})
wrappedfn()
You can also wrap the generator yourself, which does the same thing as co.wrap and lets you do more stuff afterwards.
exports.create = function() {
  return co(function* () {
    // this will run only when exports.create is called
    console.log('hello from plain co!')
  })
  // If you want to do stuff after and outside the generator but inside the enclosing function
  .then(...)
}
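Applied to the module from the question, a minimal sketch might look like this (assuming urlLogin and username are set up as in the original code; nothing runs at require time, only when create() is called):

// api/module1.js
var co = require('co');
var rp = require('request-promise');

exports.create = co.wrap(function* () {
  // same login flow as in the question; urlLogin/username come from the existing setup
  urlLogin.body.username = username;
  var response = yield rp(urlLogin);
  // extract token and run other APIs
  return response;
});

// in the spec: the request fires only when create() is actually invoked
var module1 = require('../../api/module1');
module1.create().catch(err => console.log(err));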

Related

mongodb-es6 promise not working

I am playing around with mongodb-es6-master, in which it is written that mongoClient.connect() returns a promise, but I am not able to invoke the then function on that promise.
I have a file named index.js which I am invoking with Node version 5.0.
The following is the code snippet. Am I missing anything?
'use strict'
var co = require('co');
var MongoClient = require('mongodb-es6').MongoClient;

co(function* () {
  let client = yield new MongoClient("mongodb://localhost:27017/test", {}).connect();
  var collectionP = yield client.then((value) => {
    // some code
    // console throws [TypeError] client.then is not a function
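A likely explanation, consistent with how co handles yielded promises (see the Koa answer further down this page): yield already waits for the promise to resolve, so client is the resolved client object rather than a promise, and calling .then on it fails. A minimal sketch of the corrected flow, keeping the same connection string:

co(function* () {
  // yield resolves the promise, so client is already the connected client here
  var client = yield new MongoClient("mongodb://localhost:27017/test", {}).connect();
  // use client directly; there is no further promise to call .then() on
}).catch(err => console.error(err));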

MochaJS setTimeout ES6

While unit testing my Node.js app, I encountered a problem with Mocha and ES6 while using setTimeout.
Mocha said the test passed, but when I put in something else (to check that the test actually works), it still says it passed, while it should fail.
Code:
describe('.checkToken', function () {
  let user = {};
  let token = repository.newToken();

  it('token has expired', co.wrap(function* () {
    setTimeout(function* () {
      let result = yield repository.checkToken(user, token.token);
      result.body.should.have.property("error");
    }, 1000)
  }));
});
The other tests are all working and there is no problem in that case.
I've already tried an arrow function or a standard function in the callback of setTimeout, but it then crashes on the yield. (Unexpected token)
checkToken is a generator function.
Using:
Nodejs v4.2.1
Co v4.6.0
Should v7.1.0
Mocha v2.3.3
You cannot use setTimeout with a generator. It's the generator that you pass to co.wrap that will be run asynchronously, and it needs to know about the timeout. You will need to yield the timeout as something yieldable, such as a thunk or a promise:
it('token has expired', co.wrap(function* () {
  yield new Promise(resolve => { setTimeout(resolve, 1000); });
  let result = yield repository.checkToken(user, token.token);
  result.body.should.have.property("error");
}));
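If several tests need the same pause, a small helper keeps the yields readable. A sketch (the delay name is our own, not part of the original answer):

// resolves after ms milliseconds, so it can be yielded inside co.wrap
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

it('token has expired', co.wrap(function* () {
  yield delay(1000);
  let result = yield repository.checkToken(user, token.token);
  result.body.should.have.property("error");
}));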

Generators in KOA

How does app.use work in Koa?
When I put a generator inside app.use, everything works perfectly.
How can I do the same elsewhere?
When I just execute the generator manually:
var getRelationsList = function *() {
  var res = yield db.relations.find({});
  console.log({'inside: ': res});
}

console.log({'outside: ': getRelationsList().next()});
getRelationsList().next();
I'm getting just { 'outside: ': { value: [Function], done: false } }
What I expect is:
{ 'outside: ': { value: {object_with_results}, done: false } }
{ 'inside: ': {object_with_results} }
EDIT
I changed my code like this:
var getRelationsList = function *() {
  var res = yield db.relations.find({});
  console.log({'inside: ': res});
}

console.log({'outside ': co(getRelationsList)});
Now the inside console.log shows good results, but the outside console.log shows just an empty object.
Generators are a powerful tool for organizing asynchronous code, but they don't magically wait for asynchronous code to run.
What's Happening
Let's walk through your code so you can see what is happening:
getRelationsList is a generator function, that when called returns a new generator. At this point, no code in your generator function has been called (although if you were passing params they would be set). You then call .next on your generator to start execution of the generator function. It will execute up until it hits the first yield statement and return an object with the yielded value and the completion status of the generator.
It seems you understand most of that so far, but generators do not magically transform the yielded out values. When you yield out db.relations.find({}), you'll get the return value of the find function which I'm assuming is a Promise or some type of thenable:
so your 'outside' value is { value:Promise, done:false }
The reason your inside console.log never ran is that you're actually creating a new generator each time you call getRelationsList(), so when you call getRelationsList().next() again after the outside console.log you're creating a new generator and calling next, so it only executes up to the first yield, just like the call on the previous line.
In order to finish execution you must call next twice on the same instance of your generator: once to execute up to the yield and once to continue execution to the end of the function.
var gen = getRelationsList()
gen.next() // { value:Promise, done:false }
gen.next() // { value:undefined, done:true } (will also console.log inside)
You'll notice, however, if you run this, the inside console.log will be undefined. That's because the value of a yield statement is equal to the value passed to the following .next() call.
For example:
var gen2 = getRelationsList()
gen2.next() // { value:Promise, done:false }
gen2.next(100) // { value:undefined, done:true }
Outputs
{ 'inside: ': 100 }
Because we passed 100 to the second .next() call and that became the value of the yield db.relations.find({}) statement which was then assigned to res.
Here's a link demoing all of this: http://jsfiddle.net/qj1aszub/2/
The Solution
The creators of koa use a little library called co which basically takes yielded out promises and waits for them to complete before passing the resolved value back into the generator function (using the .next() function) so that you can write your asynchronous code in a synchronous style.
co returns a promise, so you need to call its .then method to get the value returned from the generator function.
var co = require('co');

var getRelationsList = function *() {
  var res = yield db.relations.find({});
  console.log({'inside: ': res});
  return res
}

co(getRelationsList).then(function(res) {
  console.log({'outside: ': res })
}).catch(function(err){
  console.log('something went wrong')
});
co also allows you to yield out other generator functions and wait for their completion, so you don't have to wrap things with co at every level and deal with promises until you're at some sort of 'top level':
co(function *() {
  var list = yield getRelationsList()
    , processed = yield processRelations(list)
    , response = yield request.post('/some/api', { data:processed })
  return response.data
}).then(function(data) {
  console.log('got:', data)
})
Your problem is that you call the getRelationsList() function multiple times, which is incorrect.
Change your code to the following:
var g = getRelationsList();
console.log('outside: ', g.next());
g.next(); // You get your console.log('inside: ...') here
Generators must be acted upon by outside code.
Under the hood Koa uses the co library to run the generator.
Here is how you might achieve what you want outside of Koa:
var co = require('co');

var getRelationsList = function *() {
  var res = yield db.relations.find({});
  console.log({'inside: ': res});
}

co(getRelationsList).catch(function(err){});
I did a short screencast on JavaScript generators that should help you understand what's going on:
http://knowthen.com/episode-2-understanding-javascript-generators/
++ EDIT
If you're using generators to program in a more synchronous style (eliminating callbacks), then all your work needs to be done in the generator, and you should use a library like co to execute it.
Here is a more detailed example of how you would interact with a generator manually. It should help you understand the results you're getting.
function * myGenerator () {
  var a = yield 'some value';
  return a;
}

var iterator = myGenerator();
// The above line just instantiates the generator.
console.log(iterator);
// An empty object is returned:
// {}

var res1 = iterator.next();
// Calling next() runs the generator up to either the
// first yield statement or a return.
console.log(res1);
// res1 is an object with 2 attributes:
// { value: 'some value', done: false }
// value is whatever value was to the right of the first
// yield statement.
// done indicates that the generator hasn't run
// to completion... i.e. there is more to do.

var toReturn = 'Yield returned: ' + res1.value;
var res2 = iterator.next(toReturn);
// Calling next(toReturn) passes the value of
// the variable toReturn as the result of the yield,
// so it's assigned to the variable a in the generator.
console.log(res2);
// res2 is an object with 2 attributes:
// { value: 'Yield returned: some value', done: true }
// There are no further yield statements, so 'value' is whatever
// is returned by the generator.
// Since the generator ran to completion,
// done is true.

Understanding code flow with yield/generators

I've read over several examples of code using JavaScript generators such as this one. The simplest generator-using block I can come up with is something like:
function read(path) {
  return function (done) {
    fs.readFile(path, "file", done);
  }
}

co(function *() {
  console.log( yield read("file") );
})();
This does indeed print out the contents of file, but my hangup is where done is called. Seemingly, yield is syntactic sugar for wrapping what it returns to in a callback and assigning the result value appropriately (and at least in the case of co, throwing the error argument to the callback). Is my understanding of the syntax correct?
What does done look like when yield is used?
Seemingly, yield is syntactic sugar for wrapping what it returns to in a callback and assigning the result value appropriately (and at least in the case of co, throwing the error argument to the callback)
No, yield is not syntactic sugar. It's the core syntax element of generators. When a generator is instantiated, you can run it (by calling .next() on it), and that will return the value that was returned or yielded. When the generator has yielded, you can continue it later by calling .next() again. The arguments to next will be the value that the yield expression returns inside the generator.
Only in the case of co are those async callback things (and other things) handled "appropriately" for what you would consider natural in an async control flow library.
What does done look like when yield is used?
The thread function example from the article that you read gives you a good impression of this:
function thread(fn) {
  var gen = fn();
  function next(err, res) {
    var ret = gen.next(res);
    if (ret.done) return;
    ret.value(next);
  }
  next();
}
In your code, yield does yield the value of the expression read("file") from the generator when it is run. This becomes ret.value, the result of gen.next(). To this, the next function is passed - a callback that will continue the generator with the result that was passed to it. In your generator code, it looks as if the yield expression returned this value.
An "unrolled" version of what happens could be written like this:
function* fn() {
  console.log( yield function (done) {
    fs.readFile("filepath", "file", done);
  } );
}

var gen = fn();
var ret1 = gen.next();
var callasync = ret1.value;
callasync(function next(err, res) {
  var ret2 = gen.next(res); // this now does log the value
  ret2.done; // true now
});
I posted a detailed explanation of how generators work here.
In a simplified form, your code might look like this without co (untested):
function workAsync(fileName) {
  // async logic
  var worker = (function* () {
    function read(path) {
      return function (done) {
        fs.readFile(path, "file", done);
      }
    }
    console.log(yield read(fileName));
  })();

  // driver
  function nextStep(err, result) {
    try {
      var item = err ?
        worker.throw(err) :
        worker.next(result);
      if (item.done)
        return;
      item.value(nextStep);
    }
    catch (ex) {
      console.log(ex.message);
      return;
    }
  }

  // first step
  nextStep();
}

workAsync("file");
The driver part of workAsync asynchronously iterates through the generator object, by calling nextStep().

How to wrap async function calls into a sync function in Node.js or Javascript?

Suppose you maintain a library that exposes a function getData. Your users call it to get actual data:
var output = getData();
Under the hood, data is saved in a file, so you implemented getData using Node.js' built-in fs.readFileSync. It's obvious both getData and fs.readFileSync are sync functions. One day you were told to switch the underlying data source to a repo such as MongoDB, which can only be accessed asynchronously. You were also told to avoid pissing off your users: the getData API cannot be changed to return merely a promise or demand a callback parameter. How do you meet both requirements?
Asynchronous functions using callbacks/promises are the DNA of JavaScript and Node.js. Any non-trivial JS app is probably permeated with this coding style. But this practice can easily lead to the so-called callback pyramid of doom. Even worse, if any code in any caller in the call chain depends on the result of the async function, that code has to be wrapped in a callback function as well, imposing a coding-style constraint on the caller. From time to time I find the need to encapsulate an async function (often provided by a 3rd-party library) in a sync function in order to avoid massive global re-factoring. Searching for a solution on this subject usually ends up with Node Fibers or npm packages derived from it. But Fibers just cannot solve the problem I am facing. Even the example provided by Fibers' author illustrates the deficiency:
...
Fiber(function() {
  console.log('wait... ' + new Date);
  sleep(1000);
  console.log('ok... ' + new Date);
}).run();
console.log('back in main');
Actual output:
wait... Fri Jan 21 2011 22:42:04 GMT+0900 (JST)
back in main
ok... Fri Jan 21 2011 22:42:05 GMT+0900 (JST)
If function Fiber really turns async function sleep into sync, the output should be:
wait... Fri Jan 21 2011 22:42:04 GMT+0900 (JST)
ok... Fri Jan 21 2011 22:42:05 GMT+0900 (JST)
back in main
I have created another simple example in JSFiddle and am looking for code that yields the expected output. I'll accept a solution that only works in Node.js, so you are free to require any npm package even though it won't work in JSFiddle.
deasync turns an async function into sync, implemented with a blocking mechanism that calls the Node.js event loop at the JavaScript layer. As a result, deasync only blocks subsequent code from running, without blocking the entire thread or incurring a busy wait. With this module, here is the answer to the jsFiddle challenge:
function AnticipatedSyncFunction() {
  var ret;
  setTimeout(function(){
    ret = "hello";
  }, 3000);
  while (ret === undefined) {
    require('deasync').runLoopOnce();
  }
  return ret;
}

var output = AnticipatedSyncFunction();
//expected: output=hello (after waiting for 3 sec)
console.log("output=" + output);
//actual: output=hello (after waiting for 3 sec)
(Disclaimer: I am the co-author of deasync. The module was created after this question was posted and no workable proposal was found.)
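For the getData scenario from the question, the module can also wrap a callback-style function directly. A rough sketch, where fetchDataAsync stands in for whatever async MongoDB call replaces the old file read (the names are our own illustration):

var deasync = require('deasync');

// stand-in for the new async data source (e.g. a MongoDB query taking a node-style callback)
function fetchDataAsync(callback) {
  setTimeout(function () { callback(null, "hello"); }, 3000);
}

// deasync(fn) returns a wrapper that blocks this code path until the callback fires
var fetchData = deasync(fetchDataAsync);

function getData() {
  return fetchData(); // callers keep using getData() synchronously
}

console.log("output=" + getData());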
You've got to use promises:
const asyncOperation = () => {
  return new Promise((resolve, reject) => {
    setTimeout(() => { resolve("hi") }, 3000)
  })
}

const asyncFunction = async () => {
  return await asyncOperation();
}

const topDog = () => {
  asyncFunction().then((res) => {
    console.log(res);
  });
}
I like arrow function definitions more, but anything of the form () => {...} could also be written as function () {...}.
So topDog is not async despite calling an async function.
EDIT: I realize a lot of the times you need to wrap an async function inside a sync function is inside a controller. For those situations, here's a party trick:
const getDemSweetDataz = (req, res) => {
  (async () => {
    try {
      res.status(200).json(
        await asyncOperation()
      );
    }
    catch (e) {
      res.status(500).json(serviceResponse); //or whatever
    }
  })() //So we defined and immediately called this async function.
}
Utilizing this with callbacks, you can do a wrap that doesn't use promises:
const asyncOperation = () => {
  return new Promise((resolve, reject) => {
    setTimeout(() => { resolve("hi") }, 3000)
  })
}

const asyncFunction = async (callback) => {
  let res = await asyncOperation();
  callback(res);
}

const topDog = () => {
  let callback = (res) => {
    console.log(res);
  };

  (async () => {
    await asyncFunction(callback)
  })()
}
By applying this trick to an EventEmitter, you can get the same results. Define the EventEmitter's listener where I've defined the callback, and emit the event where I called the callback.
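A rough sketch of that EventEmitter variant (the 'data-ready' event name and the emitter wiring are our own illustration, not part of the original answer):

const EventEmitter = require('events');

const asyncOperation = () => {
  return new Promise((resolve) => {
    setTimeout(() => { resolve("hi") }, 3000)
  })
}

const topDog = () => {
  const emitter = new EventEmitter();

  // the listener takes the place of the callback from the previous example
  emitter.on('data-ready', (res) => {
    console.log(res);
  });

  // emitting the event takes the place of calling the callback
  (async () => {
    emitter.emit('data-ready', await asyncOperation());
  })()
}

topDog();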
If function Fiber really turns async function sleep into sync
Yes. Inside the fiber, the function waits before logging ok. Fibers do not make async functions synchronous, but they allow you to write synchronous-looking code that uses async functions and will then run asynchronously inside a Fiber.
From time to time I find the need to encapsulate an async function into a sync function in order to avoid massive global re-factoring.
You cannot. It is impossible to make asynchronous code synchronous. You will need to anticipate that in your global code, and write it in async style from the beginning. Whether you wrap the global code in a fiber, use promises, promise generators, or simple callbacks depends on your preferences.
My objective is to minimize impact on the caller when data acquisition method is changed from sync to async
Both promises and fibers can do that.
There is also an npm sync module, which is used to synchronize the process of executing queries.
When you want to run parallel queries in a synchronous way, Node restricts you from doing that because it never waits for the response, and the sync module is well suited to that kind of solution.
Sample code
/* require sync module */
var Sync = require('sync');

app.get('/', function(req, res, next) {
  story.find().exec(function(err, data) {
    var sync_function_data = find_user.sync(null, {name: "sanjeev"});
    res.send({story: data, user: sync_function_data});
  });
});

/***** sync function defined here *******/
function find_user(req_json, callback) {
  process.nextTick(function () {
    users.find(req_json, function (err, data) {
      if (!err) {
        callback(null, data);
      } else {
        callback(null, err);
      }
    });
  });
}
reference link: https://www.npmjs.com/package/sync
Nowadays this generator pattern can be a solution in many situations.
Here is an example of sequential console prompts in Node.js using the async readline.question function:
var main = (function* () {
  // just import and initialize 'readline' in nodejs
  var r = require('readline')
  var rl = r.createInterface({ input: process.stdin, output: process.stdout })

  // magic here, the callback is the iterator.next
  var answerA = yield rl.question('do you want this? ', r => main.next(r))

  // and again, in a sync fashion
  var answerB = yield rl.question('are you sure? ', r => main.next(r))

  // readline boilerplate
  rl.close()
  console.log(answerA, answerB)
})() // <-- executed: iterator created from generator

main.next() // kick off the iterator,
            // runs until the first 'yield', including rightmost code
            // and waits until another main.next() happens
I can't find a scenario that cannot be solved using node-fibers. The example you provided using node-fibers behaves as expected. The key is to run all the relevant code inside a fiber, so you don't have to start a new fiber in random positions.
Let's see an example: say you use some framework which is the entry point of your application (you cannot modify this framework). This framework loads Node.js modules as plugins and calls some methods on the plugins. Let's say this framework only accepts synchronous functions and does not use fibers by itself.
There is a library that you want to use in one of your plugins, but this library is async, and you don't want to modify it either.
The main thread cannot be yielded when no fiber is running, but you still can create plugins using fibers! Just create a wrapper entry that starts the whole framework inside a fiber, so you can yield the execution from the plugins.
Downside: If the framework uses setTimeout or Promises internally, then it will escape the fiber context. This can be worked around by mocking setTimeout, Promise.then, and all event handlers.
So this is how you can yield a fiber until a Promise is resolved. This code takes an async (Promise returning) function and resumes the fiber when the promise is resolved:
framework-entry.js
console.log(require("./my-plugin").run());
async-lib.js
exports.getValueAsync = () => {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve("Async Value");
    }, 100);
  });
};
my-plugin.js
const Fiber = require("fibers");

function fiberWaitFor(promiseOrValue) {
  var fiber = Fiber.current, error, value;
  Promise.resolve(promiseOrValue).then(v => {
    error = false;
    value = v;
    fiber.run();
  }, e => {
    error = true;
    value = e;
    fiber.run();
  });
  Fiber.yield();
  if (error) {
    throw value;
  } else {
    return value;
  }
}

const asyncLib = require("./async-lib");

exports.run = () => {
  return fiberWaitFor(asyncLib.getValueAsync());
};
my-entry.js
require("fibers")(() => {
require("./framework-entry");
}).run();
When you run node framework-entry.js it will throw an error: Error: yield() called with no fiber running. If you run node my-entry.js it works as expected.
You shouldn't be looking at what happens around the call that creates the fiber but rather at what happens inside the fiber. Once you are inside the fiber you can program in sync style. For example:
function f1() {
  console.log('wait... ' + new Date);
  sleep(1000);
  console.log('ok... ' + new Date);
}

function f2() {
  f1();
  f1();
}

Fiber(function() {
  f2();
}).run();
Inside the fiber you call f1, f2 and sleep as if they were sync.
In a typical web application, you will create the Fiber in your HTTP request dispatcher. Once you've done that you can write all your request handling logic in sync style, even if it calls async functions (fs, databases, etc.).
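A rough sketch of that dispatcher pattern (the http server wiring and the handleRequest name are our own illustration; inside the fiber the handler may call fiber-aware helpers such as sleep or the fiberWaitFor shown earlier):

const http = require('http');
const Fiber = require('fibers');

// hypothetical handler written in sync style; inside the fiber it can call
// fiber-aware helpers (sleep, fiberWaitFor, database reads, ...) as if they were sync
function handleRequest(req, res) {
  res.end('handled ' + req.url);
}

http.createServer((req, res) => {
  // one fiber per request: everything called from here can be written in sync style
  Fiber(() => {
    handleRequest(req, res);
  }).run();
}).listen(3000);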
I struggled with this at first with Node.js, and async.js is the best library I have found to help you deal with it. If you want to write synchronous-looking code with Node, approach it this way.
var async = require('async');

console.log('in main');

doABunchOfThings(function() {
  console.log('back in main');
});

function doABunchOfThings(fnCallback) {
  async.series([
    function(callback) {
      console.log('step 1');
      callback();
    },
    function(callback) {
      setTimeout(callback, 1000);
    },
    function(callback) {
      console.log('step 2');
      callback();
    },
    function(callback) {
      setTimeout(callback, 2000);
    },
    function(callback) {
      console.log('step 3');
      callback();
    },
  ], function(err, results) {
    console.log('done with things');
    fnCallback();
  });
}
this program will ALWAYS produce the following...
in main
step 1
step 2
step 3
done with things
back in main
Making Node.js code sync is essential in a few areas, such as database access, but the actual advantage of Node.js lies in async code, since it is single-threaded and non-blocking.
We can sync it using Fiber() together with await() and defer().
We call all methods using await(), then replace the callback functions with defer().
Normal async code. This uses callback functions:
add(a, b, function (err, res) {
  console.log(res);
});
sub(res, b, function (err, res1) {
  console.log(res1);
});
div(res1, b, function (err, res3) {
  console.log(res3);
});
Sync the above code using Fiber(), await() and defer()
fiber(function () {
  var obj1 = await(add(a, b, defer()));
  var obj2 = await(sub(obj1, b, defer()));
  var obj3 = await(div(obj2, b, defer()));
});
I hope this will help. Thank You
