Run async code from the command line, Node.js - javascript

I have a function that generates some test data and inserts it to a mongodb:
'use strict';
const CvFaker = require('./cv-faker');
const mongoose = require('mongoose');
require('../models/cv_model.js');

module.exports.init = function () {
  var cvfaker = new CvFaker();
  cvfaker.genCvs(100);
  mongoose.model('cv').create(cvfaker.cvs, (err, createdCvs) => {
    if (err) {
      console.log('something went wrong');
    }
  });
};
I want to execute this code from the command line:
node -e 'require("./create-mock-db").init()'
The function executes, but it does not wait for the function to complete since it is async. How do I make it wait for the function to complete?
This is not working either:
module.exports.init = function (cb) {
  // ...
  cb();
};

node -e 'require("./create-mock-db").init(function(){})'

As this answer might come up for more people…
// test.js
const rp = require('request-promise'); // needs the 'request' package installed alongside it

const demo = module.exports.demo = async function () {
  try {
    const res = await rp.post({
      uri: 'https://httpbin.org/anything',
      body: { hi: 'there' },
      json: true, // serialize the body as JSON and parse the JSON response
    });
    console.log(res);
    return res;
  } catch (e) {
    console.error(e);
  }
};
Call it like this:
$ node -e 'require("./test").demo()'
Sidenote:
it does not wait for the function to complete since it is async
It's not async. You might call asynchronous functions, but you are not treating them as such and not awaiting any result.

The node process will not exit until the event queue is empty. The event loop uses the event queue to make asynchronous execution possible.
It's pretty simple to verify that this is not an issue with executing asynchronous code.
node -e "setTimeout(() => console.log('done'), 5000)"
This example takes 5 seconds to run, as you would expect.
The problem with your code is that you never establish a connection to the database. The model.create method doesn't do anything until there is a connection, therefore nothing ever gets queued and the process is free to exit.
This means your code needs to change to do two things:
Connect to the database so that the model.create method can execute.
Disconnect from the database when model.create is complete so the process is free to exit.
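Put together, the fix might look something like this (a minimal sketch; the connection string is a placeholder, and the model name 'cv' is taken from the question):

const mongoose = require('mongoose');
const CvFaker = require('./cv-faker');
require('../models/cv_model.js');

module.exports.init = async function () {
  // 1. connect so that model.create can actually execute
  await mongoose.connect('mongodb://localhost/test');
  const cvfaker = new CvFaker();
  cvfaker.genCvs(100);
  try {
    await mongoose.model('cv').create(cvfaker.cvs);
  } finally {
    // 2. disconnect so the event queue empties and the process can exit
    await mongoose.disconnect();
  }
};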

To add to Kaiser's answer, if you are on Windows using cmd, the single/double quotes are important. Put the double quotes on the outside, i.e.
node -e "require('./test').demo()"

Related

Javascript/Node/JSON question, why is this not working?

I thought I understood what I was doing until this wasn't going in order. I am running this through Node, not through a browser.
It goes to the prompt at the end of the while loop first. I don't know why.
const fs = require('fs');
const prompt = require('prompt-sync')();

function jsonReader(filepath, cb) {
  fs.readFile(filepath, 'utf-8', (err, fileData) => {
    if (err) { return cb && cb(err); }
    try {
      const object = JSON.parse(fileData);
      console.log(object);
      return cb && cb(null, object);
    } catch (err) {
      return cb && cb(err);
    }
  });
}
var exit = 0;
do {
  jsonReader('./customer.json', (err, customer) => {
    if (err) {
      console.log('Error reading file:', err);
      return;
    }
    //customer.order_count += 1
    const note = prompt("Enter a note for the JSON file: ");
    customer.note = note;
    fs.writeFile('./customer.json', JSON.stringify(customer, null, 2), (err) => {
      if (err) {
        console.log('Error writing file:', err);
      } else {
        console.log('File updated');
      }
    });
  });
  exit = prompt("Do you want to exit?");
} while (exit != 'y');
There are several things wrong with your code. The prompt problem you notice is just the beginning.
The way your code executes is like this:
// happens now
do {
  // happens now
  // ...
  jsonReader('./customer.json', (err, customer) => {
    // happens after readFile
    fs.writeFile('./customer.json', JSON.stringify(customer, null, 2), (err) => {
      // happens after writeFile
      // ...
    });
    // happens after readFile
    // ...
  });
  // happens now
  exit = prompt("Do you want to exit?");
} while (exit != 'y');
// happens now
The time sequence is as follows:
1. Things that happen now
2. Things that happen after readFile
3. Things that happen after writeFile
Clearly, you are outputting the prompt before either readFile or writeFile has had a chance to run.
However, there is another problem: you have an infinite while loop. In Node.js, and in fact in the browser, I/O only happens when there is no JavaScript left to execute - in other words, when the interpreter is idle. Your while loop prevents the script from ever reaching the end, so the interpreter is never idle.
For your script, what will happen after you fix the prompt issue is:
1. Things that happen now
2. loop into things that happen now
3. loop into things that happen now
4. loop into things that happen now
5. loop into things that happen now
..
∞. loop into things that happen now
Thus readFile and writeFile never execute. You need to replace the while loop with a recursive asynchronous call, with setTimeout() or setInterval(), or with a while loop inside an async function using await.
Here's an implementation with minimal changes to your code:
function doIt() { // <----------- replace the do..while loop
  // ...
  jsonReader('./customer.json', (err, customer) => {
    // ...
    fs.writeFile('./customer.json', JSON.stringify(customer, null, 2), (err) => {
      // ...
      var exit = prompt("Do you want to exit?");
      if (exit !== 'y') {
        doIt(); // <-------------- repeat the process again
      }
    });
  });
}
doIt(); // <--------- don't forget to begin the whole thing
This is admittedly not as easy to read as using async/await, but I leave that implementation as homework. Besides, it requires many more changes to your existing code.
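For reference, that async/await version could look roughly like this (a sketch using the promise-based fs API; prompt-sync stays synchronous):

const fs = require('fs').promises;
const prompt = require('prompt-sync')();

async function main() {
  let exit;
  do {
    // read and parse the file, waiting for the result
    const customer = JSON.parse(await fs.readFile('./customer.json', 'utf-8'));
    customer.note = prompt("Enter a note for the JSON file: ");
    // wait for the write to finish before prompting again
    await fs.writeFile('./customer.json', JSON.stringify(customer, null, 2));
    console.log('File updated');
    exit = prompt("Do you want to exit?");
  } while (exit !== 'y');
}

main().catch(err => console.log('Error:', err));

Because main is an async function, the loop only continues after each await resolves, so the interpreter goes idle at every await and the I/O can actually run.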
This is because fs.readFile is asynchronous, so your second prompt executes before fs.readFile has finished. You may want to use async functions with await, or put the prompt at the end of the callback for jsonReader.

node async/await not working for me (when using Postgres / Node - working with DB updates before going to next call) [duplicate]

This question already has answers here:
Using async/await with a forEach loop
(33 answers)
Closed 3 years ago.
await is not blocking as expected when a block of code updates the db (using postgres / node):
https://node-postgres.com
I have a list of async function calls; each call updates a database, and each subsequent call works on data updated by the previous call.
There are about eight calls in a row, and each call must update the complete set of data it is working with, 100% to completion, before going to the next.
I tried to make everything not async, but it appears I am forced to make everything async/await because of the library I am using (postgres / node).
Each function call must complete 100% before going on to the next function call, because the next step does a select on rows where a field is not null (where the previous step fills in a value).
I have an await in front of each call, that does something (see code below):
loads the db from a csv,
next step selects all rows just inserted, calls an API and updates the database,
and so on,
but at one point, when the next function executes, NONE of the rows have been updated (as I trace through and verify; a SQL statement returns nothing back),
and the code seems to pass right through to the next function call, not blocking or honoring the await, and completing its code block.
If I comment out some of the latter rows (dependent on the previous), and let the program run to completion, the database gets updated.
There is nothing functionally wrong with the code, everything works, just not from beginning to completion.
After running the first two function calls and letting them complete, I can then comment out those rows, uncomment the later rows in the flow, and run again, and everything works as expected; but I cannot run from beginning to completion with both uncommented.
What can I do to make sure each function call completes 100%, has all updates completed in the database, before going to the next step?
async/await is not working for me.
This is not pseudo-code; it's the actual executing code I am working with, with only the function names changed. It is real working code, cut and pasted directly from my IDE.
// these are functions I call below (each in their own .js)
const insert_rows_to_db_from_csv = require('./insert_rows_to_db_from_csv')
const call_api_using_rows_from_function_above = require('./call_api_using_rows_from_function_above')
const and_so_on = require('./and_so_on')
const and_so_on_and_on = require('./and_so_on_and_on')
const and_so_on_and_on_and_on = require('./and_so_on_and_on_and_on')

// each of the above exports a main() function where I can call func.main()
// just like this one defined below (this is my main() entry point)
module.exports = {
  main: async function (csvFilePath) {
    console.log('service: upload.main()')
    try {
      const csvList = []
      let rstream = fs.createReadStream(csvFilePath)
        .pipe(csv())
        .on('data', (data) => csvList.push(data))
        .on('end', async () => {
          let num_rows = csvList.length
          // step one (if I run these two, with the step two calls below commented out, this works)
          await insert_rows_to_db_from_csv.main(csvList);
          await call_api_using_rows_from_function_above.main();
          // step two
          // blows up here, on the next function call:
          // no rows selected in sql statements; must comment out, let the above run to
          // completion, then comment out the rows above, and let these run separately
          await work_with_rows_updated_in_previous_call_above.main(); // sets
          await and_so_on.main();
          await and_so_on_and_on.main();
          await and_so_on_and_on_and_on.main();
        })
    } catch (err) {
      console.log(err.stack)
    } finally {
    }
  }
};
Here is the one-liner I am using to do the insert/update to the DB:
return await pool.query(sql, values);
that's it, nothing more. This is from using:
https://node-postgres.com/
npm install pg
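For completeness, the pool behind that one-liner is created something like this (a sketch; the credentials are placeholders, and node-postgres can also pick them up from the standard PG* environment variables):

const { Pool } = require('pg');

const pool = new Pool({
  host: 'localhost',   // placeholder
  database: 'mydb',    // placeholder
  user: 'user',        // placeholder
  password: 'secret',  // placeholder
});

async function run(sql, values) {
  // pool.query returns a promise that resolves with { rows, rowCount, ... }
  return await pool.query(sql, values);
}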
PART 2 - continuing on.
I think the problem might be here. This is where I am doing each API call, then the insert (that the next function call is dependent upon); there is some code smell here that I can't sort out.
processBatch(batch) is called; that calls the API, gets a response back, and then within there calls handleResponseDetail(response), where the insert is happening. I think the problem is here, if there are any ideas?
this is a code block inside:
await call_api_using_rows_from_function_above.main();
It completes with no errors, inserts rows, and commits; then the next function is called, and that function finds no rows (the ones inserted here). But the await on the entire main() call blocks and waits, so I don't understand.
/**
 * API call; within it, handleResponseDetail does the DB insert.
 * @param batch
 * @returns {Promise<*>}
 */
async function processBatch(batch) {
  console.log('Processing batch');
  return await client.send(batch).then(res => {
    return handleResponseDetail(res);
  }).catch(err => handleError(err));
}

// should this be async?
function handleResponseDetail(response) {
  response.lookups.forEach(async function (lookup) {
    if (typeof lookup.result[0] == "undefined") { // result[0] is Candidate #0
      ++lookup_fail;
      console.log('No response from API for this address.')
    } else {
      ++lookup_success;
      const id = await insert(lookup);
    }
  });
}
Given the code block from your Part 2 edit, the problem is now clear: all of your insert()s are being scheduled outside of the blocking context of the rest of your async/await code! This is because of that .forEach, see this question for more details.
I've annotated your existing code to show the issue:
function handleResponseDetail(response) { // synchronous function
  response.lookups.forEach(async function (lookup) { // asynchronous function
    // these async functions all get scheduled simultaneously,
    // without waiting for the previous one to complete - that's why you can't use forEach like this
    if (typeof lookup.result[0] == "undefined") { // result[0] is Candidate #0
      ++lookup_fail;
      console.log('No response from API for this address.')
    } else {
      ++lookup_success;
      const id = await insert(lookup); // this ONLY blocks the inner async function, not the outer handleResponseDetail
    }
  });
}
Here is a fixed version of that function which should work as you expect:
async function handleResponseDetail(response) {
  for (const lookup of response.lookups) {
    if (typeof lookup.result[0] == "undefined") { // result[0] is Candidate #0
      ++lookup_fail;
      console.log('No response from API for this address.')
    } else {
      ++lookup_success;
      const id = await insert(lookup); // blocks handleResponseDetail until done
    }
  }
}
Alternatively, if the order of insertion doesn't matter, you can use Promise.all for efficiency:
async function handleResponseDetail(response) {
  await Promise.all(response.lookups.map(async lookup => {
    if (typeof lookup.result[0] == "undefined") { // result[0] is Candidate #0
      ++lookup_fail;
      console.log('No response from API for this address.')
    } else {
      ++lookup_success;
      const id = await insert(lookup);
    }
  })); // waits until all insertions have completed before returning
}
To reiterate, you cannot easily use .forEach() with async/await because .forEach() simply calls the given function for each element of the array synchronously, with no regard for awaiting each promise before calling the next. If you need the loop to block between each element, or to wait for all elements to complete processing before returning from the function (this is your use case), you need to use a different for loop or alternatively a Promise.all() as above.
What your main function currently does is merely create the stream, assign the listeners, and return immediately. It does not wait for the listeners to finish, the way you are trying to have it do.
You need to extract your file-reading logic into another function that returns a Promise which resolves only when the entire file has been read, then await that Promise inside main:
function getCsvList(csvFilePath) {
  return new Promise((resolve, reject) => {
    const csvList = []
    fs.createReadStream(csvFilePath)
      .pipe(csv())
      .on('data', (data) => csvList.push(data))
      .on('end', () => {
        resolve(csvList)
      })
      .on('error', (e) => reject(e))
  })
}
module.exports = {
  main: async function (csvFilePath) {
    try {
      const csvList = await getCsvList(csvFilePath)
      await insert_rows_to_db_from_csv.main(csvList);
      await call_api_using_rows_from_function_above.main();
      await work_with_rows_updated_in_previous_call_above.main();
      await and_so_on.main();
      await and_so_on_and_on.main();
      await and_so_on_and_on_and_on.main();
    } catch (err) {
      console.log(err.stack)
    } finally {
    }
  }
};

How to connect to mssql server synchronously in node.js

All of the examples for using the mssql client package/tedious driver are for async/callbacks/promises, but I'm only developing a microservice that will see limited use, and my understanding of asynchronous functions is still a bit fuzzy.
Here's what I have for trying to use async/await :
Report generation class:
const mssql = require('mssql');
const events = require('events');

class reporter {
  constructor(searcher, logger) {
    // Pass in search type and value, or log the error if none defined
    this.lg = logger
    if (searcher.type && searcher.content) {
      this.lg.lg("reporter created", 3)
      this.srchType = searcher.type;
      this.srchContent = searcher.content;
    } else {
      this.lg.lg("!MISSING SEARCH PARAMETERS", 0);
      this.err = "!MISSING SEARCH PARAMETERS";
    }
  }
  proc() {
    // DB connect async
    async () => {
      try {
        await mssql.connect('mssql://username:password@localhost/database')
        this.result = await mssql.query`select * from mytable where id = ${this.searcher}`
      } catch (err) {
        // ... error checks
      }
    }
    return this.result;
  }
}
Then called:
//Pass to reporter for resolution
var report1 = new reporter(searcher, logs);
report1.proc();
I'm sure this is probably a pretty bad way to accomplish this, so I'm also open to any input on good ways to accomplish the end goal, but I'd still like to know if it's possible to accomplish synchronously.
You can't do it synchronously. Figuring out this async stuff is definitely worth your time and effort.
async / await / promises let you more-or-less fake doing it synchronously
const report1 = new reporter(searcher, logs);
report1.proc()
  .then(result => {
    /* in this function, "result" is what your async function returned */
    /* do res.send() here if you're in express */
  })
  .catch(error => {
    /* your lookup failed */
    /* inform the client of your web service about the failure
     * in an appropriate way. */
  })
And, unwrap the async function in your proc function, like so:
async proc() {
  try {
    await mssql.connect('mssql://username:password@localhost/database')
    this.result = await mssql.query`select * from mytable where id = ${this.searcher}`
  } catch (err) {
    // ... error checks
  }
  return this.result;
}
await and .then are analogous.
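To illustrate the analogy, the same call can be written either way (a sketch):

// inside an async function
const result = await report1.proc();

// anywhere, using the promise directly
report1.proc().then(result => {
  // use result here
});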
Kind of an updated answer that continues off of O. Jones' answer.
The current version of Node.js (v15+) has support for top-level await, meaning you can run it all sequentially.
import mssql from 'mssql';

await mssql.connect('mssql://username:password@localhost/database')
const result = await mssql.query`select * from mytable where id = ${this.searcher}`
But it should still be avoided, since you want to catch errors instead of letting the process crash.
In current versions of Node.js, if an awaited promise rejects and the rejection isn't caught (with try/catch or .catch()), the unhandled rejection will terminate your application with that error.
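So in practice you would wrap the top-level awaits in a try/catch, roughly like this (a sketch; the connection string is a placeholder and id stands in for whatever value you are searching on):

import mssql from 'mssql';

const id = 42; // hypothetical search value

try {
  await mssql.connect('mssql://username:password@localhost/database');
  const result = await mssql.query`select * from mytable where id = ${id}`;
  console.log(result.recordset); // the mssql result object exposes the rows as .recordset
} catch (err) {
  console.error('Query failed:', err);
}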

Wait for event in node js

How can I wait for an event in Node.js? I'm developing a BPMN workflow and I have to execute the events step by step. The server is composed of several scripts, and each script is an event, like this:
'use strict';
const Bpmn = require('bpmn-engine');

const processXml = `
<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <process id="theProcess" isExecutable="true">
    <startEvent id="start" />
    <exclusiveGateway id="decision" />
    <endEvent id="RFID_ERRATO" />
    <endEvent id="RFID=M1" />
    <sequenceFlow id="flow1" sourceRef="start" targetRef="decision" />
    <sequenceFlow id="flow2" sourceRef="decision" targetRef="RFID_ERRATO">
      <conditionExpression xsi:type="tFormalExpression" language="JavaScript"><![CDATA[
        this.variables.input != "M1"
      ]]></conditionExpression>
    </sequenceFlow>
    <sequenceFlow id="flow3" sourceRef="decision" targetRef="RFID=M1">
      <conditionExpression xsi:type="tFormalExpression" language="JavaScript"><![CDATA[
        this.variables.input = "M1"
      ]]></conditionExpression>
    </sequenceFlow>
  </process>
</definitions>`;

const engine = new Bpmn.Engine({
  name: 'exclusive gateway example1',
  source: processXml
});

engine.once('end', (definition) => {
  if (definition.getChildActivityById('RFID_ERRATO').taken) throw new Error('<RFID_ERRATO> was not supposed to be taken, check your input');
  console.log('TAKEN RFID=M1', definition.getChildActivityById('RFID=M1').taken);
});

function sendEvent(value) {
  engine.execute({
    variables: {
      input: value
    }
  }, (err, definition) => {
    console.log(engine.getState())
  });
}

var i = 0;
// hello.js
module.exports = (req, res, next) => {
  //res.header('X-Hello', 'World')
  //console.log(req);
  if (!i++) {
    sendEvent(req.body.rfid);
  }
  console.log(engine.getState())
  next()
}
(I'm using the modules https://www.npmjs.com/package/bpmn-engine and https://www.npmjs.com/package/json-server.) The server is started from the command line with "json-server db.json --middlewares ./script1.js ./script2.js", and then I send the POST request with the data to the server, only once. The problem is that all the events reply to the first request, one after another. I want the first script/event to reply to the first request while the second event waits; when the second request is sent, the following script should handle it, and so on. Is this possible?
To wait and then do something, you need to run the code in an asynchronous way; there are many good approaches for that.
The most common is the promise: a promise will deliver either the result or an error from an asynchronous piece of code. Basic example (from Mozilla Developers):
let myFirstPromise = new Promise((resolve, reject) => {
  // We call resolve(...) when what we were doing asynchronously was successful, and reject(...) when it failed.
  // In this example, we use setTimeout(...) to simulate async code.
  // In reality, you will probably be using something like XHR or an HTML5 API.
  setTimeout(function () {
    resolve("Success!"); // Yay! Everything went well!
  }, 250);
});

myFirstPromise.then((successMessage) => {
  // successMessage is whatever we passed in the resolve(...) function above.
  // It doesn't have to be a string, but if it is only a succeed message, it probably will be.
  console.log("Yay! " + successMessage);
});
The "thing" in asynchronous is that we'll do something and then we'll do something, this then is doing what we need and don't have in a sync code.
There's a lot of npm packages that can help us to do that too, like async-waterfall that will run the functions in series, example from their github:
/* basic - no arguments */
waterfall(myArray.map(function (arrayItem) {
  return function (nextCallback) {
    // same execution for each item, call the next one when done
    doAsyncThingsWith(arrayItem, nextCallback);
  }
}));

/* with arguments, initializer function, and final callback */
waterfall([function initializer(firstMapFunction) {
  firstMapFunction(null, initialValue);
}].concat(myArray.map(function (arrayItem) {
  return function (lastItemResult, nextCallback) {
    // same execution for each item in the array
    var itemResult = doThingsWith(arrayItem, lastItemResult);
    // results carried along from each to the next
    nextCallback(null, itemResult);
  }
})), function (err, finalResult) {
  // final callback
});
It will run an Array.map of functions in series, avoiding a notorious enemy of asynchronous code: callback hell.
So async code will let you wait for an event, because it lets you do something and then do another thing with the result.
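As a concrete example of waiting for a single event, Node's built-in events module has a once() helper (available in current Node versions) that turns the next emission of an event into a promise you can await:

const { once, EventEmitter } = require('events');

async function main() {
  const emitter = new EventEmitter();
  // simulate an asynchronous event source
  setTimeout(() => emitter.emit('ready', 'payload'), 250);
  // pauses here until 'ready' fires, then resolves with the event arguments
  const [value] = await once(emitter, 'ready');
  console.log('Got event:', value);
}

main();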

Nodejs delay return for "require"

My setup is as follows:
Nodejs Server
server.js requires utils.js
utils.js loads data from mongodb into memory and exports it
server.js uses a variable that utils.js exports
The issue that I am worried about is the fact that the mongodb call is asynchronous. utils.js returns before the mongodb call is finished, meaning that server.js will use an undefined variable when it continues execution after the require.
What is the best way to address this issue? The only thing I could think of is wrapping my server.js code in a giant callback and passing that to the function that makes the mongodb call. It seems a bit messy to me; is there a better way to do it?
Code:
server.js
var utils = require("./modules/utils.js");
console.log(utils);
//Do whatever
utils.js
var mods = [];
var db = require("mongojs").connect("localhost", ["modules"]);

db.modules.find({}, function (err, modules) {
  mods = modules;
});

module.exports = mods;
What you're referring to is called "callback hell". The easiest way to get out of that is to use a Promise library that simplifies it.
I used a node package called bluebird.
var mysql = require("mysql");
var hash = require("password-hash");
var Promise = require("bluebird");
var settings = require("../settings");
Promise.promisifyAll(require("mysql/lib/Connection").prototype);
Promise.promisifyAll(require("mysql/lib/Pool").prototype);
var db_config = {
user:settings.db.user,
password:settings.db.password,
database:settings.db.database
};
var con = mysql.createPool(db_config);
function query(sql) {
return con.getConnectionAsync().then(function(connection) {
return connection.queryAsync(sql)
.spread(function(rows,fields) {
return rows;
}).finally(function() {
connection.release();
});
});
}
This is a very basic database module I wrote that uses bluebird to promisify the database object.
And here's how it's used. It returns a promise! The benefit here is that not only does it remove the clutter of callback hell, it makes sure that your code runs asynchronously and the function does not return before things have stopped processing, like in this case, a database query.
function login(user) {
  // check for player existence
  var query = 'SELECT p.name,p.password,p.id, pd.x, pd.y FROM player p INNER JOIN player_data pd ON p.id = pd.id WHERE p.name=' + mysql.escape(user);
  return db.select(query).then(function (rows) {
    if (!rows.length) return;
    return [
      rows[0]
    ];
  });
}
Notice how you return a promise, so you call the then or spread method to get the database values you just queried, without having to worry about whether rows will be undefined by the time you want to use it.
As you say, you need to wrap the entire server in a callback. Node.js works this way; it's asynchronous by nature. A server needs to pass through three stages: init, serve, and deinit. In your case, that database code goes inside the init stage. You could write something like this:
//server.js
var utils = require("./modules/utils");

var init = function (cb) {
  //init the utils module, the http server and whatever you need
  utils.init(function (error) {
    if (error) return handleError(error);
    cb();
  });
};

var serve = function () {
  //Start listening to the http requests
};

var deinit = function (cb) {
  //This is typically executed when a SIGINT is received, see link1
};

init(serve);

//utils.js
//You could write a wrapper for the db instance, see link2
var mongodb = require("mongojs");
var db;

module.exports.init = function (cb) {
  db = mongodb.connect("localhost", ["modules"]);
  db.modules.find({}, function (err, modules) {
    if (err) return cb(err);
    cb(null, modules);
  });
};
I don't recommend using promises here; they are slower than raw callbacks, and you don't need them at all.
link1 - SIGINT
link2 - db wrapper
