I'm trying to make a command that connects to the database. I created a little CLI script that loops through files in specific folders to get command class modules.
My problem is that in one of my commands I'm trying to connect to Sequelize, and it just doesn't seem to be doing anything. I get no output to the console, nor does it even seem to try to connect.
This is probably because I'm still struggling to figure out how to properly do sync / async / await stuff. Notice how I use glob.sync because I want to loop through the files synchronously, but then in my command I need to connect to the database using await.
cli.js:
#! /usr/bin/env node
const patterns = [
  './node_modules/unity/src/commands/**/*.js',
  './src/commands/**/*.js',
]
const glob = require('glob')
const path = require('path')
const yargs = require('yargs')
const signature = yargs.argv._[0]
const process = require('process')

patterns.forEach(pattern => {
  glob.sync(pattern).forEach(file => {
    const commandPath = path.resolve(file)
    const command = require(commandPath)
    if (command.signature == signature) {
      command.argv = yargs.argv
      command.run()
      process.exit()
    }
  })
})
console.log('Command not found.')
Here is an example command in one of the commands folders:
const { Sequelize } = require('sequelize')

class MigrateCommand {
  static signature = 'migrate'
  static argv = {}

  static run() {
    const sequelize = new Sequelize({
      dialect: 'mysql',
      host: 'localhost',
      port: 3306,
      database: 'dio_unity2',
      username: 'root',
      password: '',
    })
    const connect = async () => {
      try {
        await sequelize.authenticate()
        console.log('Connection successful.')
      }
      catch (error) {
        console.error('Unable to connect.', error)
      }
    }
    connect()
    console.log('migrate run complete')
  }
}

module.exports = MigrateCommand
I've set up npx so I can just run npx unity migrate and it will call this command based on the migrate signature.
Now my console should say connection successful or unable to connect, and then migrate run complete, but all I see in the console is migrate run complete. It's like it isn't even trying to connect at all...
I have no idea what I'm doing wrong here.
You can't make an asynchronous process synchronous without spawning a new thread and synchronously waiting on it. (There's an npm package that does that, via execSync.)
But the good news here is there's no need to in your code. You want to do things in series, but doing things in series isn't quite the same as doing them synchronously. Here's how:
First, run can't make the async process it's starting via authenticate synchronous. So instead, run should just be async (and we don't need connect):
static async run() {
  const sequelize = new Sequelize({
    dialect: 'mysql',
    host: 'localhost',
    port: 3306,
    database: 'dio_unity2',
    username: 'root',
    password: '',
  })
  try {
    await sequelize.authenticate()
    console.log('Connection successful.')
  } catch (error) {
    console.error('Unable to connect.', error)
  }
  console.log('migrate run complete')
}
Next, in the script, we loop through your patterns and files using an async function:
(async () => {
  for (const pattern of patterns) {
    for (const file of glob.sync(pattern)) {
      const commandPath = path.resolve(file)
      const command = require(commandPath)
      if (command.signature == signature) {
        command.argv = yargs.argv
        await command.run() // *** Note the `await`
        process.exit()
      }
    }
  }
  console.log("Command not found.")
})();
That does the work asynchronously, but one after another (in series).
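To see the difference in isolation, here's a minimal, self-contained sketch (the `task` helper, names, and timings are made up for illustration): a `for...of` loop with `await` runs tasks one after another, while `forEach` would fire them all at once.

```javascript
// `order` records the completion order of the tasks.
const order = [];

// Made-up async task: resolves after `ms` milliseconds.
function task(name, ms) {
  return new Promise(resolve => setTimeout(() => {
    order.push(name);
    resolve();
  }, ms));
}

// Series: `await` pauses the loop until each task finishes.
async function inSeries() {
  for (const [name, ms] of [['slow', 30], ['fast', 1]]) {
    await task(name, ms); // 'slow' fully finishes before 'fast' starts
  }
}
```

After `inSeries()` resolves, `order` is `['slow', 'fast']`; with a plain `forEach`, both timers would start immediately and 'fast' would land first.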
If it were me, I wouldn't use process.exit() to terminate the process; it's a very aggressive form of process termination which may prevent some things from finishing (details here), and may not be as clear to people doing code maintenance later as the alternatives. Now that we've put the code in a wrapper function, I'd just return out of the function to break the loops:
await command.run();
return;
I am planning to make a select tag. It should show a list of files available in an FTP server.
I am using the basic-ftp npm package for this purpose, and I am able to use it in the .js file, but I am not sure how to use it in the oneditprepare function so that I can get the list of files available on the server.
My oneditprepare function is:
oneditprepare: async function () {
  example();
  ftpList = [];
  ftpListNames = [];
  async function example() {
    const ftp = require("basic-ftp");
    const client = new ftp.Client();
    client.ftp.verbose = true;
    try {
      await client.access({
        host: "192.168.104.105",
        user: "testuser",
        password: "1234",
      });
      // console.log(await client.list("/files"));
      ftpList = await client.list("/files");
      for (let i = 0; i < ftpList.length; i++) {
        ftpListNames.push(ftpList[i].name);
      }
      console.log("ftpListNames :>> ", ftpListNames);
    } catch (err) {
      console.log(err);
    }
    client.close();
  }
}
But using this gives me an error saying require is not defined.
So I made a change to the settings.js file of Node-RED and added the package, but still could not find a solution.
Short answer: you don't.
Longer answer:
You cannot make FTP connections from the oneditprepare function because that runs in the browser. So you need to move all that code to the .js file and implement HTTP REST endpoints that can be called from within the oneditprepare function to trigger it running in the Node-RED backend.
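A rough sketch of that split follows. The /ftp-files route name is made up for illustration, and RED.httpAdmin is the Express app Node-RED exposes for editor-facing admin endpoints:

```javascript
// In the node's .js file (runs in the Node-RED backend):
module.exports = function (RED) {
  const ftp = require('basic-ftp');

  // Hypothetical admin endpoint the editor can call over HTTP.
  RED.httpAdmin.get('/ftp-files', async function (req, res) {
    const client = new ftp.Client();
    try {
      await client.access({ host: '192.168.104.105', user: 'testuser', password: '1234' });
      const list = await client.list('/files');
      res.json(list.map(entry => entry.name));
    } catch (err) {
      res.status(500).send(String(err));
    } finally {
      client.close();
    }
  });

  // ...plus the usual RED.nodes.registerType(...) for the node itself.
};
```

In the node's .html file, oneditprepare can then fetch the list over HTTP instead of using require, e.g. `$.getJSON('ftp-files', names => { /* populate the <select> */ });`.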
I am trying to use pino for logging in my Node app server, and I do have some large logs coming, so rotating the files every day would be more practical.
So I used pino.multistream() and require('file-stream-rotator')
My code works, but for performance reasons, I would not like to use the streams in the main thread.
According to the docs, I should use pino.transport():
[pino.multistream()] differs from pino.transport() as all the streams will be executed within the main thread, i.e. the one that created the pino instance.
https://github.com/pinojs/pino/releases?page=2
However, I can't manage to combine pino.transport() and file-stream-rotator.
Here is my code that does not completely work: it logs the first entries, but the logger is not exportable because it blocks the script with the error
throw new Error('the worker has exited')
Main file
const pino = require('pino')

const transport = pino.transport({
  target: './custom-transport.js'
})

const logger = pino(transport)
logger.level = 'info'
logger.info('Pino: Start Service Logging...')

module.exports = {
  logger
}
custom-transport.js file
const { once } = require('events')
const fileStreamRotator = require('file-stream-rotator')

const customTransport = async () => {
  const stream = fileStreamRotator.getStream({ filename: 'myfolder/custom-logger.log', frequency: 'daily' })
  await once(stream, 'open')
  return stream
}

module.exports = customTransport
I am trying to export database properties stored in a properties file from a Javascript module. By the time I read the database properties file, the Javascript module has already been exported, and the data properties appear undefined wherever I use them in other modules.
const Pool = require('pg').Pool;
const fs = require('fs')
const path = require('path');

class DbConfig {
  constructor(dbData) {
    this.pool = new Pool({
      user: dbData['user'],
      host: dbData['host'],
      database: dbData['database'],
      password: dbData['password'],
      max: 20,
      port: 5432
    });
  }
}

function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt"), 'utf8', (err, data) => {
    if (err) {
      console.error(err)
      return
    }
    // dbData = {"user":"postgres", "password": "1234"...};
    return dbData;
  });
}

let db = new DbConfig(getdbconf());
let dbPool = db.pool;
console.log("dbpool : -> : ", dbPool); // username and password appear undefined

module.exports = { dbPool };
Is there a way to read data before exporting data from Javascript module?
Usually database config or any other sensitive info is read from a .env file using dotenv.
Or
you could also provide env from command line itself like
DB_HOST=127.0.0.1 node index.js
inside your index.js
console.log(process.env.DB_HOST)
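What dotenv does can be sketched in a few lines. This is a simplified stand-in for illustration only (assuming a plain KEY=VALUE format, with no quoting or comments), not the real library:

```javascript
// Simplified sketch of dotenv's core idea: parse KEY=VALUE lines
// into process.env (real dotenv also handles quoting, comments, etc.).
function parseEnv(text) {
  const result = {};
  for (const line of text.split('\n')) {
    const match = line.match(/^\s*([\w.]+)\s*=\s*(.*)$/);
    if (match) result[match[1]] = match[2];
  }
  return result;
}

function loadEnv(text) {
  for (const [key, value] of Object.entries(parseEnv(text))) {
    // Like real dotenv, don't overwrite variables that are already set.
    if (!(key in process.env)) process.env[key] = value;
  }
}
```

After `loadEnv('DB_HOST=127.0.0.1')`, the value is available as `process.env.DB_HOST`, exactly as in the command-line variant above.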
Please create a new file (connection-pool.js) and paste this code:
const { Pool } = require('pg');

const poolConnection = new Pool({
  user: 'postgresUserName',
  host: 'yourHost',
  database: 'someNameDataBase',
  password: 'postgresUserPassword',
  port: 5432,
});

console.log('connectionOptions', poolConnection.options);

module.exports = poolConnection;
To use it, create a new file (demo-connection.js) and paste this code:
const pool = require('./connection-pool');

pool.query('SELECT NOW();', (err, res) => {
  if (err) {
    // throw err;
    console.log('connection error');
    return;
  }
  if (res) {
    console.log(res.rows);
    pool.end();
  }
});
This is an alternative option 🙂
Exporting the result of async calls
To export values which have been obtained asynchronously, export a Promise.
const fs = require('fs/promises'); // `fs/promises` means no callbacks, a Promise is returned
const dbDataPromise = fs.readFile('fileToRead'); // `readFile` returns a Promise now
module.exports = dbDataPromise;
Importing
When you need to use the value,
const dbDataPromise = require('./dbdata');
async function init() {
  const dbData = await dbDataPromise;
  // ...the rest of your code that depends on dbData here
}

// or without async, using Promise callbacks
function init() {
  dbDataPromise
    .then(dbData => { /* the rest of your code that depends on dbData here */ });
}
Current code broken
Please note that your current code, as pasted above, is broken:
function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt"), 'utf8', (err, data) => {
    //[...] snipped for brevity
    return dbData;
  });
}
fs.readFile "returns" dbData, but there is nothing to return to, since you are in a callback which you did not call yourself. Function getdbconf returns nothing.
The line that says let db = new DbConfig(getdbconf()); will NOT work. It needs to be inside the callback.
The only way to avoid putting all of your code inside the callback (and "flatten" it) is to use await, or to use readFileSync
Avoiding the issue
Using environment variables
Suhas Nama's suggestion is a good one, and is common practice. Try putting the values you need in environment variables.
Using synchronous readFile
While using synchronous calls does block the event loop, it's ok to do during initialization, before your app is up and running.
This avoids the problem of having everything in a callback or having to export Promises, and is often the best solution.
I could do this with bash, but I'm trying to learn Node and would like to do it from there. How do I make the newman run call synchronous? I don't really understand the use of async/await (if that is what is required here). I have the following script that loops over a bunch of collection files (each containing multiple requests) and calls newman run on each of them:
// node imports
const fs = require('fs');
const newman = require('newman');

// test variables
const testFolder = './api-tests/';

// read all files in the test folder
fs.readdirSync(testFolder).forEach(file => {
  console.log('Running file: ' + file);
  // run newman using the file
  newman.run({
    collection: require(testFolder + file),
    delayRequest: 500,
    iterationData: [
      {
        'host': 'localhost',
        'port': '8080'
      }
    ],
    reporters: ['cli', 'html']
  }, (err, summary) => {
    if (err) {
      throw err;
    }
    console.log(file + ' run complete');
  });
});
Newman executes each file immediately rather than waiting for the loop to go back around to the next file.
Thanks.
You can use deasync: https://github.com/abbr/deasync
var done = false;
fs.readdirSync(testFolder).forEach(file => {
  newman.run({
    // ...
  }).on('start', function (err, args) { // on start of run, log to console
    console.log('running a collection...');
  }).on('done', function (err, summary) {
    // ...
    done = true;
  });
  require('deasync').loopWhile(function () { return !done; });
  done = false;
});
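For completeness, the same one-at-a-time behavior can also be had without deasync, by wrapping the callback API in a Promise and awaiting it in a plain loop. A sketch, using a hypothetical runCollection stand-in for newman.run so it runs on its own:

```javascript
// Stand-in for newman.run: a Node-style callback API (hypothetical).
function runCollection(name, callback) {
  setTimeout(() => callback(null, name + ' complete'), 10);
}

// Wrap the callback API in a Promise...
function runOne(name) {
  return new Promise((resolve, reject) => {
    runCollection(name, (err, summary) => (err ? reject(err) : resolve(summary)));
  });
}

// ...then `await` in a for...of loop so each run finishes before the next starts.
async function runAll(files) {
  const results = [];
  for (const file of files) {
    results.push(await runOne(file));
  }
  return results;
}
```

With the real newman, runCollection would be replaced by newman.run({ collection: ... }, callback), and the loop stays the same.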
I'm working on a video/music streaming application in Next/React.js, fetching data from a WordPress API/backend/headless CMS. It works great on localhost (though it's real barebones functionality at the moment); however, when I attempt to export it to create a static front end, the export fails repeatedly with the (ridiculously common and usually straightforward) 'cannot read property of undefined' error.
I've spent the last 12 hours debugging rigorously and scanning here/GH etc., using every combination of then()/catch()/await under the sun, but I just can't get it to work and can't for the life of me figure out why. I'm aware 'title' is undefined because the data hasn't been fetched at the time of exporting, but how do I get this past next export?
Here's my getInitialProps from single.js, where the problem seems to be; I'm getting no other relevant errors in the console. I'll post the terminal message from the export attempt below. This is where I've come back to after dozens of versions; it's been a long day, so there may be a silly mistake or two, but this is functioning locally without any errors.
static async getInitialProps(context) {
  const slug = context.query.slug;
  let post = {};
  // Make request for posts.
  try {
    const response = await axios.get(
      `http://timeline-music-30.local/wp-json/wp/v2/posts?slug=${slug}`
    );
    post = response.data[0];
  } catch (err) {
    console.error(err);
  }
  return { post };
  // Return our only item in array from response to posts object in props.
  console.log("post:", post);
}
I expect the application to export to a static site successfully, but it fails with the following terminal message:
copying "static build" directory
launching 3 threads with concurrency of 10 per thread
[==--] 2/4 50% 118/s 0.0s TypeError: Cannot read property 'title' of undefined
at _default.render (C:\Users\Terry\Documents\github\projects\timeline-music-3.0\nextjs\.next\server\static\YyI9s0TjENSVhc1SFZUcV\pages\single.js:158:35)
Any ideas/help would be greatly appreciated.
Thanks
Terry
First, you can't console.log after return.
Second, use isomorphic-fetch and this construction; I think this will help in your case:
static async getInitialProps(context) {
  const slug = context.query.slug;
  // Make request for posts. Note the backticks so ${slug} is interpolated;
  // fetch's json() gives the response body directly, so the post array is not wrapped in .data.
  const resPost = await fetch(`http://timeline-music-30.local/wp-json/wp/v2/posts?slug=${slug}`);
  const dataPost = await resPost.json();
  console.log("post:", dataPost[0]);
  return { post: dataPost[0] };
}
In the component, use {this.props.post}.
If this doesn't help, look at my case; it's working both locally and in production:
I solved much the same problem in a similar case, on the site http://computers.remolet.ru/. The task was to produce different content depending on the domain, so the pages request content from the API via fetch. Here is how I solved it:
Add to the top of the module:
import getConfig from 'next/config';
const nextConfig = getConfig();
// needs: npm i isomorphic-fetch
import 'isomorphic-fetch';
Fetching on page:
static async getInitialProps ({ ctx }) {
  var host = '';
  if (nextConfig && nextConfig.publicRuntimeConfig && nextConfig.publicRuntimeConfig.HOST) {
    // server side
    host = nextConfig.publicRuntimeConfig.HOST;
  } else if (ctx && ctx.req && ctx.req.headers) {
    // front side
    host = 'http://' + ctx.req.headers.host;
  } else {
    // front side
    host = 'http://' + window.location.host;
  }

  const resPricelist = await fetch(host + '/api/pricelist');
  const dataPricelist = await resPricelist.json();
  const resNewPricelist = await fetch(host + '/api/newpricelist');
  const dataNewPricelist = await resNewPricelist.json();
  const resContent = await fetch(host + '/api/content');
  const dataContent = await resContent.json();

  return {
    pricelistData: dataPricelist,
    newPricelistData: dataNewPricelist,
    contentData: dataContent
  };
}
Using in component:
<Header as="h1" align="center">
  {this.props.contentData.h1}
</Header>
In next.config.js:
module.exports = withCSS(withSass({
  cssModules: true,
  serverRuntimeConfig: {
    PORT: process.env.PORT, // eslint-disable-line no-process-env
    HOST: process.env.HOST, // eslint-disable-line no-process-env
    CONTENT: process.env.CONTENT // eslint-disable-line no-process-env
  },
  publicRuntimeConfig: {
    PORT: process.env.PORT, // eslint-disable-line no-process-env
    HOST: process.env.HOST, // eslint-disable-line no-process-env
    CONTENT: process.env.CONTENT // eslint-disable-line no-process-env
  }
}));
Starting Node with environment:
"scripts": {
  "test": "echo \"Error: no test specified\" && exit 1",
  "dev": "HOST='http://localhost:8000' PORT='8000' CONTENT='./db/computers.json' PRICELIST='./db/pricelist/computers.json' PRICELIST2='./db/pricelist/newComputers.json' node server.js",
  "build": "next build",
  "start": "next start"
},
This works on localhost and on the server.
Just use isomorphic-fetch, and if you fetch from an absolute URL, all you need is this construction:
const resPricelist = await fetch(host + '/api/pricelist');
const dataPricelist = await resPricelist.json();
return {
  data: dataPricelist
}
This is the result of approximately 20 hours of trying and reading the Next.js forums.
Hope this helps you :)
P.S. Don't forget that you can use getInitialProps({ ctx }) ONLY in a page component, not in child components!