How to pass the process dir at runtime? - javascript

I have a problem passing the process directory. I want to spawn a mongod process; the application is built with pkg. I tested accessing the path without the spawn section, e.g. console.log(args[1]), and it returned the process directory. After I uncommented the spawn section, it failed at line 4 with:
ReferenceError: Cannot access 'process' before initialization
const { spawn } = require('child_process');
const { parse } = require('path');

const processPath = parse(process.argv[0]);
const processDir = processPath.dir;

const executableName = 'mongod';
const args = [
  '-f', `${__dirname}/configs/mongodb.yml`,
  '--dbpath', `${processDir}/database/data`,
  '--logpath', `${processDir}/database/log/system.log`,
];
const options = {
  cwd: `${processDir}/bin`
};

const process = spawn(executableName, args, options);
process.stdout.on('data', chunk => {
  console.log(chunk.toString());
});
My build directory; the mongod executable is inside bin:
build
build/program.exe
build/bin
build/database
build/database/data
build/database/log/system.log
I separate out the assets that cannot be included in the real device filesystem; the rest sits inside the snapshot filesystem (pkg's virtual filesystem).
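The ReferenceError itself comes from the last declaration: const process = spawn(...) declares a module-scoped process that shadows the global one for the whole file, so the earlier process.argv[0] hits the temporal dead zone. A minimal sketch of a fix is simply renaming the variable (child is an arbitrary name, not from the question):

const { spawn } = require('child_process');
const { parse } = require('path');

// With pkg, process.argv[0] is the packaged executable itself,
// so its directory is the real on-disk location next to bin/ and database/
const processDir = parse(process.argv[0]).dir;

const child = spawn('mongod', [
  '-f', `${__dirname}/configs/mongodb.yml`,           // snapshot filesystem
  '--dbpath', `${processDir}/database/data`,          // real filesystem
  '--logpath', `${processDir}/database/log/system.log`
], { cwd: `${processDir}/bin` });

child.stdout.on('data', chunk => {
  console.log(chunk.toString());
});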

Related

Node.js / Pino.js: How to rotate logs in a separate thread

I am trying to use pino for logging in my Node app server, and I have some large logs coming in, so rotating the files daily would be more practical.
So I used pino.multistream() and require('file-stream-rotator').
My code works, but for performance reasons I would rather not run the streams in the main thread.
According to the docs, I should use pino.transport():
[pino.multistream()] differs from pino.transport() as all the streams will be executed within the main thread, i.e. the one that created the pino instance.
https://github.com/pinojs/pino/releases?page=2
However, I can't manage to combine pino.transport() and file-stream-rotator.
My code does not work completely: it logs the first entries, but the logger cannot be used further because the script crashes with
throw new Error('the worker has exited')
Main file
const pino = require('pino')
const transport = pino.transport({
  target: './custom-transport.js'
})
const logger = pino(transport)
logger.level = 'info'
logger.info('Pino: Start Service Logging...')

module.exports = {
  logger
}
custom-transport.js file
const { once } = require('events')
const fileStreamRotator = require('file-stream-rotator')

const customTransport = async () => {
  const stream = fileStreamRotator.getStream({ filename: 'myfolder/custom-logger.log', frequency: 'daily' })
  await once(stream, 'open')
  return stream
}

module.exports = customTransport
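Not part of the question, but one direction that may help: the pino docs recommend building custom transports with pino-abstract-transport (an extra dependency, assumed here) rather than returning an arbitrary stream from the target module. A sketch under that assumption, keeping the same file-stream-rotator settings:

// custom-transport.js — sketch using pino-abstract-transport
const build = require('pino-abstract-transport')
const { once } = require('events')
const fileStreamRotator = require('file-stream-rotator')

module.exports = async function (opts) {
  const stream = fileStreamRotator.getStream({
    filename: 'myfolder/custom-logger.log',
    frequency: 'daily'
  })
  await once(stream, 'open')

  return build(async function (source) {
    // source yields the parsed log objects; write them back out as NDJSON lines
    for await (const obj of source) {
      stream.write(JSON.stringify(obj) + '\n')
    }
  }, {
    async close (err) {
      // give the rotating stream a chance to flush before the worker thread exits
      stream.end()
    }
  })
}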

Node JS can't find module

I have a files.js file and am requiring it at the start of my code.
When I use:
const files = require('./lib/files');
I get this error:
Error: Cannot find module './lib/files'
However, if I test it with another one of the files in the same folder, like:
const files = require('./lib/repo');
It runs.
files.js:
const fs = require('fs');
const path = require('path');

module.exports = {
  getCurrentDirectoryBase: () => {
    return path.basename(process.cwd());
  },
  directoryExists: (filePath) => {
    return fs.existsSync(filePath);
  }
};
I would use the tree command, but I have too many node modules installed for the output to be readable, so here is the import in question:
const { getCurrentDirectoryBase, directoryExists } = require('./lib/files')
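A common cause of this error is a filename or extension mismatch (for example files.js.js saved by an editor, or a case difference on a case-sensitive filesystem). A hypothetical debugging snippet, not from the question, that lists what Node can actually see:

// Quick check: list the literal filenames in ./lib next to this file
const fs = require('fs');
const path = require('path');

console.log(fs.readdirSync(path.join(__dirname, 'lib')));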

Best way to share variables in node processes

Suppose I have two processes like these:
file1.js
let variable1 = "variable1"
file2.js
let variable2 = "variable2"
that have both been spawned using
node file1.js
node file2.js
Is there a way to let them communicate? For example can I get variable1's value from file2.js?
If you create a master Node.js process and fork two child Node.js processes from it, you can pass data between parent and child.
Basic example:
const path = require('path');
const fork = require('child_process').fork;

const program = path.resolve('child.js');
const parameters = [];
const options = {
  stdio: ['pipe', 'pipe', 'pipe', 'ipc']
};

const child = fork(program, parameters, options);
child.on('message', message => {
  console.log('message from child:', message);
  child.send('Hi');
});
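For completeness, a minimal child.js counterpart (a sketch; the answer above doesn't include one):

// child.js — hypothetical counterpart to the fork example above
process.on('message', message => {
  console.log('message from parent:', message);
});

// process.send exists because the parent opened an IPC channel ('ipc' in stdio)
process.send({ variable2: 'variable2' });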
More:
https://nodejs.org/api/child_process.html#child_process_child_process_fork_modulepath_args_options

How to use electron's app.getPath() to store data?

I want to store images on the user's computer, so I figure they should be stored in the user's data folder, as described here:
app.getPath(name)
Returns String - A path to a special directory or file associated with name. On failure, an Error is thrown. You can request the following paths by name:
home User's home directory
appData Per-user application data directory, which by default points to:
%APPDATA% on Windows
$XDG_CONFIG_HOME or ~/.config on Linux
~/Library/Application Support on macOS
userData The directory for storing your app's configuration files, which by default is the appData directory appended with your app's name.
...
This is what I think you're supposed to do:
const app = require('electron');
alert(app.getPath('userData'));
But I get "getPath is not a function". I am not sure where to put it: it does not work from my HTML file or the renderer file, and I'm not sure how to use it from the main file because that's not linked to the web page.
Since the remote module is considered deprecated, as shown here, I'd suggest you do this:
const {app} = require('electron');
console.log(app.getPath('userData'));
remote is considered dangerous.
app.getPath will always be available in the main process.
Here is how to do it in the renderer process without using remote (Electron 7+).
Basically, you have to use invoke in the renderer.
In main:
const { app, ipcMain } = require('electron');
const fs = require('fs');

ipcMain.handle('read-user-data', async (event, fileName) => {
  const path = app.getPath('userData');
  const buf = await fs.promises.readFile(`${path}/${fileName}`);
  return buf;
})
In renderer:
const { ipcRenderer } = require('electron');

ipcRenderer.invoke('read-user-data', 'fileName.txt').then(
  result => doSomething()
);
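Note that with contextIsolation enabled (the default in current Electron versions), ipcRenderer is not directly reachable from page scripts. A common pattern, assumed here rather than taken from the answer, is a preload script that exposes the call:

// preload.js — hypothetical preload exposing the IPC call to the page
const { contextBridge, ipcRenderer } = require('electron');

contextBridge.exposeInMainWorld('userData', {
  read: fileName => ipcRenderer.invoke('read-user-data', fileName)
});

The page can then call window.userData.read('fileName.txt') and receive the Buffer from the main process.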
Here is what I use when I need to switch between dev and release:
const electron = require('electron');

export const userDataPath = (electron.app || electron.remote.app).getPath('userData');
Another way to prevent the error "getPath is not a function" is to make the code work both in the renderer process and the main process:
const electron = require('electron');
const configDir = (electron.app || electron.remote.app).getPath('userData');
console.log(configDir);
I had trouble with app.getPath('userData') for saving/loading config files, etc., and ended up using OS-specific env vars in the meantime:
const getAppBasePath = () => {
  // dev
  if (process.env.RUN_ENV === 'development') return './'

  if (!process.platform || !['win32', 'darwin'].includes(process.platform)) {
    console.error(`Unsupported OS: ${process.platform}`)
    return './'
  }

  // prod
  if (process.platform === 'darwin') {
    console.log('Mac OS detected')
    return `/Users/${process.env.USER}/Library/Application Support/${YOUR_APP_NAME}/`
  } else if (process.platform === 'win32') {
    console.log('Windows OS detected')
    return `${process.env.LOCALAPPDATA}\\${YOUR_APP_NAME}\\`
  }
}
If you want to do it in the renderer process, try this; it works for me.
// main.js
const electron = require('electron')
const electronRemote = process.type === 'browser' ? electron : require('@electron/remote')
const { app, ipcMain, Menu, globalShortcut } = require('electron')
const BrowserWindow = electronRemote.BrowserWindow
const isDev = require('electron-is-dev')
const { initialize, enable } = require('@electron/remote/main')

initialize()

let mainWindow
app.on('ready', () => {
  mainWindow = new BrowserWindow({
    width: 1024,
    height: 600,
    minWidth: 600,
    webPreferences: {
      nodeIntegration: true,
      enableRemoteModule: true,
      contextIsolation: false
    }
  })
  enable(mainWindow.webContents)
})
In the renderer process:
// renderer process
const { app } = window.require('@electron/remote')
const savedPath = app.getPath('userData')

gulp-install does not install the dependencies with a vinyl file

I'm a beginner with gulp plugins. I want to install dependencies from a merged package.json. The code is as follows:
gulp.task('install-dependencies', function () {
  var through = require('through2');
  var npm = require('npm');
  var Promise = require('bluebird');
  var file = require('gulp-file');
  var install = require('gulp-install');
  var path = require('path');
  var gulpUtil = require('gulp-util');
  var npmLoad = Promise.promisify(npm.load);
  var plugins = {};
  // for test
  var lastFile;

  gulp.src([`${PATH.plugin}**/package.json`])
    .pipe(through.obj(function parse_plugins(file, enc, cb) {
      if (file.isNull()) {
        cb();
        return;
      }
      if (file.isStream()) {
        this.emit('error', new gulpUtil.PluginError('package-concat', 'Streaming not supported'));
        cb();
        return;
      }
      let fileContent = {};
      try {
        fileContent = JSON.parse(file.contents.toString())
      } catch (e) {
        this.emit('error', new gulpUtil.PluginError('package-concat', `file '${file.path}' not a json file!`));
        throw e;
      }
      plugins = Object.assign({}, plugins, fileContent.dependencies)
      lastFile = file;
      cb();
    },
    function install(cb) {
      let fileContent = {};
      fileContent.name = "test";
      fileContent.dependencies = plugins;
      var file = new gulpUtil.File({
        base: path.join(__dirname, './'),
        cwd: __dirname,
        path: path.join(__dirname, './package.json'),
        contents: Buffer.from(JSON.stringify(fileContent))
      });
      this.push(file);
      cb();
    }))
    .pipe(install());
})
But the dependencies are not installed as expected, and the log is as follows:
[14:50:37] Starting 'install-dependencies'...
[14:50:37] Finished 'install-dependencies' after 205 ms
npm WARN optional Skipping failed optional dependency /chokidar/fsevents:
npm WARN notsup Not compatible with your operating system or architecture: fsevents#1.0.11
What is your operating system?
You might find something similar here.
It sounds like you could try uninstalling everything and starting again from scratch (or just delete your node_modules folder and run npm install). Not sure though; it depends a lot on your operating system, according to the error itself.
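Not part of the answer above, but worth noting: the task is reported finished after 205 ms because the stream is never returned from the task callback, so gulp has no way to wait for install() to complete. A minimal sketch of that fix, with parsePlugins and flushMergedPackage standing in for the two through2 callbacks shown in the question:

gulp.task('install-dependencies', function () {
  // ...requires and the two through2 callbacks as above...
  // Returning the stream tells gulp the task is asynchronous,
  // so it waits for the pipeline (including install()) to finish.
  return gulp.src([`${PATH.plugin}**/package.json`])
    .pipe(through.obj(parsePlugins, flushMergedPackage))
    .pipe(install());
});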
