I want to run a JS file (as if it were frontend code) in Node.js.
index.js:
const a = 'dk3';
console.log(a)
logs.js:
const logs = /* the console output of index.js above, captured somehow */
console.log(logs)
The terminal output of logs.js: dk3
How can I do something like that? index.js cannot know anything about logs.js; logs.js has to execute index.js and capture its output as if it were a variable.
Edit: I got part of the way there with this code:
index.js:
console.log('foi')
logs.js:
const child_process = require('child_process');

child_process.exec('node index.js', (err, stdout, stderr) => {
  console.log(`log: ${stdout}`);
});
output of logs.js: foi
but index.js runs as Node.js code, and I wanted it to run as frontend code, as if it were running in an HTML page.
My assumption is that you are asking how to run index.js as a child process of the Node process running logs.js, that you want the code in logs.js to capture the stdout of the index.js child process, and then write that captured output to its own stdout.
You can synchronously execute a child process with child_process.spawnSync() (Node v0.11.12+).
You can use this for logs.js:
const child_process = require('child_process');
const child = child_process.spawnSync('node', ['index.js'], { encoding : 'utf8' });
console.log(child.stdout);
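If you also want to surface failures from index.js, a minimal sketch along the same lines (assuming index.js sits next to logs.js) can check the spawn error, exit status, and stderr as well:

const child_process = require('child_process');

// Run index.js synchronously and capture its output as a string
const child = child_process.spawnSync('node', ['index.js'], { encoding: 'utf8' });

if (child.error) {
  // spawning the process itself failed (e.g. node not found)
  console.error('failed to start child process:', child.error);
} else if (child.status !== 0) {
  // index.js ran but exited with a non-zero code
  console.error(`child exited with code ${child.status}:`, child.stderr);
} else {
  console.log(child.stdout); // the captured console output of index.js
}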
Related
I am trying to create a utility that takes a Windows relative path (as copied from VSCode or IntelliJ), replaces the backslashes in the path with forward slashes, and executes the jest command to produce coverage for the files passed as arguments to the script.
This is what I tried:
Script_file.js
const yargs = require('yargs');
const exec = require('child_process').exec;

const options = yargs
  .usage('Some info')
  .option('t', {some_object})
  .option('m', {some_object})
  .argv;

const tp = options.t.replace(/\\/g, '/');
const mp = options.m.replace(/\\/g, '/');

exec(`jest ${tp} --coverage --collectCoverageFrom=${mp} --colors`, (error, stdout, stderr) => {
  if (error) {
    console.log('err');
  }
  console.log(stdout);
});
My script in package.json
"test:CoverageFile": "node Script_file.js",
Command
npm run test:CoverageFile -- -t=windows\style\file\path\for\test -m=windows\style\file\path\for\file\to\get\coverage\from
Now, everything is fine and I'm getting the output as expected, except for the marked area in the attached screenshot (the image is just for reference).
My question is: how can I print this table in the output as well? Any help would be greatly appreciated.
EDIT:
I also tried passing the --coverageReporters flag to the jest command with the various available options, but to no avail.
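One thing worth checking (an assumption on my part, not something confirmed in the question): jest writes much of its reporter output to stderr rather than stdout, so a script that only logs stdout can silently drop parts of the report. A sketch of the same exec call that prints both streams, so the table shows up wherever jest sends it:

// Same exec call as above, but print both streams so nothing jest writes is lost.
exec(`jest ${tp} --coverage --collectCoverageFrom=${mp} --colors`, (error, stdout, stderr) => {
  if (error) {
    console.error(error);
  }
  console.log(stdout);
  console.error(stderr); // jest sends most of its reporter output here
});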
The following is part of a script that is run using npm run test.
async setup() {
  process.env.DATABASE_URL = this.databaseUrl;
  this.global.process.env.DATABASE_URL = this.databaseUrl;
  // exec is assumed here to be a promisified child_process.exec defined elsewhere in the script
  await exec(`./node_modules/.bin/prisma migrate up --create-db --experimental`);
  return super.setup();
}
This throws the following error
Command failed: ./node_modules/.bin/prisma migrate up --create-db --experimental
'.' is not recognized as an internal or external command,
operable program or batch file.
When run from cmd the command works as expected. What is the correct way to reference the binary file within exec()? I am using Windows, in case that is relevant.
Solution, with help from @derpirscher:
const path = require("path");
const prismaBinary = "./node_modules/.bin/prisma";

// Resolving the relative path to an absolute one lets cmd.exe locate the binary,
// since cmd.exe does not understand the './' prefix.
await exec(
  `${path.resolve(prismaBinary)} migrate up --create-db --experimental`
);
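For illustration (the C:\projects\app location below is hypothetical), path.resolve simply maps the relative binary path to an absolute Windows path, which cmd.exe accepts:

const path = require("path");

// Hypothetical project location: C:\projects\app
console.log(path.resolve("./node_modules/.bin/prisma"));
// -> C:\projects\app\node_modules\.bin\prisma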
I googled
EPERM: operation not permitted
and I got many hits on issues with npm and this error.
This is not my case (not a duplicate), as I am not running npm; I am running my own Node code.
I am getting this error:
Error
{
Error: 'EPERM: operation not permitted',
errno: -1,
code: 'EPERM',
syscall: 'scandir',
path: '../../../Library/Application Support/CallHistoryDB/'
};
when running the code below on my home directory.
I ran it using both
node getInfo
and
sudo node getInfo
but I get the same error.
This code works fine when I run it on my local repository, but when I try to traverse my entire home directory I get the error.
Executed code:
// Libraries
const fs = require('fs');
const h = require('./helper');
// Start recursing here
const start = '../../../';
// API
getThings(start).then(() => {
console.log(data);
}).catch(h.error)
// Accesses the file system and returns an array of files / folders
async function getThings (folder) {
const things = await fs.promises.readdir(folder);
for(let i = 0; i < things.length; i++) {
await getStats(things[i], folder);
}
}
// Gets statistics for each file/folder
async function getStats (thing, folder) {
const path = folder + thing;
const stats = await fs.promises.stat(path);
await checkForFolder(stats, thing, path);
}
// if the file/folder is a folder, recurse and do it again
async function checkForFolder(stats, thing, path){
// logThing(stats, thing, path);
if (stats.isDirectory() ) {
await getThings(path + '/');
}
}
Research
SO - EPERM vs EACCES
'../../../Library/Application Support/CallHistoryDB/'
This path is protected by macOS security settings, as it may contain sensitive telephone history data.
If you specifically need to access it in your Node application, you will need to enable Full Disk Access for the application you're using to run this script (e.g. Terminal.app, iTerm, or an IDE) in System Preferences under Security & Privacy → Privacy. Note that this will give every application you run in that terminal access to your sensitive personal files; tread carefully.
However, if you don't specifically need to access this path (and you probably shouldn't), a better solution may be to catch errors individually on each call to fs.promises.stat(path). The simplest way of doing this is to wrap the call to await fs.promises.stat(path) in a try … catch block and either log or ignore the error, as sketched below.
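A minimal sketch of that approach, reusing the getStats helper from the question (how you report the skipped paths is up to you):

// Gets statistics for each file/folder, skipping entries we are not allowed to read
async function getStats (thing, folder) {
  const path = folder + thing;
  try {
    const stats = await fs.promises.stat(path);
    await checkForFolder(stats, thing, path);
  } catch (err) {
    // EPERM / EACCES on protected paths (e.g. CallHistoryDB) is expected here;
    // log it and keep walking instead of aborting the whole traversal.
    console.warn(`skipping ${path}: ${err.code}`);
  }
}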
I have a function that must find a file (.zip) and return it, so that I can use it as an argument for another function. I also have a test in Karma/Jasmine where I launch my search function, and when I do this it throws an error that 'fs.readFile is not a function'.
test code:
const fs = require('fs');
const JSZip = require("jszip");

const searchfile = () => {
  fs.readFile('./data/2-11253540.zip', function (err, data) {
    if (err) throw err;
    JSZip.loadAsync(data).then(function (zip) {
      console.log('Process Zip: ', zip);
    });
  });
};

describe('Process', () => {
  const process = require('./process');
  searchfile();
  it('001', () => expect(process()).toEqual(null));
});
It doesn't look exactly like what I described above, but it was a test version to check whether it works. In my Karma config I have browserify to handle require.
So, the searchfile function searches for a file and the process function will use this file. When I run this test I get the error that fs.readFile is not a function.
However, if I put the code of searchfile inside the process function and run it directly, it works fine.
Why does it not work?
Look at the last couple of lines of green text in the screenshot (why are you posting PICTURES of text?!).
You are running the tests in Chrome.
The fs module is a Node.js feature.
The fs module is not available in Chrome (and it would be silly if a web page could read arbitrary files from the visitor’s file system).
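If the test genuinely needs the zip contents inside the browser, one possible approach (just a sketch, and it assumes the Karma files config serves the fixture, e.g. { pattern: 'data/*.zip', included: false, served: true }) is to fetch the file over HTTP and pass the resulting ArrayBuffer to JSZip, which accepts binary data directly:

const JSZip = require("jszip");

// Sketch: load the fixture over HTTP instead of using the Node-only fs module.
// '/base/data/2-11253540.zip' assumes Karma serves the file under its /base prefix.
const searchfile = () =>
  fetch('/base/data/2-11253540.zip')
    .then((res) => res.arrayBuffer())
    .then((buf) => JSZip.loadAsync(buf))
    .then((zip) => {
      console.log('Process Zip: ', zip);
      return zip;
    });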
I have a folder with a few js files in it:
admin$ ls
filterfiles.js filterfiles.js~ program.js program.js~
program.js is a node program with the following contents:
var dir = process.argv[2]
var fs = require('fs')
fs.readdir(dir, function(results){console.log(results)})
When I do the following, why do I get null, instead of a list of the files in the directory?
admin$ node program.js './'
null
The first argument of the callback for fs.readdir is the error; the result is in the second argument. This is the standard error-first convention for Node callbacks.
You want:
fs.readdir(dir, function(err,results){console.log(results)})
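For completeness, a slightly fuller sketch of the same program that also handles the error case:

var dir = process.argv[2]
var fs = require('fs')

fs.readdir(dir, function (err, results) {
  if (err) {
    // e.g. the directory does not exist or is not readable
    console.error(err)
    return
  }
  console.log(results) // array of file names in the directory
})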