In the Node console I do require('path') or require('assert'). How do I find out exactly which file was loaded by that call (the absolute path to the file)?
I couldn't find a decisive answer anywhere and couldn't work it out myself... I thought it would be easier than it turns out to be...
I don't think this is as simple as you were hoping, but using the require object you can do this:
// Load up some modules
var _ = require('lodash');
var natural = require('natural');
// These are where Node will go looking for the modules above
console.log('Paths:');
console.log(require.main.paths);
// You can print out the id of each module, which is the path to its location
console.log('Module IDs:');
require.main.children.forEach((module) => {
    console.log(module.id);
});
Output:
$ node index.js
Paths:
[ '/Users/tyler/Desktop/js_test/node_modules',
'/Users/tyler/Desktop/node_modules',
'/Users/tyler/node_modules',
'/Users/node_modules',
'/node_modules' ]
Module IDs:
/Users/tyler/Desktop/js_test/node_modules/lodash/lodash.js
/Users/tyler/Desktop/js_test/node_modules/natural/lib/natural/index.js
As far as I can tell, the module IDs will be in the order you require them, so you should be able to work with their indexes or search through the module IDs dynamically for whatever you are looking for.
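For example, a minimal sketch (assuming you are after lodash; swap in whichever module name you need) that pulls out a loaded module's absolute path by searching those IDs:
// hypothetical lookup: find the entry whose id contains the module name
var lodashEntry = require.main.children.find(function (m) {
    return m.id.indexOf('lodash') !== -1;
});
console.log(lodashEntry && lodashEntry.id);
// e.g. /Users/tyler/Desktop/js_test/node_modules/lodash/lodash.js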
Let's say I have this directory structure
/Project
    /node_modules
        /SomeModule
            bar.js
    /config
        /file.json
    foo.js
-
foo.js:
require('bar');
-
bar.js:
fs.readdir('./config'); // returns ['file.json']
var file = require('../../../config/file.json');
Is it right that readdir works relative to the file that does the including (foo.js), while require works relative to the file it's called from (bar.js)?
Or am I missing something?
Thank you
As Dan D. explained, fs.readdir uses process.cwd() as its starting point, while require() uses __dirname. If you want, you can always resolve one path against the other, getting an absolute path that both would interpret the same way, like so:
var path = require('path');
route = path.resolve(process.cwd(), route);
That way, if route is already an absolute path (for example one built from __dirname), process.cwd() will be ignored; otherwise it will be used to generate the full path.
For example, assume process.cwd() is /home/usr/node/:
if route is ./directory, it will become /home/usr/node/directory
if route is /home/usr/node/directory, it will be left as is
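A minimal sketch of that behaviour, using the same example directories as above:
var path = require('path');

// assuming process.cwd() is /home/usr/node
console.log(path.resolve(process.cwd(), './directory'));
// -> /home/usr/node/directory
console.log(path.resolve(process.cwd(), '/home/usr/node/directory'));
// -> /home/usr/node/directory (already absolute, so process.cwd() is ignored)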
I hope it works for you :D
I have added a JS module called mongoUtil, which contains the code below, following a suggestion found at this link.
const MongoClient = require( 'mongodb' ).MongoClient;
const url = "mongodb://localhost:27017";
var _db;
module.exports = {
    connectToServer: function(callback) {
        MongoClient.connect(url, {useNewUrlParser: true}, function(err, client) {
            _db = client.db('MyDB');
            return callback(err);
        });
    },
    getDb: function() {
        return _db;
    }
};
I have furthermore used the following line in my app.js module:
const mongoUtil = require('mongoUtil')
However, I am getting the following error, even though the two modules are located in the same directory:
Error: Cannot find module 'mongoUtil'
What am I missing?
If you provide a bare module name to require, it will search node_modules for it.
If you want to load a module from a file in the current directory, you need to use a file path. This can be relative: require("./mongoUtil")
The exact documentation is here, including a (rather long) pseudocode description of how the module-locating algorithm works.
But in short, there are two basic ways of loading a module:
Using the name of an installed module (may be globally or locally installed), for example require('mongodb'). This would look for the (global or local) node_modules/mongodb folder.
Using a path (absolute or relative), for example require('./mongoUtil'). This would look for a file at the given path - if it's relative, then it is relative to the current file.
So, the solution is to use require('./mongoUtil') and not require('mongoUtil').
This will work:
const mongoUtil = require('./mongoUtil.js');
Or even just the following, since the extension is automatically resolved:
const mongoUtil = require('./mongoUtil');
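For completeness, a rough sketch of how app.js might then use the module (the 'items' collection and the query here are just placeholders, not part of the original code):
const mongoUtil = require('./mongoUtil');

mongoUtil.connectToServer(function(err) {
    if (err) throw err;
    // once connected, the shared instance is available anywhere via getDb()
    const db = mongoUtil.getDb();
    db.collection('items').find({}).toArray(function(err, docs) {
        console.log(docs);
    });
});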
I'm trying to build an Electron app which quickly pulls machine info for the user. I'm trying to use the npm module 'shelljs' to be able to use shell scripting in a Node environment. But Electron doesn't really support shelljs, so I'm in a bit of a pickle. There is a workaround that involves using the absolute path to the node binary. I'm not sure what they mean by that, so I thought you guys could help out.
The workaround I got is taken from here, where they say this:
Set it like any regular variable.
// This is inside your javascript file
var shell = require('shelljs');
shell.config.execPath = 'path/to/node/binary'; // Replace this with the real path
// The rest of your script...
This is my code where I get an undefined on the execPath:
const shell = require('shelljs')
const path = require('path')
shell.confic.execPath = path.join('C:', 'Program Files', 'nodejs', 'node_modules', 'npm', 'bin')
Am I interpreting the workaround the wrong way?
The spelling error that @Chirag Ravindra pointed out did the trick. After a bit of thinking I came to this solution:
shell.config.execPath = path.join('C:', 'Program Files', 'nodejs', 'node.exe')
//Thomas
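Putting it together, a minimal sketch of the corrected setup (the install path and the node --version check are just examples; adjust the path to your machine):
const shell = require('shelljs');
const path = require('path');

// point shelljs at the actual node.exe binary, not the npm folder
shell.config.execPath = path.join('C:', 'Program Files', 'nodejs', 'node.exe');

// quick sanity check that exec() now works
console.log(shell.exec('node --version', { silent: true }).stdout);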
So, I'm moving from grunt to gulp (or trying to anyway), and I'm having trouble getting gulp to do what I'm doing in grunt. Specifically the $templateCache stuff.
My angular app is broken up into several components/modules. Each module contains everything it needs to run (controllers, directives, partials, scss, etc.).
Using Grunt, I've been able to boil each module down into 5 files:
module.min.css // all module scss files compiled and concatenated
module.min.js // all module controllers, directives, services, etc. concatenated
module.tpls.min.js // all partials in $templateCache for this module
module.mocks.min.js // all unit test mock objects for this module
module.specs.min.js // all unit test specs for this module
This has worked really well for 2 years now and has been a cornerstone of my modular architecture. My only reasons to try out gulp were 1) curiosity, and 2) my grunt file is getting kind of hairy as we add in deployment and environment-specific stuff, and so far gulp has really slimmed that down.
For the most part, I've figured out how to do all my grunt tasks in gulp, but I'm having trouble figuring out how to generate a template cache file for each module. All the gulp-ng|angular-templates|templatecache plugins take all my partials and create one file. I'd like to take all my files under module/partials/*.html and create a single module.tpls.min.js; and do that for each module.
This was actually a problem with grunt too, but I figured it out with grunt.file.expand().forEach() like this:
grunt.registerTask('prepModules', '...', function(){
    // loop through our modules directory and create subtasks
    // for each module, modifying tasks that affect modules.
    grunt.file.expand("src/js/modules/*").forEach(function (dir) {
        // get the module name by looking at the directory we're in
        var mName = dir.substr(dir.lastIndexOf('/') + 1);

        // add ngtemplate subtasks for each module, turning
        // all module partials into $templateCache objects
        ngtemplates[mName] = {
            module: mName,
            src: dir + "/partials/**/*.html",
            dest: 'dev/modules/' + mName + '/' + mName + '.tpls.min.js'
        };

        grunt.config.set('ngtemplates', ngtemplates);
    });
});
My current gulp for this same task:
var compileTemplates = gulp.src('./src/js/modules/**/partials/*.html', {base:'.'})
    .pipe(ngTemplates())
    .pipe(gulp.dest('.'));
I've only really looked at the options, but none of them seemed to do what I wanted. They were all around changing the file name, or the final destination of the file, or a module name, or whatever else; nothing that said anything about doing it for only the directory it happens to be in.
I had thought about using gulp-rename because it worked well for me when doing the CSS compilation:
var compileScss = gulp.src('./src/js/modules/**/scss/*.scss', {base:'.'})
    .pipe(sass({includePaths: ['./src/scss']}))
    .pipe(rename(function(path){
        path.dirname = path.dirname.replace(/scss/,'css');
    }))
    .pipe(gulp.dest('.'));
However, when I pipe rename() after doing ngTemplates() it only has the path of the final output file (one log entry). When you console.log() path after sass(), it has all the paths of all the files that it found (lots of log entries).
Any ideas? Thanks!
This SO post has the correct answer, but it wasn't coming up in my searches for this specific usage. I was going to vote to close my question, but since someone else might search using my own specific terms (since I did), it seems more appropriate to leave it alone, redirect to the original question, and show how I solved my own particular problem.
var fs = require('fs');
var path = require('path');
var gulp = require('gulp');
var ngTemplates = require('gulp-ng-templates');
var rename = require('gulp-rename');

var modulesDir = './src/js/modules/';

// return the names of the sub-directories (one per module)
var getModules = function(dir){
    return fs.readdirSync(dir)
        .filter(function(file){
            return fs.statSync(path.join(dir, file)).isDirectory();
        });
};

gulp.task('default', function(){
    var modules = getModules(modulesDir);
    var moduleTasks = modules.map(function(folder){
        // get all partials for this module
        // parse into $templateCache file
        // rename to be /dev/modules/_____/______.tpls.min.js
        return gulp.src(modulesDir + folder + '/partials/*.html', {base:'.'})
            .pipe(ngTemplates({module:folder}))
            .pipe(rename(function(path){
                path.dirname = './dev/apps/' + folder + '/';
                path.basename = folder + '.tpls.min';
            }))
            .pipe(gulp.dest('.'));
    });
});
It's essentially like the tasks per folder recipe but with a change to use gulp-ng-templates. I'll probably be using this same pattern for my SCSS and JS now that I'm more aware of it.
Seems like the gulp equivalent of grunt.file.expand().forEach().
Whenever I deal with scss/sass in gulp tasks, I only put one scss file as the source parameter. That file is composed of a list of imports. This way you don't need to rely on gulp to concatenate the scss file contents for you.
//in gulpfile
gulp.src('./src/js/modules/**/scss/main.scss', {base:'.'})
//in main.scss
@import 'a', 'b', 'c';
a, b, and c would represent your other scss files.
As far as I can see, the GJS imports system loads only /usr/share/gjs-1.0 and /usr/lib/gjs-1.0 by default. I want to modularize an application, like we can do with Node, but I must find modules relative to the script file.
I found these two ways to add include paths:
gjs --include-path=my-modules my-script.js
GJS_PATH=my-modules gjs my-script.js
...but both are relative to the current directory, not to the file (obviously), and they need to be declared on the command line, making this unnecessarily complex.
How can I set an include path in the GJS code? (So I can make it relative to the file.)
Or... is there another way to import files from anywhere, like in Python?
(Please don't propose a shell-script launcher to work around the --include-path and GJS_PATH problem. That is obvious, but less powerful. If there is no better solution, we'll survive with that.)
You need to set or modify imports.searchPath (which is not obvious because it doesn't show up with for (x in imports)print(x)). So this:
imports.searchPath.unshift('.');
var foo = imports.foo;
imports the file “foo.js” as the foo object.
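For instance, a minimal sketch of what that looks like end to end (the bar function is just a placeholder):
// foo.js, sitting in the directory added to the search path
function bar() {
    print('hello from foo');
}

// main script
imports.searchPath.unshift('.');
var foo = imports.foo;
foo.bar(); // prints "hello from foo"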
This is compatible with Seed, although there imports knows it has a searchPath.
(Earlier versions of this answer were substantially less accurate and more inflammatory. Sorry).
As Douglas says, you do need to modify imports.searchPath to include your library location. Using '.' is simple, but it depends on the files always being run from the same directory. Unfortunately, finding the directory of the currently executing script is a huge hack. Here's how GNOME Shell does it for the extensions API:
I've adapted this into the following function for general use:
const Gio = imports.gi.Gio;
function getCurrentFile() {
    let stack = (new Error()).stack;

    // Assuming we're importing this directly from an extension (and we shouldn't
    // ever not be), its UUID should be directly in the path here.
    let stackLine = stack.split('\n')[1];
    if (!stackLine)
        throw new Error('Could not find current file');

    // The stack line is like:
    //   init([object Object])@/home/user/data/gnome-shell/extensions/u@u.id/prefs.js:8
    //
    // In the case that we're importing from
    // module scope, the first field is blank:
    //   @/home/user/data/gnome-shell/extensions/u@u.id/prefs.js:8
    let match = new RegExp('@(.+):\\d+').exec(stackLine);
    if (!match)
        throw new Error('Could not find current file');

    let path = match[1];
    let file = Gio.File.new_for_path(path);
    return [file.get_path(), file.get_parent().get_path(), file.get_basename()];
}
Here's how you might use it from your entry point file app.js, after defining the getCurrentFile function:
let file_info = getCurrentFile();
// define library location relative to entry point file
const LIB_PATH = file_info[1] + '/lib';
// then add it to the imports search path
imports.searchPath.unshift(LIB_PATH);
Wee! Now importing our libraries is super easy:
// import your app libraries (if they were in lib/app_name)
const Core = imports.app_name.core;
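And, for completeness, a minimal sketch of what such a library file might contain (lib/app_name/core.js and the greet function are just placeholders):
// lib/app_name/core.js
function greet(name) {
    print('Hello, ' + name);
}

// then, back in app.js (after the searchPath setup above):
// const Core = imports.app_name.core;
// Core.greet('world');   // prints "Hello, world"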