How to skip Yeoman generator step when invoked from node script - javascript

I have a Node.js script that invokes a Yeoman generator I wrote, and I would like to skip the prompting step since I'm passing the data for the generator from the script. I searched the documentation but didn't find anything relevant. Is it possible at all?
My script looks like this:
const yeoman = require('yeoman-environment');
const env = yeoman.createEnv();

env.lookup(() => {
  env.run('mygenerator:subgenerator --moduleName Test3', { 'skip-install': true, 'skip-prompting': true }, err => {
    console.log('done');
  });
});
And my generator has nothing special:
const BaseGenerator = require('./../base/index.js');

module.exports = class extends BaseGenerator {
  constructor(args, opts) {
    super(args, opts);
    this.props = opts;

    const destinationFolder = args.destinationFolder || '';
    const moduleName = args.moduleName || '';

    this.props = {
      moduleName,
      destinationFolder,
    };
  }

  prompting() {
    //...
  }

  writing() {
    //...
  }
};
I know that the generator gets the data I'm passing from the script. Potentially, I could have one generator that deals with input and another one only for writing the files, but it would be nice to keep a single codebase and just be able to skip some steps.
I saw in some stackoverflow answers that people pass the { 'skip-install': true } option to the generator. Then I tried to pass { 'skip-prompting': true }, but it doesn't do anything.
Thank you!
EDIT
The way that I solved this is the following:
All my sub generators extend a BaseGenerator that I wrote, which is the one that extends from Yeoman. In my BaseGenerator I added this method:
shouldPrompt() {
  return typeof this.props.options === 'undefined' ||
    (typeof this.props.options.moduleName === 'undefined' &&
     typeof this.props.options.destinationFolder === 'undefined');
}
I only use 2 parameters in my generators, moduleName and destinationFolder. So, that's all I want to check. Then, in the sub generators I added this:
prompting() {
  if (this.shouldPrompt()) {
    this.log(chalk.red('Reducer generator'));
    const prompts = [ /*...*/ ];
    return this.prompt(prompts).then((props) => { this.props.options = props; });
  }
}

You'll want to define options or arguments so the generator can accept these values from the terminal: http://yeoman.io/authoring/user-interactions.html
Then just use plain JavaScript to decide whether or not to run the this.prompt() call (with an if/else or any other conditional that works for your use case).
Remember that Yeoman is still only JS code :)
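For illustration, here is a minimal sketch of that idea, assuming yeoman-generator's this.option() API and a hypothetical moduleName option (adapt the names to your own generator):

// Sketch only: skip prompting when moduleName was already supplied,
// either programmatically through env.run(...) or via --moduleName on the CLI.
const Generator = require('yeoman-generator');

module.exports = class extends Generator {
  constructor(args, opts) {
    super(args, opts);
    // Declare the option so Yeoman also parses it from the terminal.
    this.option('moduleName', { type: String, default: '' });
  }

  prompting() {
    // Nothing to ask if the value is already there.
    if (this.options.moduleName) {
      return;
    }
    return this.prompt([
      { type: 'input', name: 'moduleName', message: 'Module name?' }
    ]).then(answers => {
      this.options.moduleName = answers.moduleName;
    });
  }

  writing() {
    // this.options.moduleName is available here either way
  }
};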

Related

Dynamic init to variable in a npm package

I have a library imported from npm, and some parts of it need to be initialized before use. A simplified version of the code in the library:
export let locale = () => { throw new Error("Must init locale"); };

export function initLocale(userLocaleFunction) {
  locale = userLocaleFunction;
}

export function checkLocale() {
  console.log(locale());
}
But when calling the library in the following way:
lib = require("lib");
lib.initLocale(() => { return "en" });
lib.checkLocale(); // works as expected: "en"
lib.locale(); // Throws "Must init locale";
lib.locale acts as if it's not been initialized. I can't have initLocale() return the locale, I need it to be on the variable lib.locale
Is it possible to initialize a variable in this way?
It seems that when you initialize a variable inside a library, the change is only visible in the library's own scope.
In my first solution I simply returned the value:
export function initLocale(userLocaleFunction) {
  locale = userLocaleFunction;
  return locale;
}
But then I realized that this creates a new problem: what if locale gets modified inside the library, or worse, outside of it?
In the spirit of avoiding 2 sources of truth I ended up going with this:
let locale = undefined;

export function initLocale(userLocaleFunction) {
  locale = userLocaleFunction;
}

export function getLocale() {
  if (locale === undefined) {
    throw new Error("Uninitialized locale");
  }
  return locale;
}
This code performs the is-initialized check I needed in the first place and exposes the value with a single source of truth.
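As a side note: if you do want lib.locale itself to remain usable, one possible workaround (a sketch that assumes the library is consumed as CommonJS via require) is to expose the variable through a getter, so consumers always read the current value rather than a copy taken at export time:

// lib.js - sketch only
let locale = () => { throw new Error("Must init locale"); };

function initLocale(userLocaleFunction) {
  locale = userLocaleFunction;
}

function checkLocale() {
  console.log(locale());
}

module.exports = { initLocale, checkLocale };

// Expose `locale` through an accessor instead of copying its value once.
Object.defineProperty(module.exports, 'locale', {
  enumerable: true,
  get() { return locale; }
});

With this, lib.locale() called after lib.initLocale(() => "en") invokes the user-supplied function instead of the initial thrower.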

Javascript proxy handle return statement

I am using a proxy to handle the creation of an object without having to declare all the "parent keys" of the object.
var target = {};
var config = Proxy_ObjectCreator(target, handlers);
config.foo.bar = "Didn't need to create foo !";
return target;
Instead of this
var config = {
  foo: {
    bar: "needed to create foo ..."
  }
};
return config;
That part is fine and functional, but the issue is that I often forget that I need to return the target object instead of the proxy, which creates really strange behaviors in my code.
As far as I am aware, there is no way for a Proxy object to handle the return statement, so I'm trying to find a way to do exactly that. Something like this:
var config = Proxy_ObjectCreator({}, {
  get: (obj, prop, receiver) => {
    // Do stuff ...
  },
  set: (obj, prop, value, receiver) => {
    // Do stuff ...
  },
  return: () => {
    return this.target;
  }
});
config.foo.bar = "Didn't need to create a target obj and don't need to return it !";
return config;
Is there any way to achieve that?
Thank you.
Edit
I'm using this to override configurations from a larger product to the client's specifications, so I'm working with hundreds of lines of configuration. Using a proxy allows me to structure the configuration file for clearer future reading. It also allows me to group configurations that are scattered all over the original config file and to comment on why they are changed, without having to scroll up and down the object.
// Normal object case
function getClientConfigs() {
  return {
    config1: {
      foo: {
        bar: {
          foofoo: {
            barbar: "value to override"
          }
        }
      }
    },
    // hundreds of other configs
    config2: {
      foo: "other value to override"
    }
  };
}
// With proxy
function getClientConfigs() {
  var config = {};
  var proxy = Proxy_ObjectCreator(config, handlers);
  // Changing because the client wanted that for x reason
  proxy.config1.foo.bar.foofoo.barbar = "value to override";
  proxy.config2.foo = "other value to override";
  return config;
}
$.extend(originalConfig, getClientConfigs());
No, there is no way to have the proxy recognise when it is returned from a function. (Btw, it also is returned from your Proxy_ObjectCreator function, so you'd need to explicitly ignore that…) That would get way too complicated.
But you could use a different design pattern - don't make Proxy_ObjectCreator a factory function, instead give it a callback so that the proxy cannot (easily) escape from the context:
function getClientConfigs() {
  return withProxiedCreation({}, proxy => {
    proxy.config1.foo.bar.foofoo.barbar = "value to override";
    proxy.config2.foo = "other value to override";
  });
}
function withProxiedCreation(target, callback) {
  var proxy = new Proxy(target, handlers);
  callback(proxy);
  return target;
}
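The question never shows the handlers themselves, but for completeness, here is a rough sketch of how such auto-creation could be implemented (an assumption about what Proxy_ObjectCreator does, not the asker's actual code): a get trap that creates the missing "parent keys" directly on the real target.

function makeAutoVivifyProxy(target) {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      // Create the missing "parent key" on the real object the first time it is read.
      if (typeof prop === 'string' && !(prop in obj)) {
        obj[prop] = {};
      }
      const value = Reflect.get(obj, prop, receiver);
      // Wrap nested plain objects so deeper paths auto-create as well.
      return (value !== null && typeof value === 'object')
        ? makeAutoVivifyProxy(value)
        : value;
    }
  });
}

function withProxiedCreation(target, callback) {
  callback(makeAutoVivifyProxy(target));
  return target; // the proxy itself never leaves this function
}

const config = withProxiedCreation({}, proxy => {
  proxy.config1.foo.bar = "Didn't need to create foo!";
});
console.log(config.config1.foo.bar); // "Didn't need to create foo!"

Because the nested objects are created on the target itself, the returned config is a plain object with no proxies left inside it.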

JS-Interpreter - changing “this” context

JS-Interpreter is a somewhat well-known JavaScript interpreter. It has security advantages in that it can completely isolate your code from the document and lets you detect attacks such as infinite loops and memory bombs. This allows you to run externally defined code safely.
I have an object, say o like this:
let o = {
  hidden: null,
  regex: null,
  process: [
    "this.hidden = !this.visible;",
    "this.regex = new RegExp(this.validate, 'i');"
  ],
  visible: true,
  validate: "^[a-z]+$"
};
I'd like to be able to run the code in process through JS-Interpreter:
for (let i = 0; i < o.process.length; i++)
  interpretWithinContext(o, o.process[i]);
Where interpretWithinContext will create an interpreter using the first argument as the context, i.e. o becomes this, and the second argument is the line of code to run. After running the above code, I would expect o to be:
{
  hidden: false,
  regex: /^[a-z]+$/i,
  process: [
    "this.hidden = !this.visible;",
    "this.regex = new RegExp(this.validate, 'i');"
  ],
  visible: true,
  validate: '^[a-z]+$'
}
That is, hidden and regex are now set.
Does anyone know if this is possible in JS-Interpreter?
I’ve spent a while messing around with the JS-Interpreter now, trying to figure out from the source how to place an object into the interpreter’s scope that can be both read and modified.
Unfortunately, the way this library is built, all the useful internals are minified, so we cannot simply reach into them and drop an object inside. Attempts to add a proxy object also failed, since the object just wasn’t used in a “normal” way.
So my original approach was to fall back to providing simple utility functions to access the outside object. This is fully supported by the library and probably the safest way of interacting with it. It does require you to change the process code, though, in order to use those functions. But as a benefit, it provides a very clean interface to communicate with “the outside world”. You can find the solution in the following snippet:
function createInterpreter (dataObj) {
  function initialize (intp, scope) {
    intp.setProperty(scope, 'get', intp.createNativeFunction(function (prop) {
      return intp.nativeToPseudo(dataObj[prop]);
    }), intp.READONLY_DESCRIPTOR);
    intp.setProperty(scope, 'set', intp.createNativeFunction(function (prop, value) {
      dataObj[prop] = intp.pseudoToNative(value);
    }), intp.READONLY_DESCRIPTOR);
  }
  return function (code) {
    const interpreter = new Interpreter(code, initialize);
    interpreter.run();
    return interpreter.value;
  };
}
let o = {
  hidden: null,
  regex: null,
  process: [
    "set('hidden', !get('visible'));",
    "set('regex', new RegExp(get('validate'), 'i'));"
  ],
  visible: true,
  validate: "^[a-z]+$"
};
const interprete = createInterpreter(o);
for (const process of o.process) {
  interprete(process);
}
console.log(o.hidden); // false
console.log(o.regex); // /^[a-z]+$/i
<script src="https://neil.fraser.name/software/JS-Interpreter/acorn_interpreter.js"></script>
However, after posting the above solution, I just couldn’t stop thinking about this, so I dug deeper. As I learned, the methods getProperty and setProperty are not just used to set up the initial sandbox scope, but also as the code is being interpreted. So we can use this to create a proxy-like behavior for our object.
My solution here is based on code I found in an issue comment about doing this by modifying the Interpreter type. Unfortunately, the code is written in CoffeeScript and also based on some older versions, so we cannot use it exactly as it is. There’s also still the problem of the internals being minified, which we’ll get to in a moment.
The overall idea is to introduce a “connected object” into the scope which we will handle as a special case inside the getProperty and setProperty to map to our actual object.
But for that, we need to overwrite those two methods, which is a problem because they are minified and have been given different internal names. Fortunately, the end of the source contains the following:
// Preserve top-level API functions from being pruned/renamed by JS compilers.
// …
Interpreter.prototype['getProperty'] = Interpreter.prototype.getProperty;
Interpreter.prototype['setProperty'] = Interpreter.prototype.setProperty;
So even if a minifier mangles the names on the right, it won’t touch the ones on the left. So that’s how the author made particular functions available for public use. But we want to overwrite them, so we cannot just overwrite the friendly names, we also need to replace the minified copies! But since we have a way to access the functions, we can also search for any other copy of them with a mangled name.
So that’s what I’m doing in my solution at the beginning in patchInterpreter: Define the new methods we’ll overwrite the existing ones with. Then, look for all the names (mangled or not) that refer to those functions, and replace them all with the new definition.
In the end, after patching the Interpreter, we just need to add a connected object into the scope. We cannot use the name this since that’s already used, but we can just choose something else, for example o:
function patchInterpreter (Interpreter) {
  const originalGetProperty = Interpreter.prototype.getProperty;
  const originalSetProperty = Interpreter.prototype.setProperty;

  function newGetProperty(obj, name) {
    if (obj == null || !obj._connected) {
      return originalGetProperty.call(this, obj, name);
    }
    const value = obj._connected[name];
    if (typeof value === 'object') {
      // if the value is an object itself, create another connected object
      return this.createConnectedObject(value);
    }
    return value;
  }

  function newSetProperty(obj, name, value, opt_descriptor) {
    if (obj == null || !obj._connected) {
      return originalSetProperty.call(this, obj, name, value, opt_descriptor);
    }
    obj._connected[name] = this.pseudoToNative(value);
  }

  let getKeys = [];
  let setKeys = [];
  for (const key of Object.keys(Interpreter.prototype)) {
    if (Interpreter.prototype[key] === originalGetProperty) {
      getKeys.push(key);
    }
    if (Interpreter.prototype[key] === originalSetProperty) {
      setKeys.push(key);
    }
  }
  for (const key of getKeys) {
    Interpreter.prototype[key] = newGetProperty;
  }
  for (const key of setKeys) {
    Interpreter.prototype[key] = newSetProperty;
  }

  Interpreter.prototype.createConnectedObject = function (obj) {
    const connectedObject = this.createObject(this.OBJECT);
    connectedObject._connected = obj;
    return connectedObject;
  };
}
patchInterpreter(Interpreter);

// actual application code
function createInterpreter (dataObj) {
  function initialize (intp, scope) {
    // add a connected object for `dataObj`
    intp.setProperty(scope, 'o', intp.createConnectedObject(dataObj), intp.READONLY_DESCRIPTOR);
  }
  return function (code) {
    const interpreter = new Interpreter(code, initialize);
    interpreter.run();
    return interpreter.value;
  };
}
let o = {
  hidden: null,
  regex: null,
  process: [
    "o.hidden = !o.visible;",
    "o.regex = new RegExp(o.validate, 'i');"
  ],
  visible: true,
  validate: "^[a-z]+$"
};
const interprete = createInterpreter(o);
for (const process of o.process) {
  interprete(process);
}
console.log(o.hidden); // false
console.log(o.regex); // /^[a-z]+$/i
<script src="https://neil.fraser.name/software/JS-Interpreter/acorn_interpreter.js"></script>
And that’s it! Note that while that new implementation does already work with nested objects, it may not work with every type. So you should probably be careful what kind of objects you pass into the sandbox. It’s probably a good idea to create separate and explicitly safe objects with only basic or primitive types.
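For example, one way to follow that advice (just a sketch, not part of the patched interpreter above) is to copy only primitives and plain nested objects into a fresh object before handing it to createConnectedObject. Keep in mind that if you pass a copy, the sandbox writes land on the copy, so you would read results back from it afterwards rather than from the original.

// Sketch: build an "explicitly safe" object containing only simple types.
function toSafeObject(source) {
  const safe = {};
  for (const [key, value] of Object.entries(source)) {
    if (value === null || ['string', 'number', 'boolean'].includes(typeof value)) {
      safe[key] = value;                // primitives pass through
    } else if (Array.isArray(value)) {
      safe[key] = value.map(v => (v !== null && typeof v === 'object') ? toSafeObject(v) : v);
    } else if (typeof value === 'object') {
      safe[key] = toSafeObject(value);  // recurse into plain objects
    }
    // functions, Dates, RegExps, class instances, etc. are deliberately dropped
  }
  return safe;
}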
I have not tried JS-Interpreter, but you can use new Function() and Function.prototype.call() to achieve this requirement:
let o = {
  hidden: null,
  regex: null,
  process: [
    "this.hidden = !this.visible;",
    "this.regex = new RegExp(this.validate, 'i');"
  ],
  visible: true,
  validate: "^[a-z]+$"
};

for (let i = 0; i < o.process.length; i++)
  console.log(new Function(`return ${o.process[i]}`).call(o));
Maybe interpretWithinContext could look something like this?
let interpretWithinContext = (function (o, p) {
  // not sure what you use p for, since everything is on object o
  o.hidden = (o.hidden === null) ? false : o.hidden;
  o.regex = (o.regex === null) ? '/^[a-z]+$/i' : o.regex;
  console.log(o);
  return o;
});
https://codepen.io/anon/pen/oGwyra?editors=1111

Best approach to avoid multiple check conditions in Javascript

In order to write quality code with good readability, I'm adopting a currying approach and writing pure helper functions for most of the repetitive code snippets. I just noticed that I have existence/type checks everywhere in my project to avoid possible TypeErrors such as reading a property of undefined.
The checks are like:
if (param) {
  action...
}
I'm thinking of creating a global helper function that takes two parameters: the param that needs to be checked and the action function to run in case the check passes. Something like:
function isExist(param, action) {
  if (param) {
    action();
  }
}
This function does not work ideally for all snippets/cases. How can I make it efficient and usable everywhere? Also, is this the right approach? If not, what is the best approach I should follow to achieve my aim here?
Example:
if (userInput) {
  saveToDB(userInput);
}

if (valueFromDB) {
  performSomeAction();
}

if (username && password) {
  validate(username, password);
}
I want all of these checks at different points in my code to be replaced by a single helper function, somewhat like:
isExist( userInput, saveToDB(userInput) );
isExist( valueFromDB, performSomeAction );
isExist( (username && password), validate(username, password) );
In this way we'd replace these 9 lines of code with just three. This is what I want to achieve.
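One detail worth flagging about the desired call sites above: written as isExist(userInput, saveToDB(userInput)), the second argument is evaluated before isExist ever runs, so the action is not actually deferred. To get the intended behavior, the action would need to be wrapped, for example:

isExist(userInput, () => saveToDB(userInput));
isExist(valueFromDB, performSomeAction);
isExist(username && password, () => validate(username, password));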
Well, if you try to think of a good name for
function isExist(param, action) {
  if (param) {
    action();
  }
}
Then I think one good candidate would be conditionalExecute(condition, codeToExecute). Does this kind of work sound familiar? Are you sure you're not just reinventing the if-statement itself?
Maybe I'm missing your point, but I can't personally see the benefit of encapsulating the logic of the if-statement more than it already is.
Edit: It should be noted that within the context of Javascript the code
if (someVariable) {
  // do something
}
already reads like "If someVariable is truthy (which undefined is not) then....
But sure, if you only want to check for existence (a variable not being undefined), I won't argue against you if you say it's preferable to have a named function that makes that clear.
In that case I think it's clearer to only encapsulate the actual existence check (or whatever you want to check), not the conditional nature (because for that we already have the if-statement). So something like
function exists(x) {
  return x !== undefined; // or something like that
}

function isNotNull(x) {
  // TODO:
}
Then your code would become more explicit and readable, and you could combine the functions if you wanted
function neitherUndefinedNorNull(x) {
  return exists(x) && isNotNull(x);
}

if (neitherUndefinedNorNull(x)) {
  // your "regular" code here
}
If the code inside of the if-statement is repeated, then extract that as a function as well.
function myRepeatedCode() {
  // do stuff
}

function someAlternativeCondition(x) {
  // test
}

if (neitherUndefinedNorNull(x)) {
  myRepeatedCode();
} else if (someAlternativeCondition(x)) {
  myRepeatedCode();
}

// OR combine them in the same if-statement
if (neitherUndefinedNorNull(x) || someAlternativeCondition(x)) {
  myRepeatedCode();
}
Last edit: If you're chasing characters you could even write
// because of short-circuiting, myFunc1 and myFunc2 will only
// execute if myCond1 resp myCond2 is true (or truthy).
myCond1(x) && myFunc1(x)
myCond2(y) && myFunc2(y)
This is the perfect place to use Maybe:
const enumerable = true;
// data Maybe a = Nothing | Just a
const Maybe = {};
const Nothing = Object.create(Maybe);
const Just = value => Object.create(Maybe, {value: {enumerable, value}});
// instance Functor Maybe where
Nothing.map = _ => Nothing;
Maybe.map = function (fun) { return Just(fun(this.value)); };
// instance Applicative Maybe where
Maybe.of = Just;
Nothing.ap = _ => Nothing;
Maybe.ap = function (maybe) { return maybe.map(this.value); };
// instance Monad Maybe where
Nothing.chain = _ => Nothing;
Maybe.chain = function (kleisli) { return kleisli(this.value); };
Maybe follows the Fantasy Land Specification[1]. Using Maybe allows you to write code like this:
// userInput :: Maybe Data
// saveToDB :: Data -> Something
userInput.map(saveToDB); // :: Maybe Something
// valueFromDB :: Maybe Data
// performSomeAction :: Data -> Maybe Something
valueFromDB.chain(performSomeAction); // :: Maybe Something
// username :: Maybe String
// password :: Maybe Password
// validate :: String -> Password -> Something
Maybe.of(validate).ap(username).ap(password); // :: Maybe Something
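To connect this back to the original if checks, you also need a way to lift a raw, possibly-missing value into Maybe. A small helper along these lines (not part of the snippet above) would do:

// Hypothetical helper: wrap a possibly null/undefined value in Maybe.
const fromNullable = value => (value == null ? Nothing : Just(value));

// e.g. instead of `if (userInput) { saveToDB(userInput); }`
fromNullable(userInput).map(saveToDB); // :: Maybe Something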
Anyway, if you're really interested in functional programming then I suggest that you Learn You A Haskell.
[1] I don't agree with the Fantasy Land Specification on flipping the arguments of ap.
How about this? It can pass the parameters along at the same time.
function test(a, b, c) {
  console.log("%s,%s,%s", a, b, c);
}

function check_and_run(param, action) {
  var args = Array.prototype.slice.call(arguments); // turn arguments into an array
  args.shift(); // remove param and action
  args.shift();
  if (param)
    action.apply(this, args);
}

check_and_run(1, test, 1, 2, 3); // this will invoke test(1,2,3)
check_and_run(0, test, 1, 2, 3); // this will do nothing
Perhaps something like this:
function conFun(fnCondition, fnCall, defaultResult = undefined) {
  return (...rest) => {
    if (fnCondition(...rest)) {
      return fnCall(...rest);
    }
    return defaultResult;
  };
}

const add = conFun(
  (...rest) => rest.every(n => typeof n === 'number'),
  (...rest) => rest.reduce((a, n) => a + n),
  NaN);

add("1", "2"); //=> NaN
add(1, 2); //=> 3
So in your question you might be after the first argument not being undefined:
const firstDefined = (v) => typeof v !== 'undefined';
const cSomeFun = conFun(firstDefined, someFun, "");
cSomeFun(); // ==> ""
cSomeFun("test"); // ==> whatever someFun("test") returns
If you are just looking to call something based on non-undefined arguments, you can simply define it like this:
function callDefined(fn, ...rest) {
  if (rest.every(firstDefined)) {
    return fn(...rest);
  }
  return undefined;
}
callDefined(saveToDB.bind(this, userInput), userInput);
callDefined(performSomeAction, valueFromDB);
callDefined(validate.bind(this, username, password), username, password);

Best way to export Express route methods for promise chains?

I have an API route that is being refactored to use ES6 promises to avoid callback hell.
After successfully converting to a promise chain, I wanted to export my .then() functions to a separate file for cleanliness and clarity.
The route file:
The functions file:
This works fine. However, what I'd like to do is move the functions declared in the Class constructor() function into independent methods, which can reference the values instantiated by the constructor. That way it all reads nicer.
But when I do, I run into scoping problems - this is not defined, etc. What is the correct way to do this? Is an ES6 class appropriate to use here, or should I use some other structure?
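For context, the scoping problem being described looks roughly like this (a minimal sketch with made-up names, not the actual route code):

class SubmitRouteFunctions {
  constructor(req, res) {
    this.newId = 'abc123';
  }
  resetRedundantID(result) {
    // `this` is undefined here when the method is passed as a bare callback,
    // because .then(fn.resetRedundantID) detaches it from the instance.
    return this.newId;
  }
}

const fn = new SubmitRouteFunctions({}, {});
Promise.resolve(null)
  .then(fn.resetRedundantID) // TypeError: Cannot read properties of undefined
  .catch(err => console.log(err.message));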
RAW CODE:
route...
.post((req, res) => {
  let SubmitRouteFunctions = require('./functions/submitFunctions.js');
  let fn = new SubmitRouteFunctions(req, res);

  // *******************************************
  // ***** THIS IS WHERE THE MAGIC HAPPENS *****
  // *******************************************
  Promise.all([fn.redundancyCheck, fn.getLocationInfo])
    .then(fn.resetRedundantID)
    .then(fn.constructSurveyResult)
    .then(fn.storeResultInDB)
    .then(fn.redirectToUniqueURL)
    .catch((err) => {
      console.log(err);
      res.send("ERROR SUBMITTING YOUR RESULT: ", err);
    });
})
exported functions...
module.exports = class SubmitRouteFunctions {
  constructor (req, res) {
    this.res = res;
    this.initialData = {
      answers : req.body.responses,
      coreFit : req.body.coreFit,
      secondFit : req.body.secondFit,
      modules : req.body.modules,
    };
    this.newId = shortid.generate();
    this.visitor = ua('UA-83723251-1', this.newId, {strictCidFormat: false}).debug();
    this.clientIp = requestIp.getClientIp(req);

    this.redundancyCheck = mongoose.model('Result').findOne({quizId: this.newId});
    this.getLocationInfo = request.get('http://freegeoip.net/json/' + this.clientIp).catch((err) => err);

    this.resetRedundantID = ([mongooseResult, clientLocationPromise]) => {
      console.log(mongooseResult);
      if (mongooseResult != null) {
        console.log('REDUNDANT ID FOUND - GENERATING NEW ONE');
        this.newId = shortid.generate();
        this.visitor = ua('UA-83723251-1', this.newId, {strictCidFormat: false});
        console.log('NEW ID: ', this.newId);
      }
      return clientLocationPromise.data;
    };

    this.constructSurveyResult = (clientLocation) => {
      let additionalData = {quizId: this.newId, location: clientLocation};
      return Object.assign({}, this.initialData, additionalData);
    };

    this.storeResultInDB = (newResult) => mongoose.model('Result').create(newResult).then((result) => result).catch((err) => err);

    this.redirectToUniqueURL = (mongooseResult) => {
      let parsedId = '?' + queryString.stringify({id: mongooseResult.quizId});
      let customUrl = 'http://explore-your-fit.herokuapp.com/results' + parsedId;
      this.res.send('/results' + parsedId);
    };
  }
};
ALTERNATIVE #1:
Rather than using ES6 classes, an alternate way to perform the same behavior that cleans up the code just a little bit is to export an anonymous function as described by Nick Panov here: In Node.js, how do I "include" functions from my other files?
FUNCTIONS FILE:
module.exports = function (req, res) {
  this.initialData = {
    answers : req.body.responses,
    coreFit : req.body.coreFit,
    secondFit : req.body.secondFit,
    modules : req.body.modules,
  };
  this.newId = shortid.generate();
  this.visitor = ua('UA-83723251-1', this.newId, {strictCidFormat: false}).debug();
  this.clientIp = requestIp.getClientIp(req);

  this.redundancyCheck = mongoose.model('Result').findOne({quizId: this.newId});
  this.getLocationInfo = request.get('http://freegeoip.net/json/' + this.clientIp).catch((err) => err);

  this.resetRedundantID = ([mongooseResult, clientLocationPromise]) => {
    if (mongooseResult != null) {
      console.log('REDUNDANT ID FOUND - GENERATING NEW ONE');
      this.newId = shortid.generate();
      this.visitor = ua('UA-83723251-1', this.newId, {strictCidFormat: false});
      console.log('NEW ID: ', this.newId);
    }
    return clientLocationPromise.data;
  };

  this.constructSurveyResult = (clientLocation) => {
    let additionalData = {quizId: this.newId, location: clientLocation};
    return Object.assign({}, this.initialData, additionalData);
  };

  this.storeResultInDB = (newResult) => mongoose.model('Result').create(newResult).then((result) => result).catch((err) => err);

  this.redirectToUniqueURL = (mongooseResult) => {
    let parsedId = '?' + queryString.stringify({id: mongooseResult.quizId});
    let customUrl = 'http://explore-your-fit.herokuapp.com/results' + parsedId;
    res.send('/results' + parsedId);
  };
};
Although this does not avoid having to prefix each method with this.someFn(), as I originally wanted, it does simplify the routing file a bit: doing things this way prevents me from having to assign a specific namespace to the methods.
ROUTES FILE
.post((req, res) => {
  require('./functions/submitFunctions_2.js')(req, res);

  Promise.all([redundancyCheck, getLocationInfo])
    .then(resetRedundantID)
    .then(constructSurveyResult)
    .then(storeResultInDB)
    .then(redirectToUniqueURL)
    .catch((err) => {
      console.log(err);
      res.send("ERROR SUBMITTING YOUR RESULT: ", err);
    });
})
The functions are re-created to reflect the new req and res objects each time a POST request hits the route, and the this keyword is apparently bound to the POST route callback in each of the imported methods.
IMPORTANT NOTE: You cannot export an arrow function using this method. The exported function must be a traditional, anonymous function. Here's why, per Udo G's comment on the same thread:
It is worth noting that this works because this in a function is the global scope when the function is called directly (not bound in any way).
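In other words, a small sketch of that comment's point (not the original code):

// Called as require('./file')(req, res), a traditional function in sloppy mode
// gets the global object as `this`, so these assignments become globals
// that the route callback can reference directly afterwards:
module.exports = function (req, res) {
  this.redundancyCheck = () => { /* ... */ };
};

// An arrow function would instead capture the module's own `this`
// (module.exports), so nothing global would be created:
// module.exports = (req, res) => { this.redundancyCheck = () => { /* ... */ }; };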
ALTERNATIVE #2:
Another option, courtesy of Bergi from: How to use arrow functions (public class fields) as class methods?
What I am looking for, really, is an experimental feature....
There is a proposal which might allow you to omit the constructor() and directly put the assignment in the class scope with the same functionality, but I wouldn't recommend using that as it's highly experimental.
However, there is still a way to separate the methods:
Alternatively, you can always use .bind, which allows you to declare the method on the prototype and then bind it to the instance in the constructor. This approach has greater flexibility as it allows modifying the method from the outside of your class.
Based on Bergi's example:
module.exports = class SomeClass {
  constructor() {
    this.someMethod = this.someMethod.bind(this);
    this.someOtherMethod = this.someOtherMethod.bind(this);
    …
  }

  someMethod(val) {
    // Do something with val
  }

  someOtherMethod(val2) {
    // Do something with val2
  }
}
Obviously, this is more in-line with what I was originally looking for, as it enhances the overall readability of the exported code. BUT doing so will require that you assign a namespace to the new class in your routes file like I did originally:
let SubmitRouteFunctions = require('./functions/submitFunctions.js');
let fn = new SubmitRouteFunctions(req, res);
Promise.all([fn.redundancyCheck, fn.getLocationInfo])
.then(...)
PROPOSED / EXPERIMENTAL FEATURE:
This is not really my wheelhouse, but per Bergi, there is currently a Stage-2 proposal (https://github.com/tc39/proposal-class-public-fields) that is attempting to get "class instance fields" added to the next ES spec.
"Class instance fields" describe properties intended to exist on
instances of a class (and may optionally include initializer
expressions for said properties)
As I understand it, this would solve the issue described here entirely by allowing methods attached to class objects to reference the current instance. The this issues would disappear, and methods could optionally be bound automatically.
My (limited) understanding is that the arrow function would be used to accomplish this, like so:
class SomeClass {
  constructor() {...}

  someMethod = (val) => {
    // Do something with val
    // Where 'this' is bound to the current instance of SomeClass
  }
}
Apparently this can be done now using a Babel compiler, but is obviously experimental and risky. Plus, in this case we're trying to do this in Node / Express which makes that almost a moot point :)
