SailsJS custom config files

I am aware I can create a custom file inside the config directory and reference the variables from within that
module.exports.myconfig = {
  foo: 'bar'
};

sails.config.myconfig.foo
But I need to write to these variables too and have them saved. In previous projects I have done this with JSON config files and used PHP to write to them.
Is there any way of doing this with Sails or should I just create some JSON files to pull and push my config vars?

There's no mechanism built in to Sails for persisting configuration variables. However, in the latest build of Sails there is a lower event you can listen for which indicates that Sails is exiting. You could catch this and persist your data then. For example, in your /config/bootstrap.js, something like:
var fs = require('fs');

module.exports = function(cb) {
  sails.on('lower', function persistConfig() {
    fs.writeFileSync(sails.appPath + '/config/myConfig.js',
      'module.exports = ' + JSON.stringify(sails.config.myconfig));
  });
  // ... other bootstrap stuff ...
  return cb();
};
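One detail worth noting, as a sketch rather than a confirmed part of the answer above: Sails merges each file in /config into sails.config, so for the persisted values to land back under sails.config.myconfig on the next lift, the generated file probably needs to export the same myconfig key as the original config file, e.g.:

// a sketch: write the values back under the key the original config file used,
// assuming sails.config.myconfig only holds JSON-serializable values
fs.writeFileSync(sails.appPath + '/config/myConfig.js',
  'module.exports.myconfig = ' + JSON.stringify(sails.config.myconfig) + ';');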

Related

Application modularity with Vue.js and local NPM packages

I'm trying to build a modular application in Vue via the vue-cli-service. The main app and the modules are separate projects living in different folders; the structure is something like this:
-- app/package.json
       /src/**
-- module1/package.json
       /src/**
-- module2/package.json
       /src/**
The idea is to have the Vue app completely agnostic about the application modules that can be there at runtime; the modules themselves are compiled with vue-cli-service build --target lib into a local moduleX/dist folder, which is referenced by the package.json "main" and "files" fields.
My first idea (for now, just for development speed) was to add the modules as local NPM packages to the app, building them with a watcher and serving the app with a watcher as well, so that any change to the dependent modules would (I think) be picked up automatically by the main app.
So the package.json of the app contains dependencies like:
...
"module1": "file:../module1",
"module2": "file:../module2",
...
These dependencies are meant to be removed at any time, or in general composed as we need; the app should just be recompiled and everything should work.
I'm trying to understand now how to dynamically load and activate the modules in the application, as I cannot use the dynamic import like this:
import(/* webpackMode: "eager" */ `module1`).then(src => {
  src.default.boot();
  resolve();
});
because I don't know the module names ('module1', 'module2', etc.) in advance.
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Is there a way to accomplish this?
Juggling with package.json doesn't sound like a good idea to me - doesn't scale. What I would do:
Keep all available "modules" in package.json
Create a separate js file (or a dedicated property inside package.json) with all available configurations (for different clients, for example):
module.exports = {
  'default': ['module1', 'module2', 'module3'],
  'clientA': ['module1', 'module2', 'module4'],
  'clientB': ['module2', 'module3', 'module4']
}
Tap into the Vue CLI build process (the best example I found is here) and create a js file which runs before each build (or "serve") and, using a simple template (lodash, for example), generates a new js file which boots the configured modules based on the value of some ENV variable. See the following (pseudo)code, and the wiring sketch at the end of this answer (remember this runs inside node during the build):
const fs = require('fs')
const _ = require('lodash')

const modulesConfig = require(`your module config js`)
const configurationName = process.env.MY_APP_CONFIGURATION ?? 'default'
const modules = modulesConfig[configurationName]

// read the template as a string and compile it with lodash
const template = fs.readFileSync('name of template file', 'utf8')
const templateCompiled = _.template(template)
const generatedJS = templateCompiled({ modules: modules })

fs.writeFileSync('bootModules.js', generatedJS)
Write your template for bootModules.js. The simplest would be something like:
<% _.forEach(modules, function(module) { %>import * as <%= module %> from '<%= module %>';<% }); %>
import bootModules.js into your app
Use the MY_APP_CONFIGURATION ENV variable to switch the desired module configuration - this works not just during development, but you can also set up different CI processes targeting the same repo with different MY_APP_CONFIGURATION values
This way you have all configurations in one place, you don't need to change package.json before every build, you have a simple mechanism to switch between module configurations, and every build (bundle) contains only the modules needed.
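As a wiring sketch for the "runs before each build" step (the script path scripts/generateBootModules.js is an assumption, not something from the answer above): since vue.config.js is loaded by vue-cli-service before webpack runs, requiring the generator at its top executes it for every build and serve:

// vue.config.js - a sketch; generateBootModules.js would contain the (pseudo)code above
require('./scripts/generateBootModules');

module.exports = {
  // normal Vue CLI options go here
};

MY_APP_CONFIGURATION can then be provided inline (MY_APP_CONFIGURATION=clientA npm run build) or by the CI environment.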
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Why not?
More than this, with JS/TS you are not restricted to using classes implementing a specific interface: you just need to define the interface (i.e. the module.exports) of your modules and respect it in the library entries (vue build lib).
EDIT: after reading the comments, I think I now understand the request.
Each module should respect the following interface (in the file that is the entry of the Vue library):
export function isMyAppModule() {
  return true;
}

export function myAppInit() {
  return { /* what you need to export */ };
}
Then in your app:
require("./package.json").dependencies.forEach(name => {
const module = require(name);
if(! module.isMyAppModule || module.isMyAppModule() !== true) return;
const { /* the refs you need */ } = module.myAppInit();
// use your refs as you need
});

Requiring files in electron without babel

I'm trying to convert a web application into an Electron app. I have multiple functions in different files that I've imported into my main.js using a transpiler.
However, whenever I try to do that in my Electron app, I run into an issue with a module I'm using to move away from PHP for accessing my database. Instead I'm using the mysql module from npm.
I want to save this function in its own file, and then require it in main.js. When I try to transpile it with babel, I get an error about Net.Connection not working (or something along those lines). As I understand it, this is because of how Node works. I'm happy to work around this, but I'm hoping there's a way to save this function in another file, and import it without having to use babel.
function loadColourFilter(){
  var mysql = require('mysql');
  let query_result;

  var connection = mysql.createConnection({
    host     : 'xxxxxxxxxxxx',
    user     : 'xxxxxxxxxxxx',
    password : 'xxxxxxxxxxxx',
    database : 'xxxxxxxxxxxx'
  });

  connection.connect();

  let query = "xxxxxxxxxxxxxxxx";
  connection.query(query, function (error, results, fields) {
  });
  connection.end();

  return (query_result);
}
EDIT: I've removed some parts of the function to keep credentials safe and whatnot. I'm fairly certain their absence won't change anything when trying to solve this.
EDIT:
My project directory is essentially
src
--- js
--- --- main.js
--- functionFile.js // This would be where my loadColourFilter function above would be saved
--- node_modules
--- --- ...
--- index.html // js/main.js is referenced in a script tag here.
--- main.js // Where the electron window is created.
--- package.json
There should be 2 js contexts, one running in the electron app and one running in node. You won't be able to require your scripts directly from your directory if you are in the electron context (which is like a browser js context).
I'm just assuming this is the case since we don't get a lot of information for your problem, and the other answer should have resolved your problem.
Try to include your js file in your index.html and see what's up.
Edit: Since it's a Transpiling error with babel, babel is probably transpiling for node when it should transpile for the browser.
You can easily make a simple local module using NodeJS by creating a source file and then adding a module.exports assignment to export some functionality/variables/etc from the file. In your case something like a file named colourFilter.js with the contents:
function load(){
  var mysql = require('mysql');
  let query_result;

  var connection = mysql.createConnection({
    host     : 'xxxxxxxxxxxx',
    user     : 'xxxxxxxxxxxx',
    password : 'xxxxxxxxxxxx',
    database : 'xxxxxxxxxxxx'
  });

  connection.connect();

  let query = "xxxxxxxxxxxxxxxx";
  connection.query(query, function (error, results, fields) {
  });
  connection.end();

  return (query_result);
}
module.exports = load
And then in the code where you'd like to use it, include it by doing something like:
const loadColourFilter = require('./colourFilter.js')
And use the function like:
let result = loadColourFilter()
This is a simple way to split up your code into multiple files/classes/modules but still keep one main file/class/module as the important one which is the public-facing portion or entry point. And of course you don't have to use the names I've used above :P
If you would like to make an object-style module you can instead export an object like:
module.exports = {
  load
}
Or:
module.exports = {
  load: loadFunctionNameInThisFile
}
And then use it like:
const colourFilter = require('./colourFilter.js')
let result = colourFilter.load()
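One caveat worth illustrating (an observation about the code above, not part of the original answer): connection.query is asynchronous, so query_result is still undefined when load() returns. A callback-based variant of the same colourFilter.js sketch, keeping the placeholder credentials and query, could look like:

// colourFilter.js - callback-based sketch
var mysql = require('mysql');

function load(callback) {
  var connection = mysql.createConnection({
    host     : 'xxxxxxxxxxxx',
    user     : 'xxxxxxxxxxxx',
    password : 'xxxxxxxxxxxx',
    database : 'xxxxxxxxxxxx'
  });

  connection.connect();
  connection.query("xxxxxxxxxxxxxxxx", function (error, results, fields) {
    connection.end();
    callback(error, results); // hand the rows back once the query finishes
  });
}

module.exports = load;

and be consumed as:

const loadColourFilter = require('./colourFilter.js');
loadColourFilter(function (err, rows) {
  // use rows here
});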

How to add a plugin and use an external module/file at runtime (RT)

I have a node.js application/module which is working OK with a plug-in concept, i.e.
my module acts like a proxy with additional capabilities, such as adding new functionality to the out-of-the-box functionality (methods).
To do this you need to do the following:
clone my application
create new folder which is called extenders(inside my app)
In this folder you should provide two files
extend.js with your logic as functions/methods
extend.json which define your API (to know which file to invoke)
Note: the JS & JSON file name must be identical
For example, let's assume that this is your extend.json file:
{
  "extenders": [
    {
      "path": "run",
      "fn": "runFn"
    }
  ]
}
In this case, when the user puts the following link in the browser:
localhost:3000/run
I'm invoking the runFn function (which exists in the extend.js file) with its logic, and this is working as expected (under the hood I read the json & js files and invoke the function like extender[fnName](req, res)).
Now I want to support the use case of adding an external extender via code,
for example that the user will do something like:
var myModule = require('myModule');
myModule.extend('./pathTo/newExternalPathforExtendersFolder');
so when my module runs, it searches whether new external extenders exist with all their configuration and, if so, refers to them at runtime (to the js & json files).
My questions are:
1. I need to find out, when my module is starting, who has registered with my module and then apply my logic to that module. How can this be done in Node?
2. If there is another solution in Node, please let me know.
You could implement an initialization function in your API to give freedom to module users. For example:
var yourModule = require('yourModule').init({
  extenders: [
    {
      "path": "run",
      "fn": "runFn"
    }
  ]
});

yourModule.listen(3000);
Or, as MattW wrote, you can implement it as an Express middleware, so module users could use it with their own server. For example:
var yourModule = require('yourModule').init({
  extenders: [
    {
      "path": "run",
      "fn": "runFn"
    }
  ]
});

app = require('express')();
app.use(yourModule.getMiddleware());
Look at webpack-dev-server and webpack-dev-middleware as another example. Hopefully there's some similarity with your task. Webpack also deals with fs and configs; they've just split the middleware and the standalone server into separate modules. And there's no "working code" here, because a wrapper would depend on your module's code and on how yourModule is implemented. Just some thoughts.
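To make the middleware idea a bit more concrete, here is a rough sketch of what yourModule's init/getMiddleware pair could look like internally; only the option shape and function names come from the examples above, and loading extend.js from an extenders folder is an assumption based on the question:

// inside yourModule - a sketch, not the actual implementation
var path = require('path');
var routes = {};

module.exports.init = function (options) {
  // extend.js holds the functions named by each "fn" entry
  var extendJs = require(path.join(__dirname, 'extenders', 'extend'));
  (options.extenders || []).forEach(function (ext) {
    routes[ext.path] = extendJs[ext.fn];
  });
  return module.exports;
};

module.exports.getMiddleware = function () {
  return function (req, res, next) {
    var fn = routes[req.path.replace(/^\//, '')]; // '/run' -> 'run'
    if (!fn) return next();
    return fn(req, res);
  };
};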
If I'm not misunderstanding your problem, maybe this approach can help you.
I think you could list your extenders in an ACL-like JSON which includes not only the path or the fn name, but also the path to the js file or any other property you need, such as whether it's active or its security parameters:
extenders: [
  {
    "path": "run",
    "fn": "runFn",
    "file": "file_path",
    "api-key": true,
    "active": true
  }
]
Then you can preload your modules by reading the ACL JSON and keep them cached, ready to extend.
var Router = {
  extenders: {},

  init: function () {
    this.extenders = {};
    this.loadExtenderFiles();
  },

  loadExtenderFiles: function () {
    var key, extender;
    // Iterate extender files
    for (key in ACL_JSON) {
      // extender load logic
      extender = ACL_JSON[key];
      if (extender.active) {
        this.extenders[extender.fn] = require(extender.file);
      }
    }
  },

  // this fn should allow you to Router.extend() anywhere during the request process
  extend: function (fn, request, response) {
    // Parse request/response to match your extender module pattern
    // extender process logic
    this.extenders[fn](request, response);
  }
};

module.exports = Router;
So Router.init() should do the cache work on server init;
Router.extend() should resolve your api request or extend one being processed.
Hope it helps you!
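For the runtime-registration part of the question (myModule.extend('./pathTo/newExternalPathforExtendersFolder')), the same cache could also accept an external folder that follows the extend.js / extend.json convention. A rough sketch of an extra method on the Router above (the folder layout is an assumption taken from the question):

// a sketch: register an external extenders folder at runtime
var path = require('path');

Router.registerExternal = function (folder) {
  // the folder is expected to contain extend.json (the API) and extend.js (the logic)
  var config = require(path.resolve(folder, 'extend.json'));
  var impl = require(path.resolve(folder, 'extend.js'));
  config.extenders.forEach(function (ext) {
    if (ext.active === false) return;
    this.extenders[ext.fn] = impl[ext.fn];
  }, this);
};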
I believe a simple router should satisfy your requirements:
var userModule = require('userModule');

router.use('/run', function (req, res, next) {
  return next(userModule(req));
}).all(yourReverseProxy);

Node attempting to read file before synchronous write finishes

I am trying to make my program attempt to read a config file and, if the config file doesn't exist, generate a new config file from config.example.js and then require the newly generated file. However, I am running into an issue - even when using fs.writeFileSync(), it appears that Node is running config = require('./config.js'); before the "synchronous" write finishes, as it crashes with Cannot find module './config.js'.
Here is the code in question:
var config;

// Create new config file if one not found
try {
  config = require('./config.js');
} catch (e) {
  fs.writeFileSync('./config.js', fs.readFileSync('./config.js.example'));
  console.log("New config file config.js created");
  config = require('./config.js'); // Line it crashes on
}
This is because of the way caching works with require: because the first require failed, so will the second one until the event loop clears. Try it this way instead:
if (fs.existsSync('./config.js')) {
  config = require('./config');
} else {
  fs.writeFileSync('./config.js', fs.readFileSync('./config.js.example'));
  config = require('./config');
}
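One more thing worth noting about either version, as a sketch rather than part of the answer above: the fs paths are resolved against the process working directory, while require('./config') is resolved against the module's own directory, so anchoring both to __dirname avoids surprises when the app is started from another folder:

var fs = require('fs');
var path = require('path');

// resolve everything relative to this script's directory
var configPath = path.join(__dirname, 'config.js');
if (!fs.existsSync(configPath)) {
  fs.writeFileSync(configPath, fs.readFileSync(path.join(__dirname, 'config.js.example')));
}
var config = require(configPath);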

Switching Development/Test/Production variables in Javascript

I am trying to find the best approach to manage different values for the same variables in Development, Test and Production environments.
For example, I have a variable jsonFile which can be:
var jsonFile = 'http://localhost:63342/json/appsconfig.json'; // development env
var jsonFile = 'http://192.168.35.59/applications/json/appsconfig.json'; // test env
var jsonFile = 'http://example.com/applications/json/appsconfig.json'; // production env
I am trying to read a lot about the frontend development stack, but I am confused about which tool to use. I will use Google Closure Tools for minification; can it also be used to switch variable values? Or can this be handled as a Grunt task (even if I am not yet able to understand how to properly configure Grunt tasks...)?
What might be better is to write the JSON into a JS file that is part of your build artifacts. Something like file-creator that can write a file like so (using a simplistic setup that can obviously be made more dynamic).
At the top of your module.exports for grunt tasks, load the config file into a var like:
var configData = grunt.file.readJSON('../config/appsconfig.json');
Then write to a new JS file using the grunt file-creator module
"file-creator": {
'dev': {
'build/config.js': function (fs, fd, done) {
fs.writeSync(fd,
'var yourSiteHere = yourSiteHere || {}; yourSiteHere.config = '
+ JSON.stringify(configData) + ";"
);
done();
}
}
}
Then load this JS file into the page (perhaps even minifying it with a separate grunt task). You will then be able to refer to the config data like so:
var apiEndPoint = yourSiteHere.config.api.apiEndPoint,
apiKey = yourSiteHere.config.api.apiKey;
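If the goal is one build per environment, the same readJSON line can also pick a different file per target, for example keyed off a grunt option (the file naming scheme here is an assumption):

// at the top of the Gruntfile - a sketch
var env = grunt.option('env') || 'development';
var configData = grunt.file.readJSON('../config/appsconfig.' + env + '.json');

Running grunt --env=production (or test, etc.) would then bake the corresponding values into build/config.js.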
