I am trying to find the best approach to manage different values for the same variables in Development, Test and Production environments.
For example, I have a variable jsonFile which can be:
var jsonFile = "http://localhost:63342/json/appsconfig.json";
for the development environment,
var jsonFile = "http://192.168.35.59/applications/json/appsconfig.json";
for the test environment, and
var jsonFile = "http://example.com/applications/json/appsconfig.json";
for the production environment.
I have been reading a lot about frontend development stacks, but I am confused about which tool to use. I will use Google Closure Tools for minification; can it also be used to switch variable values? Or should this be handled as a Grunt task (even though I am not yet able to understand how to properly configure Grunt tasks...)?
What might be better is to write the JSON into a JS file that is part of your build artifacts, using something like file-creator, which can write a file like so (a simplistic setup that can obviously be made more dynamic).
At the top of your module.exports for Grunt tasks, load the config file into a var like:
var configData = grunt.file.readJSON('../config/appsconfig.json');
Then write to a new JS file using the grunt file-creator module
"file-creator": {
'dev': {
'build/config.js': function (fs, fd, done) {
fs.writeSync(fd,
'var yourSiteHere = yourSiteHere || {}; yourSiteHere.config = '
+ JSON.stringify(configData) + ";"
);
done();
}
}
}
Then load this JS file into the page (perhaps even minify it using a separate Grunt task). You will then be able to refer to the config data like so:
var apiEndPoint = yourSiteHere.config.api.apiEndPoint,
apiKey = yourSiteHere.config.api.apiKey;
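To tie this back to the development/test/production question, one option (a sketch, not the only way; the per-environment file names, the --env flag and the grunt-file-creator plugin name are assumptions on my part) is to keep one JSON file per environment and choose which one to read with a Grunt option:
// Gruntfile.js (sketch) -- assumes config/appsconfig.dev.json,
// config/appsconfig.test.json and config/appsconfig.prod.json exist
module.exports = function (grunt) {
    // e.g. `grunt build --env=prod`; defaults to dev
    var env = grunt.option('env') || 'dev';
    var configData = grunt.file.readJSON('config/appsconfig.' + env + '.json');

    grunt.initConfig({
        'file-creator': {
            all: {
                'build/config.js': function (fs, fd, done) {
                    fs.writeSync(fd,
                        'var yourSiteHere = yourSiteHere || {}; yourSiteHere.config = '
                        + JSON.stringify(configData) + ';'
                    );
                    done();
                }
            }
        }
    });

    grunt.loadNpmTasks('grunt-file-creator');
    grunt.registerTask('build', ['file-creator']);
};
The rest of the build then works exactly as described above, with build/config.js always containing the values for whichever environment you targeted.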
Related
I'm trying to convert a web application into an Electron app. I have multiple functions in different files that I've imported into my main.js using a transpiler.
However, whenever I try to do that in my Electron app, I run into an issue with a module I'm using to move away from using PHP to access my database. Instead I'm using the mysql module on npm.
I want to save this function in its own file and then require it in main.js. When I try to transpile it with Babel, I get an error about Net.Connection not working (or something along those lines). As I understand it, this is because of how Node works. I'm happy to work around this, but I'm hoping there's a way to save this function in another file and import it without having to use Babel.
function loadColourFilter(){
var mysql = require('mysql');
let query_result;
var connection = mysql.createConnection({
host : 'xxxxxxxxxxxx',
user : 'xxxxxxxxxxxx',
password : 'xxxxxxxxxxxx',
database : 'xxxxxxxxxxxx'
});
connection.connect();
let query = "xxxxxxxxxxxxxxxx";
connection.query(query, function (error, results, fields) {
});
connection.end();
return (query_result);
}
EDIT: I've removed some parts of the function to keep credentials safe and whatnot. I'm fairly certain their absence won't change anything when trying to solve this.
EDIT:
My project directory is essentially
src
--- js
--- --- main.js
--- functionFile.js // This would be where my loadColourFilter function above would be saved
--- node_modules
--- --- ...
--- index.html // js/main.js is referenced in a script tag here.
--- main.js // Where the electron window is created.
--- package.json
There should be two JS contexts: one running in the Electron app (which is like a browser JS context) and one running in Node. You won't be able to require your scripts directly from your directory if you are in the Electron context.
I'm only assuming this is the case, since we don't get a lot of information about your problem, and the other answer should have resolved it.
Try including your JS file in your index.html and see what happens.
Edit: since it's a transpiling error with Babel, Babel is probably transpiling for Node when it should be transpiling for the browser.
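If that is indeed the problem, a minimal sketch of pointing Babel at a browser-like target could look like this (assuming @babel/preset-env is installed; the target value is just an example):
// babel.config.js (sketch)
module.exports = {
    presets: [
        ['@babel/preset-env', {
            // transpile for the renderer (browser-like) context instead of Node
            targets: { electron: '2.0' }
        }]
    ]
};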
You can easily make a simple local module using NodeJS by creating a source file and then adding a module.exports assignment to export some functionality/variables/etc from the file. In your case something like a file named colourFilter.js with the contents:
function load(){
var mysql = require('mysql');
let query_result;
var connection = mysql.createConnection({
host : 'xxxxxxxxxxxx',
user : 'xxxxxxxxxxxx',
password : 'xxxxxxxxxxxx',
database : 'xxxxxxxxxxxx'
});
connection.connect();
let query = "xxxxxxxxxxxxxxxx";
connection.query(query, function (error, results, fields) {
});
connection.end();
return (query_result);
}
module.exports = load
And then in the code where you'd like to use it, include it by doing something like:
const loadColourFilter = require('./colourFilter.js');
And use the function like
let result = loadColourFilter()
This is a simple way to split up your code into multiple files/classes/modules but still keep one main file/class/module as the important one which is the public-facing portion or entry point. And of course you don't have to use the names I've used above :P
If you would like to make an object-style module you can instead export an object like
module.exports = {
load
}
Or
module.exports = {
load: loadFunctionNameInThisFile
}
And then use it like
const colourFilter = require('./colourFilter.js')
let result = colourFilter.load()
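One caveat worth noting, separate from the module question: query_result is returned before the asynchronous query callback has run, so load() as written returns undefined. A callback-taking variant might look like this sketch (keeping the placeholder credentials and query):
// colourFilter.js -- callback-style sketch; credentials and query are placeholders
var mysql = require('mysql');

function load(callback) {
    var connection = mysql.createConnection({
        host: 'xxxxxxxxxxxx',
        user: 'xxxxxxxxxxxx',
        password: 'xxxxxxxxxxxx',
        database: 'xxxxxxxxxxxx'
    });

    connection.connect();
    connection.query('xxxxxxxxxxxxxxxx', function (error, results, fields) {
        // results only exist inside this callback
        connection.end();
        callback(error, results);
    });
}

module.exports = load;
The caller would then use colourFilter.load(function (err, rows) { ... }) rather than reading a return value.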
So, I'm moving from grunt to gulp (or trying to anyway), and I'm having trouble getting gulp to do what I'm doing in grunt. Specifically the $templateCache stuff.
My angular app is broken up into several components/modules. Each module contains everything it needs to run (controllers, directives, partials, scss, etc.).
Using Grunt, I've been able to boil each module down into 5 files:
module.min.css // all module scss files compiled and concatenated
module.min.js // all module controllers, directives, services, etc. concatenated
module.tpls.min.js // all partials in $templateCache for this module
module.mocks.min.js // all unit test mock objects for this module
module.specs.min.js // all unit test specs for this module
This has worked really well for 2 years now and has been a cornerstone of my modular architecture. My only reasons to try out gulp were 1) curiosity, and 2) my grunt file is getting kind of hairy as we add in deployment and environment-specific stuff, and so far gulp has really slimmed that down.
For the most part, I've figured out how to do all my grunt tasks in gulp, but I'm having trouble figuring out how to generate a template cache file for each module. All the gulp-ng|angular-templates|templatecache plugins take all my partials and create one file. I'd like to take all my files under module/partials/*.html and create a single module.tpls.min.js; and do that for each module.
This was actually a problem with grunt too, but I figured it out with grunt.file.expand().forEach() like this:
grunt.registerTask('prepModules', '...', function(){
// loop through our modules directory and create subtasks
// for each module, modifying tasks that affect modules.
grunt.file.expand("src/js/modules/*").forEach(function (dir) {
// get the module name by looking at the directory we're in
var mName = dir.substr(dir.lastIndexOf('/') + 1);
// add ngtemplate subtasks for each module, turning
// all module partials into $templateCache objects
ngtemplates[mName] = {
module: mName,
src: dir + "/partials/**/*.html",
dest: 'dev/modules/' + mName + '/' + mName + '.tpls.min.js'
};
grunt.config.set('ngtemplates', ngtemplates);
});
});
My current gulp for this same task:
var compileTemplates = gulp.src('./src/js/modules/**/partials/*.html', {base:'.'})
.pipe(ngTemplates())
.pipe(gulp.dest('.'));
I've only really looked at the options, but none of them seemed to do what I wanted. They were all around changing the file name, or the final destination of the file, or a module name, or whatever else; nothing that said anything about doing it for only the directory it happens to be in.
I had thought about using gulp-rename because it worked well for me when doing the CSS compilation:
var compileScss = gulp.src('./src/js/modules/**/scss/*.scss', {base:'.'})
.pipe(sass({includePaths: ['./src/scss']}))
.pipe(rename(function(path){
path.dirname = path.dirname.replace(/scss/,'css');
}))
.pipe(gulp.dest('.'));
However, when I pipe rename() after doing ngTemplates() it only has the path of the final output file (one log entry). When you console.log() path after sass(), it has all the paths of all the files that it found (lots of log entries).
Any ideas? Thanks!
This SO post has the correct answer, but it wasn't coming up in my searches for this specific usage. I was going to vote to close my question, but since someone else might search using my own specific terms (since I did), it seems more appropriate to leave it alone, redirect to the original question, and show how I solved my own particular problem.
var fs = require('fs');
var path = require('path');
var gulp = require('gulp');
var ngTemplates = require('gulp-ng-templates');
var rename = require('gulp-rename');
// merge-stream combines the per-module streams so the task can return one stream
var merge = require('merge-stream');

var modulesDir = './src/js/modules/';
var getModules = function(dir){
return fs.readdirSync(dir)
.filter(function(file){
return fs.statSync(path.join(dir, file)).isDirectory();
});
};
gulp.task('default', function(){
    var modules = getModules(modulesDir);

    var moduleTasks = modules.map(function(folder){
        // get all partials for this module
        // parse into $templateCache file
        // rename to be /dev/modules/_____/______.tpls.min.js
        return gulp.src(modulesDir + folder + '/partials/*.html', {base:'.'})
            .pipe(ngTemplates({module:folder}))
            .pipe(rename(function(path){
                path.dirname = './dev/apps/' + folder + '/';
                path.basename = folder + '.tpls.min';
            }))
            .pipe(gulp.dest('.'));
    });

    // return the merged streams so gulp knows when the whole task is done
    return merge(moduleTasks);
});
It's essentially like the tasks per folder recipe but with a change to use gulp-ng-templates. I'll probably be using this same pattern for my SCSS and JS now that I'm more aware of it.
Seems like the gulp equivalent of grunt.file.expand().forEach().
Whenever I deal with scss/sass for gulp tasks, I will only put one scss file as the source parameter. That file is composed of a list of imports. This way you don't need to rely on gulp to concatenate the scss file contents for you.
//in gulpfile
gulp.src('./src/js/modules/**/scss/main.scss', {base:'.'})
//in main.scss
@import 'a', 'b', 'c';
a, b, and c would represent your other scss files.
I'm creating a commandline interface with node.js that runs an external script
> myapp build "path/to/script.js"
myapp is a node.js application that executes the script passed as a commandline argument.
To keep things brief, it basically does this:
var vm = require("vm");
var fs = require("fs");
var code = fs.readFileSync(scriptPath, { encoding: "utf8" }); // read "path/to/script.js"
var script = vm.createScript(code, scriptPath);
var context = vm.createContext({ require: require });
script.runInNewContext(context);
The contents of "path/to/script.js" look something like this:
var fs = require("fs"); // this works
var date = require("./date.js"); // supposed to load "path/to/date.js" but fails
My problem is that require() doesn't work properly in the external script. It works correctly for "native" modules like fs but fails on local files, probably because it doesn't know where to look up the modules.
I've considered simply using the following code:
require(scriptPath);
but then I can't inject my own script context.
I could write my own require function that emulates the built-in require but starts looking for modules relative to scriptPath, but that seems a bit tedious...
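For what it's worth, a minimal sketch of that idea (assuming relative ids should resolve against the external script's own directory; makeLocalRequire is just an illustrative name):
var vm = require("vm");
var fs = require("fs");
var path = require("path");

// build a require() that resolves "./..." ids relative to the external script
function makeLocalRequire(scriptPath) {
    var scriptDir = path.dirname(path.resolve(scriptPath));
    return function localRequire(id) {
        if (id.charAt(0) === ".") {
            // turn "./date.js" into an absolute path under the script's directory
            return require(path.resolve(scriptDir, id));
        }
        return require(id); // native modules and node_modules lookups unchanged
    };
}

var code = fs.readFileSync(scriptPath, { encoding: "utf8" });
var script = vm.createScript(code, scriptPath);
var context = vm.createContext({ require: makeLocalRequire(scriptPath) });
script.runInNewContext(context);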
I've used sandboxed-module to solve a similar problem.
var Sandbox = require('sandboxed-module');
Sandbox.require(scriptPath);
Sandbox.require loads code from disk and runs it in a new context. It provides a sandbox-specific require function that allows you to tweak how require works.
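A rough sketch of how that could look here (the requires/globals option names are from sandboxed-module's documented API as I recall it, so treat them as assumptions):
var SandboxedModule = require('sandboxed-module');

// run the external script with extra values injected into its context,
// and (optionally) with specific modules swapped out
var loadedScript = SandboxedModule.require(scriptPath, {
    globals: { myInjectedValue: 42 },   // exposed as globals inside the script
    requires: { fs: require('fs') }     // controls what require('fs') returns there
});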
My project has got many folders and I often load my own modules in node.js in the following way:
var request = require("request"),
config = require("../../../modules/config"),
urls = require("../../../modules/urls");
I sometimes move the folders around and the path changes, so I need to adjust the ../ part manually.
I don't want to move my modules into the node_modules folder, but I'd like to load the modules in the following way:
var request = require("request"),
config = require("config"),
urls = require("urls");
or
var request = require("request"),
config = require("modules/config"),
urls = require("modules/urls");
What are my options?
New Answer:
Relative paths seem to be the simplest choice after all; they allow you to run your script from any location.
var config = require("../../config");
Old answer:
I found out that, while not ideal, there's also the possibility of using process.cwd().
var request = require("request"),
config = require(process.cwd() + "/modules/config");
or, if the process.cwd() is set to a global variable in the main js file
global.cwd = process.cwd();
then
var request = require("request"),
config = require(global.cwd + "/modules/config"),
urls = require(global.cwd + "/modules/urls");
You can try one of the following, depending on some conditions.
If the scripts are written exclusively for your application, meaning they won't work with any other application, and they don't have any dependencies, place them under a modules directory and expose a variable such as global.basepath, using path.join to construct the file path before requiring it (a sketch of this follows the example below). You could also inject module variables after you require them in the main script of your app:
main.js
var config = require('./modules/config.js')(_, mongoose, app);
modules/config.js
module.exports = function(_, mongoose, app){
    return function(){
        // _, mongoose and app are available here via the closure
        this.loadConfigVariable..
    }
}
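The global.basepath idea mentioned above could look roughly like this (a sketch; the property name and paths are only examples):
// main.js (entry point) -- record the project root once
global.basepath = __dirname;

// anywhere else in the project, build absolute paths from it
var path = require('path');
var config = require(path.join(global.basepath, 'modules', 'config'));
var urls = require(path.join(global.basepath, 'modules', 'urls'));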
If the scripts are not one-off files, have separate tests, or require other vendor modules in order to be executed, then create individual node modules and publish them to a registry for convenience when installing the main application.
This is just what Express does: there is the main express application, and there are modules that Express uses, such as express-validation, which in turn have their own dependencies.
You could add a symlink in node_modules, pointing to wherever your modules are.
For example, add a symlink named "my_modules", referencing '../foo/bar/my_modules'. Then, your requires will look like
var config = require('my_modules/config');
var urls = require('my_modules/urls');
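For instance, a one-off setup script could create the link like this (a sketch; adjust the target to wherever your modules actually live, keeping in mind that a relative target is resolved from inside node_modules):
// creates node_modules/my_modules -> ../foo/bar/my_modules
var fs = require('fs');
fs.symlinkSync('../foo/bar/my_modules', 'node_modules/my_modules', 'dir');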
I am aware I can create a custom file inside the config directory and reference the variables from within that
module.exports.myconfig = {
foo: 'bar'
}
sails.config.myconfig.foo
But I need to write to these variables too and have them saved. In previous projects I have done this with JSON config files and used PHP to write to them.
Is there any way of doing this with Sails or should I just create some JSON files to pull and push my config vars?
There's no mechanism built in to Sails for persisting configuration variables. However, in the latest build of Sails there is a lower event you can listen for which indicates that Sails is exiting. You could catch this and persist your data then. For example, in your /config/bootstrap.js, something like:
var fs = require('fs');

module.exports = function(cb) {
    sails.on('lower', function persistConfig() {
        // write the current values back out so they are reloaded on the next lift
        fs.writeFileSync(sails.appPath + '/config/myConfig.js',
            'module.exports.myconfig = ' + JSON.stringify(sails.config.myconfig) + ';');
    });
    // ... other bootstrap stuff ...
    return cb();
};
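With that hook in place, updating a value at runtime is just an in-memory assignment; it only reaches disk when Sails lowers (a usage sketch, not part of the original answer):
// e.g. somewhere in a controller action
sails.config.myconfig.foo = 'new value';
// stays in memory until Sails lowers, at which point the bootstrap
// hook above persists it to config/myConfig.js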