I defined a MySQL connection with all the necessary parameters in app.js. How can I make it visible to the other scripts in routes/ by default, without requiring it or redefining the MySQL parameters, so I can just call client.query(...)?
A pattern I use is to set up my db object once in a module and export it (let's call it utils/mySQL.js):
//I haven't used real mysql in node so excuse the pseudo-syntax:
var db = require('mysql-driver-thingy');
db.connect('localhost', 'sqlport', options...);
db.otherSetupFunctions();
console.log("Finished db setup. You should only see this message once! Cool.");
module.exports = db;
And then I can require the db object everywhere I need it. Since require calls are cached, this doesn't actually call the setup methods multiple times.
In app.js:
var db = require('./utils/mySQL.js');
...
In models/user.js:
var db = require('../utils/mySQL.js');
...
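If you are using the mysql driver from npm, a more concrete (non-pseudo) sketch of utils/mySQL.js might look like this; the host, credentials, and database name below are placeholders:
// utils/mySQL.js — minimal sketch, assuming the `mysql` npm package
var mysql = require('mysql');

// A pool handles reconnects and concurrent queries for you.
var pool = mysql.createPool({
  host: 'localhost',
  user: 'dbuser',        // placeholder credentials
  password: 'secret',
  database: 'myapp'
});

console.log("Finished db setup. You should only see this message once!");

module.exports = pool;

// routes/index.js — any route can now just require the pool and query
var db = require('../utils/mySQL.js');
db.query('SELECT 1 + 1 AS two', function (err, rows) {
  if (err) throw err;
  console.log(rows[0].two); // 2
});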
A final option, which isn't recommended, is to pollute the global namespace. This seems to be the answer you're really after:
//set up your db
...
// and now make it available everywhere:
global.client = db.client
You can now magically use the client object in all your modules, without even requiring it.
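For instance, a route file could then contain something like this, with no require at all (again, not recommended; this assumes the client exposes query as in your question):
// routes/index.js — relies entirely on the global set up in app.js
client.query('SELECT 1', function (err, rows) {
  // `client` only exists here because app.js assigned global.client
});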
There are many reasons globals are bad, though:
If your code and other code define globals, they could conflict and overwrite each other.
It's hard to find where you defined the db/client object, etc.
You can inject the MySQL connection into other scripts like this:
app.js
var mysqlConnection = new Connection(params);
require('./controller/main.js')(mysqlConnection);
main.js
module.exports = function(mysqlConnection) {
// You can access your mysql connection here
};
UPDATE:
You can inject several variables the same way. You can also still export methods from the module if you need to:
app.js
var mysqlConnection = new Connection(params);
var news = require('./model/news.js')(app, mysqlConnection);
news.list(function(err, news) {
// Do something
});
news.js
module.exports = function(app, mysqlConnection) {
    var methods = {};
    // mysql connection and app available from here

    methods.list = function(cb) {
        mysqlConnection.list(function(err, data) {
            cb(err, data);
        });
    };

    return methods;
};
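Note that mysqlConnection.list above is just a placeholder for your own data-access code; with the mysql driver, methods.list would more likely run a query, for example (table name assumed):
methods.list = function(cb) {
    mysqlConnection.query('SELECT * FROM news', function(err, rows) {
        cb(err, rows);
    });
};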
Related
I'm fairly new to Node.js and writing my first application. I'm pretty used to PHP.
In order to keep code organized and clean, I always write functions in separate files and include them as required in PHP.
However, in Node.js I've had to require them like I would require a module.
For example:
functions.js
module.exports = {
    check_db : function(key){
    },
    check_cache : function(key){
        memcached.get(key, function(err, data){
            console.log(data);
        });
    },
};
I included that in the main app like so:
// Establish connection with cache and database
const mysql = require('mysql2');
const Memcached = require('memcached');
const memcached = new Memcached('localhost:11211');
const bb = require('bot-brother');
//Load the database cache functions
const dbc = require("./functions");
dbc.check_cache(123);
Now I can access the functions from dbc in the main app file, but I cannot use modules that have been required in the main app from the functions file.
I get an error that memcached is not defined.
How can I go about solving this?
The simple solution: you can require("memcached") in the functions.js file and create the client there. But I wouldn't go with this solution, because if you need memcache somewhere else you would end up opening many connections to the memcache server.
Another, and IMO cleaner, solution is to inject the memcache dependency into your services (or functions, as you call them). This practice is called dependency injection, if you want to read up on it and its benefits.
Here is how it would work:
you still create the memcached connection in the main file;
instead of exporting a plain object in your functions.js, you export a function that takes an argument (here, memcached);
in your main file, you require that function and call it to get the service you want.
Here is what the code would look like:
main.js
//Load the database cache functions
const dbcFactory = require("./functions");
const dbc = dbcFactory(memcached)
functions.js
module.exports = function (memcached) {
    return {
        check_db : function(key){},
        check_cache : function(key){
            memcached.get(key, function(err, data){
                console.log(data);
            });
        }
    };
};
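One nice side effect of injecting the dependency this way: in a test you could pass a fake memcached object instead of a real connection, for example:
// hypothetical test snippet — fakeMemcached stands in for the real client
var fakeMemcached = {
    get: function (key, cb) { cb(null, 'cached value for ' + key); }
};
var dbc = require('./functions')(fakeMemcached);
dbc.check_cache(123); // logs "cached value for 123" without touching a real server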
I have been creating software in NodeJS for years, but I have rarely ever looked into the module side of it, since I've never needed to do that stuff myself.
I have created a test.js file, which is my module. All it contains is this (I've tried this.host, self.host, var host ... nothing is working):
var tests = function () {
this.host = "http://127.0.0.1/";
};
exports = tests;
In my server.js I try to use it, but I can never get host to actually output:
var tests = require('./test.js');
console.log(tests);
console.log(tests.host);
I always get this output, saying that the tests variable has no properties ... which I set in the module:
sudo node server.js
{}
undefined
The host variable, as you defined it inside the tests function, is only set on objects created from that function; it is not accessible as a property of the function itself.
This means that in order to access it, you should create a new instance of tests using the new operator:
var Tests = require('./tests');
var instance = new Tests();
// Now you can access `instance.host`
Also, as David said, use module.exports to export your function.
Don't do exports = tests. Either do exports.tests = tests or module.exports = tests. exports starts out as a reference to module.exports, so reassigning exports only rebinds that local variable and leaves module.exports (still an empty object) untouched, which is exactly the {} you saw.
Basically, you first have to decide what you want your module to export: an object whose properties can be accessed directly, a constructor function that creates an object (with properties) when called with new, or even just a regular function that you call to get a return value. You have mixed and matched pieces of the first two schemes, which is why it does not work. I will show you both:
Here's the scheme where your module exports a constructor function from which you can create an object (when you new it):
// test.js module
var tests = function () {
this.host = "http://127.0.0.1/";
};
module.exports = tests;
// main module server.js
var Tests = require('./test.js');
var t = new Tests();
console.log(t.host);
And, here's the scheme where you just directly export properties:
// test.js module
module.exports = {
host: "http://127.0.0.1/"
};
// main module server.js
var tests = require('./test.js');
console.log(tests);
console.log(tests.host);
Keep in mind that whatever you assign to module.exports is what require() will return after it loads your module. So, in your first case, you're assigning a function that is intended to be a constructor function so you have to use it as a constructor function in order for it to work properly.
In my second example, I assign an object to module.exports so you can then treat it just like an object after loading the module with require(). That means you can then just directly access its properties as you would for an object.
console.log(tests()) will also work if you add a return statement inside the function.
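For completeness, that third scheme (a plain function you call to get a value back) might look like this:
// test.js module
var tests = function () {
    return { host: "http://127.0.0.1/" };
};
module.exports = tests;

// main module server.js
var tests = require('./test.js');
console.log(tests());      // { host: 'http://127.0.0.1/' }
console.log(tests().host); // http://127.0.0.1/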
I have multiple routes that need to access a database. For development I use a local database, and in production I use a hosted database.
The only problem is that every time I go to push a release I have to go through each route, manually changing the database link.
e.g.
var mongodb = require('mongojs').connect('urlhere', ['Collection']);
It would be nice if I could declare a variable in app.js like
app.set('mongoDBAddress', 'urlhere');
then in each file do something like
var mongodb = require('mongojs').connect(app.get('mongoDBAddress'), ['Collection']);
Does anybody know if this is achievable? I've been messing around with it for about an hour, googling and trying different things, but with no luck. Thanks.
From the docs:
In browsers, the top-level scope is the global scope. That means that
in browsers if you're in the global scope var something will define a
global variable. In Node this is different. The top-level scope is not
the global scope; var something inside a Node module will be local to
that module.
You have to think a bit differently. Instead of creating a global object, create your modules so they take an app instance, for example:
// add.js
module.exports = function(app) {    // requires an `app`
  return function add(x, y) {       // the actual function to export
    app.log(x + y)                  // use dependency
  }
}
// index.js
var app = {log: console.log.bind(console)}
var add = require('./add.js')(app) // pass `app` as a dependency
add(1, 2)
//^ will log `3` to the console
This is the convention in Express and other libraries: app lives in your main file (i.e. index.js), and the modules you require take an app parameter.
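Applied to your case, a sketch might look like this (assuming Express and the mongojs API from your question; routes/items.js is a hypothetical file name):
// app.js
var express = require('express');
var app = express();
app.set('mongoDBAddress', 'urlhere'); // placeholder URL, as in the question

require('./routes/items.js')(app);    // inject `app` into the route module

// routes/items.js
module.exports = function(app) {
  var mongodb = require('mongojs').connect(app.get('mongoDBAddress'), ['Collection']);

  app.get('/items', function(req, res) {
    mongodb.Collection.find({}, function(err, docs) {
      res.send(docs);
    });
  });
};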
You can also add a global variable to GLOBAL (see this question), although this is probably considered bad practice.
We have two ways in Node.js to share variables between modules:
global
module.exports
But your problem seems to be different: as I understand it, you want to connect your application to different databases without changing code. What you need is to use command-line params (see process.argv in the Node.js docs).
server.js
var connectTo = {
    dev : "url1",
    production : "url2"
}
var mongodb = require('mongojs').connect(connectTo[process.argv[2]], ['Collection']);
Run your server.js as
node server.js dev
// for connecting to the development database
or
node server.js production
// for connecting to the production database
To share the connection across different modules:
//Method 1
global.mongodb = require('mongojs').connect(connectTo[process.argv[2]], ['Collection']);
//Method 2
exports.mongodb = require('mongojs').connect(connectTo[process.argv[2]], ['Collection']);
exports.getMongoDBAddress = function() {
    return connectTo[process.argv[2]];
}
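With method 2, another module could then consume the exported connection like this (paths assumed):
// someOtherModule.js — uses the connection exported from server.js
var mongodb = require('./server.js').mongodb; // adjust the path as needed

mongodb.Collection.find({}, function(err, docs) {
    // use docs here
});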
I have just started experimenting with building a website using node.js, and I am encountering an issue when organizing the models of my project.
All the real world examples I have found on the Internet are using Mongoose. This library allows you to define your models in a static way. So you can write this:
// models/foo.js
module.exports = require('mongoose').model('Foo', ...);
// app.js
mongoose.connect(...);
// some_controller_1.js
var Foo = require('./models/foo');
Foo.find(...);
// some_controller_2.js
var Foo = require('./models/foo');
Foo.find(...);
But since I don't want to use MongoDB, I need another ORM. And none of the other ORMs I have found allow this: you first need to create an instance, and only then can you register your models. They also don't seem to give access to the list of registered models.
So I tried doing this:
// models/user.js
var registrations = [];

module.exports = function(sequelize) {
    var result = null;
    registrations.forEach(function(elem) {
        if (elem.db == sequelize)
            result = elem.value;
    });
    if (result) return result;

    // data definition
    var user = sequelize.define("User", ...);

    registrations.push({ db: sequelize, value: user });
    return user;
};
Which I can use like this:
// some_controller_1.js
var Foo = require('./models/foo')(app.get('database'));
Foo.find(...); // using Foo
But this small header and footer that I have to write in every single model file is a bit annoying and directly violates the "don't repeat yourself" principle. Also, while not a huge issue, this is kind of a memory leak, since the "sequelize" object will never be freed.
Is there a better way to do this that I didn't think of?
You can find an article about handling models with sequelize here: http://sequelizejs.com/articles/express#the-application
You basically just create a models/index.js as described here: http://sequelizejs.com/articles/express#block-3-line-0
Afterwards you just put your model definitions within files in the models folder as pointed out here: http://sequelizejs.com/articles/express#block-4-line-0
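For reference, a minimal sketch of that models/index.js pattern (assuming an older Sequelize version where sequelize.import is available; the connection parameters are placeholders):
// models/index.js
var fs = require('fs');
var path = require('path');
var Sequelize = require('sequelize');

var sequelize = new Sequelize('database', 'username', 'password');
var db = {};

// Load every model definition in this folder and register it exactly once.
fs.readdirSync(__dirname)
    .filter(function(file) {
        return (file.indexOf('.') !== 0) && (file !== 'index.js');
    })
    .forEach(function(file) {
        var model = sequelize.import(path.join(__dirname, file));
        db[model.name] = model;
    });

db.sequelize = sequelize;
db.Sequelize = Sequelize;

module.exports = db;
Controllers then just do var db = require('./models'); and use db.User and friends, without repeating the registration boilerplate in every model file.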
I'm new to node.js and in most code I've seen I don't see IoC/DI style constructor injection of dependencies.
Instead, the Node.js require() function is typically used to create local vars allowing access to external modules' exports.
But when writing unit tests (that isolate a single layer/function), how do I mock the modules accessed via vars created with require?
/helpers/dataHelper.js
var dataModel = require('../models/dataModel.js');
var getFormattedDataForRegion = function(region, callback) {
    var data = {};
    // validate region
    // query dataModel
    // async.map format data items
    // callback(data);
};

// export the helper so tests (and other modules) can require it
module.exports = { getFormattedDataForRegion: getFormattedDataForRegion };
/tests/dataHelperTests.js
describe('dataHelper', function(){
    it('getFormattedDataForRegion returns expected response', function(done){
        var expectedData = {};
        // populate expectedData
        // **** need some way to mock dataModel *****
        dataHelper.getFormattedDataForRegion("west", function(data){
            expect(data).to.eql(expectedData);
            done();
        });
    });
});
This is done with proxyquire.
I personally don't like the technique, but it's the best way that I've found to respect the "node way" and still be able to test easily. You'd do:
var proxyquire = require('proxyquire'),
    dataModelMock = require('./mocks/dataModel');

var dataHelper = proxyquire('path/to/helpers/dataHelper.js', { '../models/dataModel.js': dataModelMock });
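A minimal sketch of what the substituted mock might look like (the findByRegion method is hypothetical; shape it after whatever dataModel actually exposes):
// tests/mocks/dataModel.js — stub standing in for the real model
module.exports = {
    findByRegion: function(region, callback) {
        callback(null, []); // canned data instead of a real query
    }
};
Because proxyquire returns the freshly loaded module with the substitutions applied, the dataHelper used in the test above talks to this stub instead of the real dataModel.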