Use "require" in sails js app - javascript

When disabling globals, the docs suggest using the following alternatives:
_ = require('lodash')
myService = sails.services.myservice
myModel = sails.models.mymodel
sails = req._sails
Would there be any issue requiring "sails", "services", and "models" much like any other module?
Having tried it, it does appear to work, but I feel I might be missing something.

Using require for services is always valid; the globalizing is merely for convenience.
On the other hand, doing require('api/models/User.js') will almost certainly not give you what you want, since those files are used by Sails to build model classes. So the only way to reliably use models in Sails without globals turned on is via sails.models.
Finally, while require('sails') will usually give you a reference to the running Sails app, it's not recommended that you use it that way. If you were running multiple Sails apps in the same process (which you might do in automated tests) then it would not reliably return the correct app. You're much better off using req._sails in controllers, and this.sails in models and services.
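For example, a controller action with globals disabled might look like this (a minimal sketch; the User model and EmailService are hypothetical names):

// api/controllers/UserController.js
var _ = require('lodash');

module.exports = {
  find: function (req, res) {
    // Get the app instance from the request instead of require('sails'),
    // so this keeps working even if several Sails apps share one process.
    var sails = req._sails;

    // With globals off, models and services live on the app instance;
    // note that the keys are lower-cased identities.
    var User = sails.models.user;
    var EmailService = sails.services.emailservice; // shown only to illustrate the service lookup

    User.find().exec(function (err, users) {
      if (err) { return res.serverError(err); }
      return res.json(_.sortBy(users, 'name'));
    });
  }
};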

Related

MongoDB map-reduce (via nodejs): How to include complex modules (with dependencies) in scopeObj?

I'm working on a complicated map-reduce process for a mongodb database. I've split some of the more complex code off into modules, which I then make available to my map/reduce/finalize functions by including it in my scopeObj like so:
const scopeObj = {
  userCalculations: require('../lib/userCalculations')
};

function myMapFn() {
  let userScore = userCalculations.overallScoreForUser(this);
  emit({
    'Key': this.userGroup
  }, {
    'UserCount': 1,
    'Score': userScore
  });
}

function myReduceFn(key, objArr) { /*...*/ }

db.collection('userdocs').mapReduce(
  myMapFn,
  myReduceFn,
  {
    scope: scopeObj,
    query: {},
    out: {
      merge: 'userstats'
    }
  },
  function (err, stats) {
    return cb(err, stats);
  }
);
...This all works fine. I had until recently thought it wasn't possible to include module code into a map-reduce scopeObj, but it turns out that was just because the modules I was trying to include all had dependencies on other modules. Completely standalone modules appear to work just fine.
Which brings me (finally) to my question. How can I -- or, for that matter, should I -- incorporate more complex modules, including things I've pulled from npm, into my map-reduce code? One thought I had was using Browserify or something similar to pull all my dependencies into a single file, then include it somehow... but I'm not sure what the right way to do that would be. And I'm also not sure of the extent to which I'm risking severely bloating my map-reduce code, which (for obvious reasons) has got to be efficient.
Does anyone have experience doing something like this? How did it work out, if at all? Am I going down a bad path here?
UPDATE: A clarification of what the issue is I'm trying to overcome:
In the above code, require('../lib/userCalculations') is executed by Node -- it reads in the file ../lib/userCalculations.js and assigns the contents of that file's module.exports object to scopeObj.userCalculations. But let's say there's a call to require(...) somewhere within userCalculations.js. That call isn't executed until the surrounding function runs. So when I call userCalculations.overallScoreForUser() within the map function, MongoDB attempts to execute the require function -- and require isn't defined in MongoDB's JavaScript environment.
Browserify, for example, deals with this by compiling all the code from all the required modules into a single JavaScript file with no require calls, so it can run in the browser. But that doesn't quite work here, because I need the resulting code to itself be a module that I can use the way I use userCalculations in the code sample. Maybe there's a way to run Browserify that I'm not aware of? Or some other tool that just "flattens" a whole hierarchy of modules into a single module?
Hopefully that clarifies a bit.
As a generic response, the answer to your question -- How can I -- or, for that matter, should I -- incorporate more complex modules, including things I've pulled from npm, into my map-reduce code? -- is no, you cannot safely include complex modules in Node code you plan to send to MongoDB for mapReduce jobs.
You mentioned the problem yourself: nested require statements. require itself is synchronous, but when it sits inside a nested function, that call doesn't execute until the function is invoked -- and at that point the MongoDB VM, which has no require, will throw.
Consider the following example of three files: data.json, dep.js and main.js.
// data.json - just something we require "lazily"
true

// dep.js -- equivalent of your userCalculations
module.exports = {
  isValueTrue() {
    // The problem: nested require
    return require('./data.json');
  }
};

// main.js - from here you send your mapReduce to MongoDB.

// require dependency instantly
const calc = require('./dep.js');
// require is synchronous; the effect is the same if you do:
// const calc = (function () { return require('./dep.js'); })();
console.log('Calc is loaded.');
// Let's mess with unwary devs
require('fs').writeFileSync('./data.json', 'false');
// Is calc.isValueTrue() true or false here?
// It's false: the nested require only runs now, after the file has been
// overwritten. In MongoDB's VM the same call would simply throw, because
// require does not exist there.
console.log(calc.isValueTrue());
As a general solution, this is not feasible. While the vast majority of modules will likely not have nested require statements, HTTP calls, internal service calls, global variables and the like, there are those that do. You cannot guarantee that this would work.
Now, as for your local implementation: if, say, you require exact, specific versions of npm modules that you have tested well with this technique and know will work, or that you published yourself, it is somewhat feasible.
However, even in that case, if this is a team effort, there's bound to be a developer down the line who won't know where or how your dependency is used, who will use globals (not on purpose, but by omission, e.g. they get this wrong), or who simply won't know the implications of what they're doing. A strong integration test suite could guard against this, but the point is, it's unpredictable. Personally I think that when you can choose between unpredictable and predictable, you should almost always choose predictable.
Now, if a certain library has the explicitly stated purpose of being used in MongoDB mapReduce, this could work. You would still have to guard well against omissions and problems, and have strong testing infrastructure, but I would make certain the purpose is explicit before feeling safe enough to do this. And of course, if what you're doing is so complex that it needs several npm packages, maybe you can put those functions directly on the MongoDB server, or do your map-reducing in something better suited to the purpose.
To conclude: for a purposefully built library with an explicit mission statement that it is to be used with Node and MongoDB mapReduce, I would make sure my tests cover all mission-critical and important functionality, and then import such an npm package. Otherwise I would neither use nor recommend this approach.

Dynamic loading of modules and components at runtime in Angular 4

I've been looking to develop a method for loading modules and/or components into an AOT-compiled Angular 4 application and been stymied by a variety of solutions that never quite seem to get me where I want to be.
My requirements are as such:
My main application is AOT compiled, and has no knowledge of what it is loading until runtime, so I cannot specifically identify my dynamic module as an entry component at compile time (which is explicitly necessary for the 'dynamic' component loading example presented on Angular.io)
I'd ideally love to be able to pull the code from a back end database via a GET request, but I can survive it simply living in a folder alongside the compiled site.
I'm using Webpack to compile my main application, breaking it into chunks - and so a lot of the SystemJS based solutions seem like dead ends - based on my current research, I could be wrong about this.
I don't need to know or have access to any components of my main application directly - in essence, I'd be loading one angular app into another, with the dynamically loaded module only perhaps having a few tightly controlled explicit interface points with the parent application.
I've explored using tools like SystemJsNgModuleLoader - which seems to require that I have the Angular compiler present, which I'm happy to do if AOT somehow allowed me to include it even if I'm not using it elsewhere. I've also looked into directly compiling my dynamic module using ngc and loading the resulting ngfactory and compiled component/module, but I'm not clear if this is at all possible or, if so, what tools Angular makes available to do it. I have also seen references to ANALYZE_FOR_ENTRY_COMPONENTS - but can't clearly dig up what its limitations are, and first analysis indicates it's not quite what I'm looking for either.
I had assumed I might be able to define a common interface and then simply make a GET request to bring my dynamic component into my application - but Angular seems painfully allergic to anything I try to do short of stepping outside of it altogether and trying to attach non-Angular code to the DOM directly.
Is what I'm trying to do even possible? Does Angular 2+ simply despise this kind of on-the-fly modification of its internal application architecture?
I think I found an article that describes exactly what you are trying to do. In short you need to take over the bootstrap lifecycle.
The magic is in this snippet here.
import {AComponentNgFactory, BComponentNgFactory} from './components.ngfactory.ts';

@NgModule({
  imports: [BrowserModule],
  declarations: [AComponent, BComponent]
})
export class AppModule {
  ngDoBootstrap(app) {
    fetch('url/to/fetch/component/name')
      .then((name) => { this.bootstrapRootComponent(app, name); });
  }
  bootstrapRootComponent(app, name) {
    const options = {
      'a-comp': AComponentNgFactory,
      'b-comp': BComponentNgFactory
    };
    // Bootstrap whichever root component the server named.
    app.bootstrap(options[name]);
  }
}
https://blog.angularindepth.com/how-to-manually-bootstrap-an-angular-application-9a36ccf86429

dynamically load modules in Meteor node.js

I'm trying to load multiple modules on the fly via chokidar (watchdog) using Meteor 1.6 beta; however, after doing extensive research on the matter I just can't seem to get it to work.
From what I gather, require by design will not accept anything other than static strings, i.e.
require("test/string/here")
Since if I try:
var path = "test/string/here"
require(path)
I just get Error: Cannot find module, even though the strings are identical.
Now the thing is, I'm uncertain how to go about this. Am I really forced to either use import or static strings when using Meteor, or is there some workaround for this?
watchdog(cmddir, (dir) => {
  match = "." + regex_cmd.exec(dir);
  match = dir;
  loader.emit("loadcommand", match);
});

loader.on('loadcommand', (file) => {
  require(file);
});
There is something intrinsically weird in what you describe.
chokidar is used to watch actual files and folders.
But Meteor compiles and bundles your code, resulting in an app folder after build that is totally different from your project structure.
Although Meteor now supports dynamic imports, the mechanism is internal to Meteor and does not rely on your actual project files, but on the ones Meteor builds.
If you want to dynamically require files as in Node, including with dynamically generated module paths, you should avoid import and require statements, which are automatically replaced by Meteor's built-in import mechanism. Instead you would have to write your own loading function, taking care of the fact that your app's built folder is different from your project folder.
That may work, for example, if your server is watching files and/or folders in a static location, different from where your app is running.
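A minimal sketch of such a hand-rolled loader, assuming the watched directory is an absolute path outside the build output (loadCommandFile is a made-up name):

const fs = require('fs');

function loadCommandFile(absolutePath) {
  // Read and evaluate the file by hand, bypassing Meteor's module system.
  const source = fs.readFileSync(absolutePath, 'utf8');
  const fakeModule = { exports: {} };
  new Function('module', 'exports', source)(fakeModule, fakeModule.exports);
  return fakeModule.exports;
}

// e.g. wired into the watcher from the question:
// loader.on('loadcommand', (file) => {
//   const command = loadCommandFile(file);
// });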
In the end, I feel this is a sort of XY problem: you have not described your actual objective in the first place, and the above is an attempted workaround that does not seem to fit how Meteor works, hence it may not be the most appropriate solution for your implicit objective.
@Sashko does a great job of explaining Meteor's dynamic imports here. There are also docs.
A dynamic import is a function that returns a promise instead of just importing statically at build time. Example:
import('./component').then((MyComponent) => {
  render(MyComponent);
});
The promise runs once the module has been loaded. If you try to load the module repeatedly then it only gets loaded once and is immediately available on subsequent requests.
As far as I can tell, you can also use a variable for the string you import.
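For instance, something like this may work (an untested sketch; the paths are placeholders, and each candidate module may still need to appear in a literal import() somewhere so Meteor's bundler knows to include it):

const path = isFancy ? './fancyWidget' : './plainWidget';
import(path).then((mod) => {
  render(mod.default);
});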

Should I really repeat all these requirements in every route module file?

I'm building a larger web app, and have now begun to realise the benefit of modularising my routes into their own files. But when doing this I notice I have to repeat a lot of requirements... Before I started to move the routes out into their own files, I had around 20 required modules in my main app, handling everything from the DB to emails...
Many of these modules are used in most of the routes... which means I have to repeat maybe 15-20 requirements in each route module file.
Question: this seems like a lot of repeated code, but maybe this is the right way to do it?
At least official npm modules seem to work this way.
You may write a module (let's say, common.js) which requires all of your common dependencies and returns them as a single object:
module.exports = {
  http: require('http'),
  request: require('request'),
  async: require('async'),
  someModule: require('./someModule')
};
Then all you have to do is require a single common module:
var common = require('./common');
common.request.get(...);
common.async.parallel(...);
The only inconvenience is that you now have to write common. when you want to access these modules.
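If the common. prefix bothers you, you can also destructure just the pieces a given route file needs (a minor variation, not part of the original suggestion):

const { request, async } = require('./common');

request.get(...);
async.parallel(...);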
You could also use global variables, but globals are considered bad practice and it's strongly recommended that you avoid them:
Why are global variables considered bad practice? (node.js)
Why are globals bad?

How to manage multiple JS files server-side with Node.js

I'm working on a project with Node.js, and the server-side code has become large enough that I would like to split it into multiple files. It appears this has been done client-side for ages: development is done by inserting a script tag for each file, and only for distribution is something like "make" used to put everything together. I realize there's no point in concatenating all the server-side code, so I'm not asking how to do that. The closest thing I can find is require(); however, it doesn't behave quite like script does in the browser, in that require'd files do not share a common namespace.
Looking at some older Node.js projects, like Shooter, it appears this was once not the case -- that, or I'm missing something really simple in my code. My require'd files cannot access the calling namespace at compile time or run time. Is there any simple way around this, or are we forced to make all our require'd JS files completely autonomous from the calling scope?
You do not want a common namespace because globals are evil. In node we define modules
// someThings.js
(function () {
  var someThings = ...;
  ...
  module.exports.getSomeThings = function () {
    return someThings();
  };
}());

// main.js
var things = require("someThings");
...
doSomething(things.getSomeThings());
You define a module and then expose a public API for your module by writing to exports.
The best way to handle this is dependency injection. Your module exposes an init function and you pass an object hash of dependencies into your module.
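A minimal sketch of that dependency-injection pattern (the init name and the db/mailer dependencies are made-up placeholders):

// routes/users.js
module.exports.init = function (deps) {
  var db = deps.db;
  var mailer = deps.mailer;

  // Return the route handlers, with dependencies bound in.
  return {
    create: function (req, res) {
      db.insert('users', req.body, function (err, user) {
        if (err) { return res.end('error'); }
        mailer.sendWelcome(user);
        res.end('ok');
      });
    }
  };
};

// main.js
var users = require('./routes/users').init({ db: db, mailer: mailer });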
If you really insist on accessing a shared scope, you can do so through global. Every file can read from and write to the global object. But again: you do not want to use globals.
Re @Raynos' answer: if the module file is next to the file that includes it, it should be
var things = require("./someThings");
If the module is published on, and installed through, npm, or explicitly put into the ./node_modules/ folder, then the
var things = require("someThings");
is correct.
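To illustrate the difference with a toy layout (names made up):

project/
  main.js             // var things = require("./someThings");
  someThings.js
  node_modules/
    someThings/       // var things = require("someThings");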
