How to set up Karma + Jasmine to use ES6 modules - javascript

I'm a bit stuck on this. I have a complex stack composed of Middleman, Karma, Jasmine, and BabelJS to build a static website.
Considering this is an experiment, I wanted to use ES6 with modules. Everything is fine on the Middleman side; however, I'm having a hard time setting up Karma + Jasmine for testing.
The main problem lies with Babel: if you set it to use modules: "ignore", you have to manually use System.import for all your modules in your specs, which is something I don't want. I would like to use the ES6 import syntax, but if I set modules: "system", Babel wraps all my tests in System.register, producing something like the following:
System.register(["mymodule"], function (_export) {
  "use strict";

  var Mymodule;

  return {
    setters: [function (_mymodule) {
      Mymodule = _mymodule["default"];
    }],
    execute: function () {
      console.log("I'm here!!!");
      console.log(Mymodule);

      describe("Mymodule", function () {
        it("has version", function () {
          expect(Mymodule.VERSION).toEqual("1.0.0");
        });
      });
    }
  };
});
So the tests are not automatically executed. I then created the following (CoffeeScript) script to work around it, which is included after all the specs:
basePath = "/base/spec/"
modules = []

for own fileName, fileHash of window.__karma__.files
  if fileName.indexOf(basePath) is 0
    isRunner = fileName.indexOf("spec_runner") >= 0
    isRunner ||= fileName.indexOf("spec_helper") >= 0
    unless isRunner
      moduleName = fileName.replace(basePath, "")
      moduleName = moduleName.replace(".js", "")
      modules.push(path: fileName, name: moduleName)

mappedModules = {}
baseSpecsPath = "http://localhost:9876"

for module in modules
  mappedModules[module.name] = baseSpecsPath + module.path

System.config
  baseURL: "http://localhost:4567/javascripts/"
  map: mappedModules

for module in modules
  System.import(module.name)
This code is simple: it prepares the map configuration for SystemJS so that I can correctly load modules from my app (located at http://localhost:4567) and the specs wrapped in System.register (served by Karma at http://localhost:9876).
However, my tests are not run, and no error is reported. Even stranger, I correctly get the "I'm here!!!" message, and Mymodule is logged to the console correctly. I even tried logging the value of describe, and it is correctly a Suite object. So why on earth are my tests not run? (The it block is never executed.)
What solutions do I have? I'm OK with changing the setup a bit to get it working, but I want to keep the following: Middleman, ES6 modules, no dynamic module loading (in the end, all my modules are exposed in a single file or required with a bunch of <script> tags), and Jasmine.

I finally solved the issue. I include this file last:
basePath = "/base/spec/"
modules = []

for own fileName, fileHash of window.__karma__.files
  if fileName.indexOf(basePath) is 0
    isRunner = fileName.indexOf("spec_runner") >= 0
    isRunner ||= fileName.indexOf("spec_helper") >= 0
    unless isRunner
      moduleName = fileName.replace(basePath, "")
      moduleName = moduleName.replace(".js", "")
      modules.push(path: fileName, name: moduleName)

mappedModules = {}
baseSpecsPath = "http://localhost:9876"
specModules = []

for module in modules
  mappedModules[module.name] = baseSpecsPath + module.path
  specModules.push(module.name)

System.config
  baseURL: "http://localhost:4567/javascripts/"
  map: mappedModules

moduleImports = specModules.map (moduleName) ->
  System.import(moduleName)

Promise.all(moduleImports).then ->
  window.__karma__.start = window.__lastKarmaStart__
  window.__lastKarmaStart__ = null
  delete window.__lastKarmaStart__
  window.__karma__.start()
It does the following things:
Temporarily disables Karma's start function by replacing it with an empty one (the original is stashed in window.__lastKarmaStart__ by a helper loaded before the specs; see the sketch below)
Fetches each test wrapped in System.register and runs System.import for all of them (waiting on them with Promise.all)
After all imports are complete, it attaches __karma__.start back and executes it, running Jasmine
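That stashing helper isn't shown above. A minimal sketch of what it could look like (the __lastKarmaStart__ name matches the restore code above; the rest is an assumption) is a small file loaded before any spec:

// spec_helper stub (hypothetical): runs before the spec files are loaded.
// Stash Karma's real start function and replace it with a no-op, so that
// Karma does not kick off Jasmine before the System.import calls finish.
window.__lastKarmaStart__ = window.__karma__.start;
window.__karma__.start = function () {};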

Related

module.exports not available on dependency (import)

I recently upgraded a project to Gatsby 3, which depends on Webpack 5. In one of the .tsx files, the library countdown is imported. The import returns an empty object, {}, every time.
Looking at the code of the "countdown" library, I see they export the module like this:
/*global window */
var module;
var countdown = (
  function(module) {
    'use strict';
    ...
    if (module && module.exports) {
      module.exports = countdown;
    } else if (typeof window.define === 'function' && typeof window.define.amd !== 'undefined') {
      window.define('countdown', [], function() {
        return countdown;
      });
    }
    return countdown;
  })(module);
Using console.log inside the library, I see that module.exports is actually undefined; it seems that var module; is shadowing whatever value Node makes available when the import is executed. To test the hypothesis, I removed the var module; declaration (and the argument passed to the function), and everything worked. Of course, that's not the answer to my problem, since this is a dependency library and I have no business touching its code.
I can't figure out what the upgrade to Webpack 5 could have broken to make module.exports unavailable to countdown.js, even though they declare the variable.
I looked at the Webpack updates and tried to tell it to use commonjs or imports-loader on countdown.js and pass it "module" (the different test cases have been commented out):
module: {
  rules: [
    {
      test: /\countdown.js?$/,
      use: {
        loader: 'babel-loader',
      },
      // loader: "imports-loader",
      // options: {
      //   syntax: "default",
      //   type: "commonjs"
      // },
      // use: [
      //   {
      //     loader: "imports-loader",
      //     options: {
      //       thisArg: "module"
      //     },
      //   },
      // ],
    },
  ],
}
None of that works. I can't quite figure out what changed in Webpack 5 to break countdown's way of exporting the library.
Any ideas?
To make it clear: the countdown library is not mine, and with the previous Webpack version it worked great when imported and used.
It turns out the package has been updated, but the version update hasn't been propagated. There's a request soliciting that.
In the meantime, a colleague had a solution; I thought I'd share it here in case anyone can use it until the version is updated. In summary, it modifies the library's script after installation, removing the declared global variable that shadows the one Node is supposed to provide.
In a file called yarn-postinstall.js (or whatever you please):
fixCountdownLib();

function fixCountdownLib() {
  const fs = require("fs");
  const countdownLibPath = __dirname + "/node_modules/countdown/countdown.js";
  const countdownLibCode = fs.readFileSync(countdownLibPath).toString();
  const fixedCountdownLibCode = countdownLibCode.replace("var module;", "");
  fs.writeFileSync(countdownLibPath, fixedCountdownLibCode);
}
Then update package.json scripts to point to that file.
"scripts": {
"postinstall": "node yarn-postinstall.js"
},
The module variable isn't an object and doesn't have an exports property.
Changing var module; to var module = {exports: {}} should work. But I'm not sure you should be checking for module.exports, since it's supposed to be undefined anyway, I guess?
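To illustrate that suggestion, here is roughly what the library's prologue would look like with the change (a sketch only, not the library's actual code; in practice the edit would have to be applied by something like the postinstall script above, and whether the bundler then picks up the export is exactly the open question in this thread):

// Sketch of the suggested change: give the IIFE a real object to write to
var module = { exports: {} };
var countdown = (function (module) {
  'use strict';
  var countdown = function () { /* ... */ };
  if (module && module.exports) {
    module.exports = countdown; // this branch is now taken
  }
  return countdown;
})(module);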

How to tell which files are being transpiled by Babel 6?

I have a project that is using babel-register to dynamically transpile ES6 source to ES5 when requiring a module in a Node 6.6 project. I've read that babel-register hooks into Node's require function in order to transpile a file when you try to load it, but I'm not always clear on which files will be affected by that change.
This question comes up for me a lot when I'm writing tests: is only my production code getting transpiled, or does the test code get transpiled too? This brings me to the more general question, which is the topic of this post:
How can I tell when Babel is actually running, and which files are being transpiled?
Example code
Let's say I have production classes like this that are written in ES6 syntax
//src/greeter.js
export default class Greeter {
  sayHello() {
    return 'Hello World';
  }
}
and Babel is configured to transpile like so (.babelrc):
{
  "presets": ["es2015"]
}
and then there's some test code
//features/step_definitions/greeter_steps.js
import Greeter from '../../src/greeter'; //Causes greeter.js to be transpiled
import expect from 'expect';

var stepWrapper = function() {
  //Does Babel try to transpile this code too?
  this.Given(/^a greeter$/, function() {
    this.greeter = new Greeter();
  });
  this.When(/^I ask it for a general greeting$/, function() {
    this.greeting = this.greeter.sayHello();
  });
  this.Then(/^it should greet the entire world$/, function() {
    expect(this.greeting).toEqual('Hello World');
  });
};
module.exports = stepWrapper;
and all of that runs on Node like so
cucumberjs --compiler js:babel-core/register
Example code is available here, if that is helpful.
I made a hack to node_modules/babel-register/lib/node.js to do some logging like so
function compile(filename) {
  var result = void 0;

  var opts = new _babelCore.OptionManager().init((0, _extend2.default)({ sourceRoot: _path2.default.dirname(filename) }, (0, _cloneDeep2.default)(transformOpts), { filename: filename }));

  var cacheKey = (0, _stringify2.default)(opts) + ":" + babel.version;

  var env = process.env.BABEL_ENV || process.env.NODE_ENV;

  console.log('[babel-register::compile] filename=' + filename + '\n'); //Added logging here

  if (env) cacheKey += ":" + env;

  if (cache) {
    var cached = cache[cacheKey];
    if (cached && cached.mtime === mtime(filename)) {
      result = cached;
    }
  }
  ...
}
which then reports that test and production code are at least passing through Babel on some level
$ npm t
> cucumber-js-babel#1.0.0 test /Users/krull/git/sandbox/node/cucumber-js-babel
> cucumberjs --compiler js:babel-core/register
[babel-register::compile] filename=.../node/cucumber-js-babel/features/step_definitions/greeter_steps.js
[babel-register::compile] filename=.../node/cucumber-js-babel/src/greeter.js
...test results...
However, I'm hoping for a better solution that
works by some means of plugins and/or configuration, instead of monkey patching
better distinguishes which files are actually being transpiled, and which ones pass through Babel without modification
Because of this:
cucumberjs --compiler js:babel-core/register
...babel is invoked for both your test and regular source code. Keep in mind that in Node, the only way to import JS is through require, so babel-register will obviously always be invoked. Of course, what Babel does depends on its configuration, but most likely you have a simple configuration where all files loaded through require, except those under node_modules, are transpiled.
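If you want the logging without patching files under node_modules, one option is to wrap the require hook that babel-register installs, from a bootstrap file of your own. This is a sketch, not an official Babel feature; it logs every file that passes through the hook, while whether Babel actually transforms a given file still depends on your ignore/only configuration:

// log-register.js - a minimal sketch; point cucumberjs at this file
// (e.g. --compiler js:./log-register) instead of babel-core/register.
require('babel-core/register'); // installs Babel's require hook first

var hooked = require.extensions['.js'];
require.extensions['.js'] = function (module, filename) {
  // Every .js file loaded from here on passes through this wrapper
  console.log('[babel hook] ' + filename);
  return hooked(module, filename);
};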

Packaging-up Browser/Server CommonJS modules with dependancies

Let's say I'm writing a module in JavaScript which can be used in both the browser and on the server (with Node). Let's call it Module. And let's say that Module would benefit from methods in another module called Dependancy. Both of these modules have been written to be used in both environments, à la CommonJS style:
module.js
if (typeof module !== 'undefined' && module.exports)
  module.exports = Module; /* server */
else
  this.Module = Module; /* browser */
dependancy.js
if (typeof module !== 'undefined' && module.exports)
  module.exports = Dependancy; /* server */
else
  this.Dependancy = Dependancy; /* browser */
Obviously, Dependancy can be used straight out of the box in a browser. But if Module contains a var dependancy = require('dependancy'); directive, it becomes more of a hassle to 'maintain' the module.
I know that I could perform a global check for Dependancy within Module, like this:
var dependancy = this.Dependancy || require('dependancy');
But that means my Module has two added requirements for browser installation:
the user must include the dependency.js file as a <script> in their document
and the user must make sure this script is loaded before module.js
Adding those two requirements throws out the idea of an easy-going modular framework like CommonJS.
The other option for me is that I include a second, compiled script in my Module package with the dependency.js bundled using browserify. I then instruct users who are using the script in the browser to include this script, while server-side users use the un-bundled entry script outlined in the package.json. This is preferable to the first way, but it requires a pre-compilation process which I would have to run every time I changed the library (for example, before uploading to GitHub).
Is there any other way of doing this that I haven't thought of?
The two answers currently given are both very useful, and have helped me to arrive at my current solution. But, as per my comments, they don't quite satisfy my particular requirements of both portability vs ease-of-use (both for the client and the module maintainer).
What I found, in the end, was a particular flag in the browserify command line interface that can bundle the modules and expose them as global variables AND be used within RequireJS (if needed). Browserify (and others) call this Universal Module Definition (UMD). Some more about that here.
By passing the --standalone flag in a browserify command, I can set my module up for UMD easily.
So...
Here's the package.json for Module:
{
  "name": "module",
  "version": "0.0.1",
  "description": "My module that requires another module (dependancy)",
  "main": "index.js",
  "scripts": {
    "bundle": "browserify --standalone module index.js > module.js"
  },
  "author": "shennan",
  "devDependencies": {
    "dependancy": "*",
    "browserify": "*"
  }
}
So, when at the root of my module, I can run this in the command line:
$ npm run-script bundle
This bundles up the dependencies into one file and exposes them as per the UMD methodology. That means I can bootstrap the module in three different ways:
NodeJS
var Module = require('module');
/* use Module */
Browser Vanilla
<script src="module.js"></script>
<script>
  var Module = module;
  /* use Module */
</script>
Browser with RequireJS
<script src="require.js"></script>
<script>
  requirejs(['module.js'], function (Module) {
    /* use Module */
  });
</script>
Thanks again for everybody's input. All of the answers are valid and I encourage everyone to try them all as different use-cases will require different solutions.
Of course you can use the same module, with its dependency, on both sides. You just need to define it properly. This is the pattern I use:
(function (name, definition){
  if (typeof define === 'function'){ // AMD
    define(definition);
  } else if (typeof module !== 'undefined' && module.exports) { // Node.js
    module.exports = definition();
  } else { // Browser
    var theModule = definition(), global = this, old = global[name];
    theModule.noConflict = function () {
      global[name] = old;
      return theModule;
    };
    global[name] = theModule;
  }
})('Dependency', function () {
  // return the module's API
  return {
    'key': 'value'
  };
});
This is just a very basic sample - you can return a function, instantiate a function, or do whatever you like. In my case I'm returning an object.
Now let's say this is the Dependency class. Your Module class should look pretty much the same, but it should have a dependency on Dependency, like:
define(function (require, exports, module) {
  var dependency = require('Dependency');
});
In RequireJS this is called Simplified CommonJS Wrapper: http://requirejs.org/docs/api.html#cjsmodule
Because there is a require statement at the beginning of your code, it will be detected as a dependency, and therefore it will either be lazy-loaded or, if you run the optimizer, marked as a dependency early on (it converts define(definition) to define(['Dependency'], definition) automatically).
The only problem here is keeping the same paths to the files. Keep in mind that nested requires (inside if-else) won't work in RequireJS (read the docs), so I had to do something like:
var dependency;
try {
  dependency = require('./Dependency'); // node module in the same folder
} catch(err) { // it's not node
  try {
    dependency = require('Dependency'); // requirejs
  } catch(err) { }
}
This worked perfectly for me. It's a bit tricky with all those paths, but at the end of the day, you get your two separate modules in different files, usable on both ends without any checks or hacks - they have all their dependencies and work like a charm :)
Take a look at the webpack bundler.
You can write your module and export it via module.exports. You can then use it as-is on the server, and build it with webpack for the browser. Using a configuration file is the best option:
module.exports = {
  entry: "./myModule",
  output: {
    path: "dist",
    filename: "myModule.js",
    library: "myModule",
    libraryTarget: "var"
  }
};
This will take myModule and export it to the myModule.js file. Inside, the module will be assigned to a var (the libraryTarget flag) named myModule (the library flag).
It can be exported as a CommonJS module, var, this, or function.
Since the bundling step is a Node script, the flag values can be set programmatically.
Take a look at the externals flag. It is used when you want special behavior for some dependencies. For example, if you are creating a React component, you want to require React in your module, but not bundle it for the web, because it is already on the page.
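For instance, an externals entry along these lines (a sketch; the page-global name React is an assumption about the host page) keeps require('react') out of the bundle:

module.exports = {
  entry: "./myComponent",
  output: { /* as above */ },
  externals: {
    // require('react') is left out of the bundle and resolved
    // at runtime from the page-global React variable
    react: "React"
  }
};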
Hope it is what you are looking for.

Why do I see "define not defined" when running a Mocha test with RequireJS?

I am trying to understand how to develop stand-alone JavaScript code. I want to write JavaScript code with tests and modules, running from the command line. So I have installed node.js and npm along with the libraries requirejs, underscore, and mocha.
My directory structure looks like this:
> tree .
.
├── node_modules
├── src
│   └── utils.js
└── test
└── utils.js
where src/utils.js is a little module that I am writing, with the following code:
> cat src/utils.js
define(['underscore'], function () {
  "use strict";
  if ('function' !== typeof Object.beget) {
    Object.beget = function (o) {
      var f = function () {};
      f.prototype = o;
      return new f();
    };
  }
});
and test/utils.js is the test:
> cat test/utils.js
var requirejs = require('requirejs');
requirejs.config({nodeRequire: require});
requirejs(['../src/utils'], function(utils) {
  suite('utils', function() {
    test('should always work', function() {
      assert.equal(1, 1);
    })
  })
});
which I then try to run from the top level directory (so mocha sees the test directory):
> mocha
node.js:201
throw e; // process.nextTick error, or 'error' event on first tick
^
Error: Calling node's require("../src/utils") failed with error: ReferenceError: define is not defined
at /.../node_modules/requirejs/bin/r.js:2276:27
at Function.execCb (/.../node_modules/requirejs/bin/r.js:1872:25)
at execManager (/.../node_modules/requirejs/bin/r.js:541:31)
...
So my questions are:
Is this the correct way to structure code?
Why is my test not running?
What is the best way to learn this kind of thing? I am having a hard time finding good examples with Google.
Thanks...
[sorry - momentarily posted results from wrong code; fixed now]
PS I am using requirejs because I also want to run this code (or some of it) from a browser, later.
Update / Solution
Something that is not in the answers below is that I needed to use mocha -u tdd for the test style above. Here is the final test (which also requires assert) and its use:
> cat test/utils.js
var requirejs = require('requirejs');
requirejs.config({nodeRequire: require});
requirejs(['../src/utils', 'assert'], function(utils, assert) {
  suite('utils', function() {
    test('should always work', function() {
      assert.equal(1, 1);
    })
  })
});
> mocha -u tdd
.
✔ 1 tests complete (1ms)
The reason your test isn't running is that src/utils.js is not a valid Node.js module.
According to the RequireJS documentation, in order to co-exist with Node.js and the CommonJS require standard, you need to add a bit of boilerplate to the top of your src/utils.js file so RequireJS's define function is loaded.
However, since RequireJS was designed to be able to require "classic" web browser-oriented source code, I tend to use the following pattern with my Node.js libraries that I also want running in the browser:
if(typeof require != 'undefined') {
  // Require server-side-specific modules
}

// Insert code here

if(typeof module != 'undefined') {
  module.exports = whateverImExporting;
}
This has the advantage of not requiring an extra library for other Node.js users and generally works well with RequireJS on the client.
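Applied to the src/utils.js from the question, the pattern might look like this (a sketch under assumptions: the underscore dependency is dropped because the snippet never uses it, and the exported name is just a guess):

// src/utils.js without the AMD wrapper (sketch)
if ('function' !== typeof Object.beget) {
  Object.beget = function (o) {
    var F = function () {};
    F.prototype = o;
    return new F();
  };
}

if (typeof module != 'undefined') {
  module.exports = Object.beget; // hypothetical export
}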
Once you get your code running in Node.js, you can start testing. I personally still prefer expresso over mocha, even though Mocha is its successor test framework.
The Mocha documentation is lacking on how to set this stuff up, and it's perplexing to figure out because of all the magic tricks it does under the hood.
I found the keys to getting browser files that use require.js to work in Mocha under Node. First, Mocha has to have the files added to its suites with addFile:
mocha.addFile('lib/tests/Main_spec_node');
And second, use beforeEach with the optional callback to load your modules asynchronously:
describe('Testing "Other"', function(){
  var Other;
  beforeEach(function(done){
    requirejs(['lib/Other'], function(_File){
      Other = _File;
      done(); // #1 Other Suite will run after this is called
    });
  });

  describe('#1 Other Suite:', function(){
    it('Other.test', function(){
      chai.expect(Other.test).to.equal(true);
    });
  });
});
I created a bootstrap for how to get this all working: https://github.com/clubajax/mocha-bootstrap
You are trying to run JS modules designed for browsers (AMD) in the backend, where they may not work (modules there are loaded the CommonJS way). Because of this, you will face two issues:
define is not defined
0 tests run
In the browser, define will be defined. It is set when you require something with requirejs. But Node.js loads modules the CommonJS way, where define is not defined. It will, however, be defined once we require with requirejs!
This means that we are now requiring code asynchronously, which brings the second problem: async execution.
https://github.com/mochajs/mocha/issues/362
Here is a full working example.
Note that I had to configure requirejs (AMD) to load the modules; we are not using require (Node/CommonJS) to load our modules.
> cat $PROJECT_HOME/test/test.js
var requirejs = require('requirejs');
var path = require('path');
var project_directory = path.resolve(__dirname, '..');

requirejs.config({
  nodeRequire: require,
  paths: {
    'widget': project_directory + '/src/js/some/widget'
  }
});

describe("Mocha needs one test in order to wait on requirejs tests", function() {
  it('should wait for other tests', function(){
    require('assert').ok(true);
  });
});

requirejs(['widget/viewModel', 'assert'], function(model, assert){
  describe('MyViewModel', function() {
    it("should be 4 when 2", function () {
      assert.equal(model.square(2), 4);
    });
  });
});
And for the module that you want to test:
> cat $PROJECT_HOME/src/js/some/widget/viewModel.js
define(["knockout"], function (ko) {
function VideModel() {
var self = this;
self.square = function(n){
return n*n;
}
}
return new VideModel();
})
Just in case David's answer was not clear enough, I just needed to add this:
if (typeof define !== 'function') {
  var define = require('amdefine')(module);
}
To the top of the JS file where I use define, as described in the RequireJS docs ("Building node modules with AMD or RequireJS") and, in the same folder, add the amdefine package:
npm install amdefine
This creates the node_modules folder with the amdefine module inside.
I don't use requirejs so I'm not sure what that syntax looks like, but this is what I do to run code both within node and the browser:
For imports, determine if we are running in node or the browser:
var root = typeof exports !== "undefined" && exports !== null ? exports : window;
Then we can grab any dependencies correctly (they will either be available already if in the browser or we use require):
var foo = root.foo;
if (!foo && (typeof require !== 'undefined')) {
  foo = require('./foo');
}

var Bar = function() {
  // do something with foo
};
And then any functionality that needs to be used by other files, we export it to root:
root.bar = Bar;
As for examples, GitHub is a great source. Just go and check out the code for your favorite library to see how they did it :) I used mocha to test a javascript library that can be used in both the browser and node. The code is available at https://github.com/bunkat/later.

requireJS an entire folder

Is it possible to "require" an entire folder using RequireJS?
For example, I have a behaviors folder with a ton of behavior JS files. I'd really like to be able to simply use require(['behaviors/*'], function() {...}); to load everything in that folder, rather than having to keep that list up to date. Once compressed and optimized, I'd have all those files lumped together, but for development it's easier to work with them individually.
JavaScript in the browser has no filesystem access, so it can't scan a directory for files. If you are building your app with a scripting language like PHP or Ruby, you could write a script that scans the directory and adds the file names to the require() call.
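The same idea works as a small Node build step. Here is a minimal sketch (the file and folder names are just examples) that writes out a loader file listing every behavior module:

// generate-behaviors.js - hypothetical build step, run with: node generate-behaviors.js
var fs = require('fs');

// Collect module IDs for every .js file in behaviors/
var files = fs.readdirSync('behaviors')
  .filter(function (f) { return /\.js$/.test(f); })
  .map(function (f) { return 'behaviors/' + f.replace(/\.js$/, ''); });

// Emit a file that requires them all
fs.writeFileSync('behaviors-all.js',
  'require(' + JSON.stringify(files) + ', function () { /* all behaviors loaded */ });\n');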
I don't know if I can recommend this approach anymore. I think the more explicit way is to manually "require"/"export" the functionality you need. The exception, I think, is if you have a "namespace" of files that you want exported; see "Babel and ES6 Module Import Declarations (plugin-export-namespace-from)" below.
These solutions also assume that you have a meaningful file structure, where file names become part of that wildcard "require" definition.
However, if you still need to do this there are a few existing tools and methods that might provide the behavior that you're looking for.
Possible Solutions
Babel and ES6 Module Import Declarations (plugin-export-namespace-from)
Have a setup that is ES6 compliant.
You need to update your .babelrc file to include babel-plugin-proposal-export-namespace-from.
Use export namespace plugin by writing syntax like the following:
common/index.js
export * from './common/a'; // export const a = false;
export * from './common/b'; // export const b = true;
main.js
import { a, b } from './common';
console.log(a); // false
console.log(b); // true
Babel and ES6 Module Import Declarations (plugin-wildcard)
Have a setup that is ES6 compliant.
You need to update your .babelrc file to include babel-plugin-wildcard.
Use wildcard namespace plugin by writing syntax like the following:
main.js
import { a, b } from './common/*'; // imports './common/a.js' and './common/b.js'
console.log(a); // false
console.log(b); // true
RequireJS (Now Outdated)
Download and install require-wild: npm install require-wild
Configure the declaration as follows:
grunt.initConfig({
  requireWild: {
    app: {
      // Input files to look for wildcards (require|define)
      src: ["./**/*.js"],
      // Output file contains generated namespace modules
      dest: "./namespaces.js",
      // Load your require config file used to find baseUrl - optional
      options: { requireConfigFile: "./main.js" }
    }
  }
});

grunt.loadNpmTasks("require-wild");
grunt.registerTask('default', ['requireWild']);
Then run the grunt task. Your file will be generated. Modify your setup to load namespaces.js
require(['namespaces'], function () { ... });
This now allows modules under src to use glob pattern matching for dependencies.
require(['behaviors/**/*'], function (behaviors) { });
I know this is old, but I'd like to share my solution:
For this solution you need jQuery.
1) Create a bash script that lists all the JS files in "MyDirectory/" and saves the list to "directoryContents.txt":
#!/bin/bash
# Find all the files in that directory...
for file in $( find MyDirectory/ -type f -name "*.js" )
do
  fileClean=${file%.js} # Must remove .js from the end!
  echo -n "$fileClean " >> MyDirectory/directoryContents.txt
done
File will look like this:
MyDirectory/FirstJavascriptFile MyDirectory/SecondJavascriptFile
MyDirectory/ThirdJavascriptFile
Problem with my script! It puts an extra " " at the end, which messes things up! Make sure to remove the excess space at the end of directoryContents.txt.
2) Then, in your client-side JS code:
do a "GET" request to retrieve the text file
for each entry (split by the space), 'require' that file:
$.get( "MyDirectory/directoryContents.txt", {}, function( data ) {
  var allJsFilesInFolder = data.split(" ");
  for(var a=0; a<allJsFilesInFolder.length; a++)
  {
    require([allJsFilesInFolder[a]], function(jsConfig)
    {
      //Done loading this one file
    });
  }
}, "text");
I was having a problem with this code not finishing before my other code, so here's my extended answer:
define([''], function() {
  return {
    createTestMenu: function()
    {
      this.loadAllJSFiles(function(){
        //Here ALL those files you need are loaded!
      });
    },
    loadAllJSFiles: function(callback)
    {
      $.get( "MyDirectory/directoryContents.txt", {}, function( data ) {
        var allJsFilesInFolder = data.split(" ");
        var currentFileNum = 0;
        for(var a=0; a<allJsFilesInFolder.length; a++)
        {
          require([allJsFilesInFolder[a]], function(jsConfig)
          {
            currentFileNum++;
            //If it's the last file that needs to be loaded, run the callback.
            if (currentFileNum==allJsFilesInFolder.length)
            {
              console.log("Done loading all configuration files.");
              if (typeof callback != "undefined"){callback();}
            }
          });
        }
      }, "text");
    }
  }
});
What I ended up doing: every time my Node server boots, it runs the bash script, populating directoryContents.txt. Then my client side just reads directoryContents.txt for the list of files and requires each one in that list.
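If you use npm scripts, the regeneration step could be wired in like this (a hypothetical setup; the script and file names are assumptions):

"scripts": {
  "prestart": "bash listDirectory.sh",
  "start": "node server.js"
}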
Hope this helps!
There isn't really a way to do this on the fly (that I know of).
There are a few workarounds though:
Use grunt and concat and then just require that behemoth... I know, kinda sucky.
What I think is a better solution... use a require hierarchy like so:
require(['js/controllers/init'], function(ctrls){
  ctrls(app, globals);
});

// /js/controllers/init.js
define(['js/controllers/index', 'js/controllers/posts'], function(index, posts){
  return function protagonist(app, globals){
    var indexModule = index(app, globals);
    var postsModule = posts(app, globals);
    return app || someModule;
  };
});

// /js/controllers/index.js
define(function(){
  return function protagonist(app, globals){
    function method1(){}
    function method2(){}
    return {
      m1: method1,
      m2: method2
    };
  };
});
Note the "protagonist" function. It allows you to initialize modules before use, so you can now pass in a 'sandbox' -- in this case app and globals.
Realistically, you wouldn't have /js/controllers/index.js... It should probably be something like /js/controllers/index/main.js or /js/controllers/index/init.js, so that there is a directory adjacent to (a sibling of) /js/controllers/init.js called "index". This makes your modules scalable to a given interface -- you can simply swap modules out and keep the interface the same.
Hope this helps! Happy coding!
I wrote a library to solve this problem. Eventually someone else came along and improved it; here it is:
https://github.com/smartprocure/directory-metagen
You can use my lib with Gulp or whatever - it generates metadata for your project, and RequireJS can use that metadata to require the desired files from the filesystem.
Using this lib will produce a RequireJS module that looks something like this:
define(
  [
    "text!app/templates/dashboardTemplate.ejs",
    "text!app/templates/fluxCartTemplate.ejs",
    "text!app/templates/footerTemplate.ejs",
    "text!app/templates/getAllTemplate.ejs",
    "text!app/templates/headerTemplate.ejs",
    "text!app/templates/homeTemplate.ejs",
    "text!app/templates/indexTemplate.ejs",
    "text!app/templates/jobsTemplate.ejs",
    "text!app/templates/loginTemplate.ejs",
    "text!app/templates/overviewTemplate.ejs",
    "text!app/templates/pictureTemplate.ejs",
    "text!app/templates/portalTemplate.ejs",
    "text!app/templates/registeredUsersTemplate.ejs",
    "text!app/templates/userProfileTemplate.ejs"
  ],
  function(){
    return {
      "templates/dashboardTemplate.ejs": arguments[0],
      "templates/fluxCartTemplate.ejs": arguments[1],
      "templates/footerTemplate.ejs": arguments[2],
      "templates/getAllTemplate.ejs": arguments[3],
      "templates/headerTemplate.ejs": arguments[4],
      "templates/homeTemplate.ejs": arguments[5],
      "templates/indexTemplate.ejs": arguments[6],
      "templates/jobsTemplate.ejs": arguments[7],
      "templates/loginTemplate.ejs": arguments[8],
      "templates/overviewTemplate.ejs": arguments[9],
      "templates/pictureTemplate.ejs": arguments[10],
      "templates/portalTemplate.ejs": arguments[11],
      "templates/registeredUsersTemplate.ejs": arguments[12],
      "templates/userProfileTemplate.ejs": arguments[13]
    };
  });
You can then require modules in your front-end like so:
var footerView = require("app/js/jsx/standardViews/footerView");
However, as you can see, this is too verbose, so the magic way is as follows:
name the dependency above as allViews!
now you can do:
var allViews = require('allViews');
var footerView = allViews['standardViews/footerView'];
There are two advantages to requiring directories whole:
(1) in production, with the r.js optimizer, you can point to one dependency (module A) and it can then easily trace all of A's dependencies that represent an entire directory
(2) in development, you can require whole directories up front and then use synchronous syntax to require dependencies, because you know they have already been loaded
enjoy "RequireJS-Metagen"
https://github.com/smartprocure/directory-metagen
https://www.npmjs.com/package/requirejs-metagen
https://github.com/ORESoftware/requirejs-metagen
