node.js / backbone.js application load configuration file settings

I have a node / backbone / sails project that needs to have deployment specific configuration files loaded.
So in the sails portion of the app, I can place a config file in
myapp/config/mysettings.js
and reference it as sails.config.mysettings.foo. This works as expected.
But in the backbone portion of the app, I cannot for the life of me figure out how to reference that same file (the ... below are snips for brevity).
define([
  'jquery',
  'async',
  ...,
  '/config/mysettings.js'   // <-- the settings file I'm trying to reference
], function ($, async, ..., mysettings) {
  relevantAjaxFunction: function() {
    ...
    fail:
      console.log(mysettings.foo);
This logs undefined to the console. What's the correct way to reference an application-wide settings file like this? I've looked and cannot find anything, which makes me think it's either super obvious or I'm phrasing the question wrong.

Sails doesn't make your config files publicly browsable. That would be...bad.
You have a few options here that I can think of:
Make a symlink inside your assets folder that points to the config file. The tricky part here is that the symlink will be copied as-is into your .tmp/public folder, so if you use a relative path in your symlink, it needs to be the relative path from .tmp/public to config/mysettings.js. Or you could just use an absolute path in the symlink.
Make a custom route that just streams the file using fs:
require('fs')
  .createReadStream(sails.config.paths.config + "/mysettings.js")
  .pipe(res);
although you should really use the path module for determining the file path, and wrap the whole thing in a try/catch just in case...
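For example, a hedged sketch of that second option (the route, controller, and action names here are placeholders, not part of the original answer):

// config/routes.js (sketch)
// 'GET /mysettings.js': 'SettingsController.serve'

// api/controllers/SettingsController.js (sketch)
var fs = require('fs');
var path = require('path');

module.exports = {
  serve: function (req, res) {
    try {
      // build the path with the path module rather than string concatenation
      var filePath = path.join(sails.config.paths.config, 'mysettings.js');
      res.set('Content-Type', 'application/javascript');
      fs.createReadStream(filePath).pipe(res);
    } catch (e) {
      return res.serverError(e);
    }
  }
};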
Use a policy to add the config settings in a local variable for every request, for example:
module.exports = function myConfigPolicy(req, res, next) {
  res.locals.mysettings = sails.config.mysettings;
  return next();
};
and in your views/layout.ejs do something like:
<script type="text/javascript">
var mySettings = <%= JSON.stringify(mysettings) %>;
</script>
Adjusting for your template engine of choice. Of course this doesn't use AMD to load the config at runtime, so if that's a concern go with choice #1 or #2!
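If you go the policy route, you would also need to register the policy so it runs for the relevant requests; a minimal sketch, assuming the policy above lives at api/policies/myConfigPolicy.js:

// config/policies.js (sketch)
module.exports.policies = {
  // apply to every controller/action, or scope it down as needed
  '*': ['myConfigPolicy']
};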

Related

Next.js - best way to serve static JS from a node module's "dist" folder

I'm working with an application that uses Tesseract (OCR) to read text from images.
I would like to take some JS files from node_modules/tesseract.js/dist and make them downloadable in the browser.
I know I can just copy the files to ./public and next.js will serve them statically from there, but then if I update my version of Tesseract I may need to update those files as well, so maintenance becomes a problem.
My 1st thought is to customize my webpack config to copy the files from node_modules/tesseract.js/dist to ./public/tesseract (or something like that). That would make sure the files get re-copied if I update Tesseract. But I'm not particularly savvy with webpack and haven't figured out how to do that yet.
My 2nd thought was to "proxy" the retrieval of the JS file's content and just make the content available as a "page" in next.js (but this seems super hacktastic).
I feel like this is something that shouldn't be so complicated ... but I've not been able to figure it out myself yet.
Thanks in advance!
Yup agreed, updating your server to serve a node_modules path sounds a bit dangerous.
I personally would just copy over these files with webpack like you mentioned.
Here are the Next.js docs on how to set up a custom webpack config.
next.config.js
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  webpack: (config) => {
    // append the CopyPlugin to copy the files to your public dir
    config.plugins.push(
      new CopyPlugin({
        patterns: [
          { from: "node_modules/tesseract.js/dist", to: "public/" },
        ],
      })
    );

    // Important: return the modified config
    return config;
  },
};
I purposefully didn't include the public/tesseract path; I'm not sure whether CopyPlugin will automatically create missing directories if they don't exist at build time.

How to manage configuration for Webpack/Electron app?

I am using Webpack 2 and Electron to build a Node.js application on Mac.
In the root of my project I have a directory 'data' where I store configuration in JSON files like data/configurations/files.json (in practice there are different files with dynamic names).
After packing with webpack, though, when I call fs.readdirSync(remote.app.getAppPath()) to list the files in the root, I only get the packed ones: [ "default_app.js", "icon.png", "index.html", "main.js", "package.json", "renderer.js" ]
Reading path.join(remote.app.getAppPath(), 'data/tests/groups.json') with fs.readFileSync leads to Error: ENOENT, data/tests/groups.json not found in /Users/myuser/myproject/node_modules/electron/dist/Electron.app/Contents/Resources/default_app.asar. So it seems that the whole data folder is not picked up by webpack.
My webpack config uses json-loader and I haven't found any documentation mentioning anything special about including specific files or JSONs. Or do I have to reference the JSON files differently in my code, since they might be packed under main.js?
What is the best practice for Electron/Webpack for managing JSON config files? Am I doing something wrong when webpacking the project?
My project is based on https://github.com/SimulatedGREG/electron-vue, using webpack/electron/vue.
The Webpack Misconception
One thing to understand upfront is that webpack does not bundle files required through fs or other modules that ask for a path to a file. These types of assets are commonly labeled Static Assets, as they are not bundled in any way. webpack will only bundle files that are required or imported (ES6). Furthermore, depending on your webpack configuration, your project root may not always match what is output within your production builds.
Based on the electron-vue documentation's Project Structure/File Tree, you will find that only webpack bundles and the static/ directory are made available in production builds. electron-vue also has a handy __static global variable that can provide a path to that static/ folder within both development and production. You can use this variable similar to how one would with __dirname and path.join to access your JSON files, or really any files.
A Solution to Static Assets
It seems the current version of the electron-vue boilerplate already has this solved for you, but I'm going to describe how this is set up with webpack as it applies not only to JSON files but to any webpack + electron setup. The following solution assumes your webpack build outputs to a separate folder (we'll use dist/ in this case), that your webpack configuration is located in your project's root directory, and that process.env.NODE_ENV is set to development during development.
The static/ directory
During development we need a place to store our static assets, so let's place them in a directory called static/. Here we can put files, such as JSONs, that we know we will need to read with fs or some other module that requires a full path to the file.
Now we need to make that static/ assets directory available in production builds.
But webpack isn't handling this folder at all, what can we do?
Let's use the simple copy-webpack-plugin. Within our webpack configuration file we can add this plugin when building for production and configure it to copy the static/ folder into our dist/ folder.
new CopyWebpackPlugin([
  {
    from: path.join(__dirname, '/static'),
    to: path.join(__dirname, '/dist/static'),
    ignore: ['.*']
  }
])
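Note that the array-style constructor above is the older copy-webpack-plugin API. If you're on a newer version of the plugin (v6+), the equivalent configuration uses a patterns object, roughly as follows (option names per the plugin's newer docs; double-check against the version you have installed):

new CopyWebpackPlugin({
  patterns: [
    {
      from: path.join(__dirname, '/static'),
      to: path.join(__dirname, '/dist/static'),
      // `ignore` now lives under globOptions and takes glob patterns
      globOptions: { ignore: ['**/.*'] }
    }
  ]
})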
Okay so the assets are in production, but how do I get a path to this folder in both development and production?
Creating a global __static variable
What's the point of making this __static variable?
Using __dirname is not reliable in webpack + electron setups. During development, __dirname could refer to a directory that exists within your src/ files. In production, since webpack bundles our src/ files into one script, the path you formed to get to static/ doesn't exist anymore. Furthermore, those files you put inside src/ that were not required or imported never make it into your production build.
Given the project-structure differences between development and production, getting a path to static/ becomes highly annoying during development if you always have to check process.env.NODE_ENV.
So let's simplify this by creating one source of truth.
Using the webpack.DefinePlugin we can set our __static variable only in development to yield a path that points to <projectRoot>/static/. Depending on whether you have multiple webpack configurations, you can apply this for both a main and a renderer process configuration.
new webpack.DefinePlugin({
  '__static': `"${path.join(__dirname, '/static').replace(/\\/g, '\\\\')}"`
})
In production, we need to set the __static variable manually in our code. Here's what we can do...
index.html (renderer process)
<!-- Set `__static` path to static files in production -->
<script>
if (process.env.NODE_ENV !== 'development') window.__static = require('path').join(__dirname, '/static').replace(/\\/g, '\\\\')
</script>
<!-- import webpack bundle -->
main.js (main process)
// Set `__static` path to static files in production
if (process.env.NODE_ENV !== 'development') {
  global.__static = require('path').join(__dirname, '/static').replace(/\\/g, '\\\\')
}
// rest of application code below
Now start using your __static variable
Let's say we have a simple JSON file we need to read with fs, here's what we can accomplish now...
static/someFile.json
{"foo":"bar"}
someScript.js (renderer or main process)
import fs from 'fs'
import path from 'path'
const someFile = fs.readFileSync(path.join(__static, '/someFile.json'), 'utf8')
console.log(JSON.parse(someFile))
// => { foo: 'bar' }
Conclusion
webpack was made to bundle assets that are required or imported into one nice bundle. Assets referenced with fs or other modules that need a file path are considered Static Assets, and webpack does not directly handle these. Using copy-webpack-plugin and webpack.DefinePlugin we can set up a reliable __static variable that yields a path to our static/ assets directory in both development and production.
To end, I personally haven't seen any other webpack + electron boilerplates handle this situation as it isn't a very common situation, but I think we can all agree that having one source of truth to a static assets directory is a wonderful approach to alleviate developer fatigue.
I think the confusion (if there is any) might come from the fact that webpack not only "packs" and embeds things (code, etc.) but also processes content with its plugins.
The html plugin is a good example, as it simply generates an html file at build time.
And how does this relate to the config file issue?
Well, it depends on how you are "requiring" the config file and what plugin you are using to process that content.
You could be embedding it, or simply loading it as text from the file system or over HTTP, or something else...
In the case of a config file, I guess you want it to be parsed at runtime;
otherwise it's just a fancy way of hardcoding values that you might as well type into your source code as plain objects.
And in that case I think webpack adds little to nothing to the runtime needs, as there is nothing to pre-pack for later use,
so instead of "require"-ing it, I would read it from the file system, with something like:
// read it and parse it, relative to appPath/cwd
const config = JSON.parse(
  fs.readFileSync(
    path.join(app.getAppPath(), "config.json"),
    "utf-8"
  )
);
// note: look at fs-extra, it does all that minus the app path, plus async
Electron will then read it from the file system, or if you use Electron's require it will read from the asar archive or the file system (in that order, if I remember correctly; I could be wrong).
Webpack's design philosophy is focused around a very simple yet powerful concept:
Transform and bundle everything that is actually used by your app.
To achieve that, webpack introduces the powerful concept of a dependency graph, which can manage virtually any kind of dependency (not only *.js modules) by means of so-called loaders.
The purpose of a loader is to transform your dependency in a way that makes the statement import smth from 'your_dependency' meaningful. For instance, json-loader calls JSON.parse(...) while loading a *.json file and returns a configuration object. Therefore, in order to take advantage of webpack's dependency resolution system for managing JSONs, start by installing json-loader:
$ npm install --save-dev json-loader
Then modify your webpack.config.js in the following way:
module.exports = {
  ...
  module: {
    rules: [
      { test: /\.json$/, use: 'json-loader' }
    ]
  }
  ...
};
At this point webpack should be able to resolve your JSON dependencies by their absolute paths, so the following should work (I assume here that you have a subdirectory config of your root context dir, containing file sample.json):
import sampleCfg from './config/sample.json';
But importing physical paths doesn't lead to elegant, robust and maintainable code (think of testability, for example), so it is considered good practice to add aliases to your webpack.config.js to abstract away the physical config/ folder from your import statements:
module.exports = {
  ...
  resolve: {
    alias: {
      cfg: './config'
    }
  }
  ...
}
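One caveat worth adding (my note, not part of the original answer): webpack's resolve.alias entries are usually given absolute paths, so in practice you may need something like the following instead of a relative string:

// webpack.config.js (fragment); assumes `const path = require('path')` at the top
resolve: {
  alias: {
    cfg: path.resolve(__dirname, 'config')
  }
}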
Then you'll be able to import your JSON config like that:
import sampleCfg from 'cfg/sample.json'
Finally, if you use SimulatedGREG/electron-vue Electron project template (as you mentioned in your post), then you have three webpack configuration files:
.electron-vue/webpack.web.config.js - use this config file if you use this template just for ordinary web development (i.e. not for building native Electron projects);
.electron-vue/webpack.main.config.js - use this file to configure webpack module that will run inside Electron's main process;
.electron-vue/webpack.renderer.config.js - use this file for Electron's renderer process.
You can find more information on main and renderer processes in the official Electron documentation.

Run intern functional tests against static web app on localhost

I have a static web app. Html, JS (requirejs modules), and some CSS.
Currently the 'serverUrl' is being set through a properties module, which I can 'require' and use values from:
define({
  serverUrl: 'https://some.api/path/'
})
I have Intern setup to run functional tests in the browser using src/index.html as the entry point.
return this.remote
.get(require.toUrl('src/index.html'))
Given the serverUrl is hardcoded in the properties file, I'm trying to find a way to run tests against the web app where serverUrl is pointing to localhost:1234/someFakeServer so I can test error scenarios and such.
I've trawled the web but can't find anyone doing anything remotely similar, which makes me think I'm doing something obviously wrong. There are solutions for Node apps using config modules, but because I never 'start' my web app (it's just files), these won't work for me.
Some solutions I've thought about, but can't figure out how to achieve:
Intern is proxying the files on :9000, so if I can somehow 'build' an application with another properties file pointing to localhost, all is good. But I've no idea how to do that - I've looked at webpack and similar but they don't seem to do what I want.
I've looked at Intern's 'Setup' config item, which allows a function to be run before the tests are started, so I thought about modifying the properties file in there, but that seems too hacky and I'm not sure how I'd put it back...
Assuming the properties file is accessible to Intern, you could simply have Intern load the properties file and pull the server URL out of it. If you have multiple potential properties files, the one being used can be set in the Intern config or passed in as a custom command line variable (which would be used to set a property in the Intern config). The test module can get the name of the properties file from Intern's config and then load the relevant file. It could look something like this (untested):
// intern config
define({
  // ...
  propertiesFile: 'whatever',
})

// test file
define([ 'intern', ... ], function (intern, ...) {
  registerSuite({
    // ...
    'a test': function () {
      var dfd = this.async();
      var remote = this.remote;
      require([
        intern.config.propertiesFile
      ], dfd.callback(function (props) {
        return remote.get(props.url)
          .otherStuff;
      }));
    }
  });
});
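For the error-scenario runs, the properties module Intern points at could simply be a second copy of the real one with serverUrl swapped out, for example (the filename is hypothetical):

// config/properties.local.js (hypothetical): same shape as the real properties
// module, but pointing at the fake local server from the question
define({
  serverUrl: 'http://localhost:1234/someFakeServer/'
});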

How do I connect bower components with sails.js?

I'd like to be able to install Javascript dependencies through bower and use them in a sails.js app, but I can't figure out a way to do this without just copying and pasting files from the bower_components folder to the Sails assets folder.
Ideally I think I'd like to use requirejs and point to the bower components in the main.js file. I may be trying to pound a square peg in a round hole, please let me know if so. Any thoughts on managing components/libraries with Sails are welcome.
In Sails 0.10 the 'assets/linker' directory no longer exists; however, I found a lead on a simple solution that adds some automation to the bower -> asset linker workflow while also allowing some granular control over what exactly ends up getting linked.
The solution is adding grunt-bower to your Sails.js compileAssets task
grunt.registerTask('compileAssets', [
  'clean:dev',
  'bower:dev',
  'jst:dev',
  'less:dev',
  'copy:dev',
  'coffee:dev'
]);
Then configure your grunt-bower task like so (as tasks/config/bower.js):
module.exports = function(grunt) {
  grunt.config.set('bower', {
    dev: {
      dest: '.tmp/public',
      js_dest: '.tmp/public/js',
      css_dest: '.tmp/public/styles'
    }
  });

  grunt.loadNpmTasks('grunt-bower');
};
This will automatically copy your bower js and css files to the proper place for Sails' asset linker to find, and they will automatically be added to your project's layout template. Any other js or css files will still automatically be added after your bower files.
However, this is still no silver bullet, as this setup has two big caveats:
1 - The files added through grunt-bower have to be listed in the package's bower.json "main" array. If a file you expect to be loaded isn't, you must either edit that package's bower.json file OR add the dependency manually through grunt-bower's packageSpecific options (see the sketch after this list).
2 - The order of bower files in the asset linker is currently alphabetical. My only recourse for adjusting this order so far is tinkering with an additional grunt task that manually re-orders files before the rest of Sails' compileAssets task runs. That said, I'm confident grunt-bower could address this by supporting package copy ordering.
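As a rough illustration of that first caveat, grunt-bower's packageSpecific option lets you list the files to copy for a package whose bower.json "main" is incomplete. A sketch only (the package name and file paths are placeholders; check grunt-bower's README for the exact option shape):

// tasks/config/bower.js (sketch)
module.exports = function(grunt) {
  grunt.config.set('bower', {
    dev: {
      dest: '.tmp/public',
      js_dest: '.tmp/public/js',
      css_dest: '.tmp/public/styles',
      options: {
        packageSpecific: {
          // explicitly list the files to copy for this package
          'some-package': {
            files: ['dist/some-package.js', 'dist/some-package.css']
          }
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-bower');
};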
Note: the following answer is no longer completely relevant to the current version of SailsJS because there is no support for the linker folder as of SailsJS 0.10.
See: Sails not generating linker
Original answer:
I was able to figure out a solution for this, which is actually pretty simple. I had not realized you could configure where bower places its files.
Create a .bowerrc file and change the directory where bower components are installed; in the case of SailsJS they should be put into the assets folder.
/*
 * Create a file called .bowerrc and put the following in it.
 * This file should be in the root directory of the sails app.
 */
{
  "directory": "assets/linker/bower_components"
}
Sails will then use grunt to copy them to the .tmp/public/assets folder whenever a file is changed. If you don't wish to have sails continually deleting and then recopying those files, you can exclude them in the Gruntfile.
/*
 * This is not necessary, but if you have a lot of components and don't want
 * them constantly being deleted and copied at every file change you can update
 * your Gruntfile.js with the below.
 */
clean: {
  dev: ['.tmp/public/**',
        '!.tmp/public',
        '!.tmp/public/bower_components/**'],
  build: ['www']
},
One tip on using requirejs with sails: by default you will get an error from the socket.io file, since Sails loads it without using requirejs while socket.io supports AMD-style loading; more details here: http://requirejs.org/docs/errors.html#mismatch.
The simplest way to fix this is to just comment out the lines near the end of socket.io.js.
/*
 * Comment the below out in the file assets/js/socket.io.js, if using requirejs
 * and you don't want to modify the default sails setup or socket.io.
 */
if (typeof define === "function" && define.amd) {
  define([], function () { return io; });
}
The other way would be to recode the Sails files in assets/js ("socket.io.js", "sails.io.js", and "app.js") as AMD modules.
The simplest way I've found of achieving this is simply to add the individual Bower components to your tasks/pipeline.js file. For example, here's how you might add Modernizr:
// Client-side javascript files to inject in order
// (uses Grunt-style wildcard/glob/splat expressions)
var jsFilesToInject = [

  // Load sails.io before everything else
  'js/dependencies/sails.io.js',

  // Dependencies like jQuery, or Angular are brought in here
  'js/dependencies/**/*.js',

  // =========================================================
  // Modernizr:
  'bower_components/modernizr/modernizr.js',
  // =========================================================

  // All of the rest of your client-side js files
  // will be injected here in no particular order.
  'js/**/*.js'
];
A link to bower_components/modernizr/modernizr.js then gets inserted in the <!--SCRIPTS--> placeholder in your layout template, and it also gets minified, etc, when you run sails lift --prod.
Sorry for my late reply.
I think including bower_components in the linker is a bad idea, because when you lift Sails, everything in it will be copied to .tmp and included in script tags.
For example, if you have installed Angular with bower, you will end up with two inclusions in your page: one for angular.js and one for angular.min.js. Having both is a mistake.
Here is the solution I use on my projects:
I created a bower_components folder in my root directory.
I added this line to the files array of copy:dev in my Gruntfile.js (see the sketch below):
{ '.tmp/public/linker/js/angular.js': './bower_components/angular/angular.js' }
And for each file I want to include, I need to manually add a new line to this array. I haven't found another automatic solution. If someone finds a better one... ;)
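As a sketch of where that mapping lives (the surrounding defaults are approximate and may differ in your generated Gruntfile), using grunt's src/dest file format:

// copy:dev in Gruntfile.js (or tasks/config/copy.js in newer Sails versions)
copy: {
  dev: {
    files: [
      // Sails' default copy of assets/ into .tmp/public
      { expand: true, cwd: './assets', src: ['**/*'], dest: '.tmp/public' },
      // one manual mapping per bower file you want linked
      { src: './bower_components/angular/angular.js', dest: '.tmp/public/linker/js/angular.js' }
    ]
  }
},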
There is more than one approach to hooking up SailsJS and Bower.
A package for SailsJS that integrates Bower via a custom generator exists here:
https://www.npmjs.com/package/sails-generate-bower
There is one for Gulp as well.

Is there a way to lazily set the path of a resource with RequireJS?

So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";

requirejs.config({
  baseUrl: "/",
  paths: {
    app: "scripts/appapp",
    "socket.io": socketIoPath
  }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, the build runs great, but I can't run the raw (unbuilt) app from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with three options:
Use almond along with a local copy of the file in your build.
Use the full require.js loader and try to use a CDN.
Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty: scheme" in your build config (but not in your main config).
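For reference, the "empty:" scheme mentioned above looks roughly like this in an r.js build config (the name and out values are placeholders; mainConfigFile matches the file from the error above):

// build config passed to r.js, not your runtime requirejs.config()
({
  mainConfigFile: 'client.js',
  name: 'app',
  out: 'dist/app.built.js',
  paths: {
    // tell the optimizer to skip this module; it is resolved at runtime instead
    'socket.io': 'empty:'
  }
})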
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
  var io = window.io;
  window.io = null;
  return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
