Webpack replace module runtime - javascript

I have a rather complex scenario.
We are building a desktop application with React, wrapped with Electron; Webpack takes care of the Babel transpilation and chunking.
The application receives configuration data from a CMS.
Part of the configuration may be a JavaScript class that needs to override one that resides in the application. The JS code specified in the CMS will be vanilla JavaScript (ES6/7/8, the same as what we use for the application).
I see 2 problems here:
How to transpile just this one class, and
How to replace it at runtime in the application.
Is this even possible?
Regards

If by "The application receives configuration data from a CMS" you mean runtime data, then Webpack cannot help you transpile/replace your code, because Webpack acts at compile time (runtime vs. compile time).
If your CMS data can be fetched at compile time, note that you can return a promise from webpack.config.js:
const webpack = require('webpack');
const CMS = require('./cms-client'); // hypothetical module that knows how to fetch the CMS config

module.exports = function webpackConfig(env) {
  const configs = {
    context: __dirname,
    plugins: []
    // etc...
  };

  return CMS
    .fetchConfig()
    .then(cmsConfigs => {
      const vars = {
        replaceClass: JSON.stringify(cmsConfigs.classINeed.toString())
      };
      configs.plugins.push(new webpack.DefinePlugin(vars));
      return configs;
    });
};
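On the application side, a minimal sketch of how the injected source could be turned back into a usable class at runtime; the new Function evaluation and the BaseClass fallback are assumptions on my part, not something prescribed by the answer above:

// replaceClass is substituted by webpack.DefinePlugin at build time with the stringified class source
let EffectiveClass;
try {
  // evaluating the injected source is one (hedged) way to recover a constructor
  EffectiveClass = new Function(`return (${replaceClass})`)();
} catch (e) {
  EffectiveClass = BaseClass; // hypothetical default class shipped with the application
}
const instance = new EffectiveClass();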

Related

How can I switch between different environments using the new cypress.config.js in Cypress 10.x?

I used to have json files under my config folder containing variables for different environments. For example:
local.env.json would contain:
{
  "baseUrl": "localhost:8080"
}
then another one called uat.env.json would contain:
{
  "baseUrl": "https://uat.test.com"
}
and it's configured in my plugins/index.ts as:
const version = config.env.version || 'uat'; // if version is not defined, default to this stable environment
config.env = require(`../../config/${version}.env.json`); // load env from json
I then call it in my tests with cy.visit(Cypress.env().baseUrl) and pass it on the CI with CYPRESS_VERSION=uat npx cypress run.
However, with the new Cypress 10.x version, the plugins file has been deprecated and everything relies on cypress.config.js. I can't find any example in their documentation of how this can be done (I remember they used to have a page with these scenarios but can't find it now).
It's possible to use the old plugins/index.ts in the new cypress.config.ts by importing it.
This is the simplest example (with no new config in cypress.config.ts)
import { defineConfig } from 'cypress'
import legacyConfig from './cypress/plugins/index.js'

export default defineConfig({
  e2e: {
    baseUrl: 'http://localhost:1234',
    setupNodeEvents(on, config) {
      return legacyConfig(on, config) // call legacy config fn and return result
    }
  }
})
TypeScript and module interop may give you grief about typings, etc. I've not tried it in a TypeScript project, but it does work in a JavaScript project.
Alternatively, copy/paste everything from plugins/index.ts instead of calling the legacy function. This might be better as you add more plugins in the future.
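For reference, a minimal sketch of inlining the old plugins logic directly into setupNodeEvents instead of calling the legacy file; it assumes the config/<env>.env.json files from the question sit next to cypress.config.js, so the require path is an assumption:

// cypress.config.js - sketch only, CommonJS style
const { defineConfig } = require('cypress')

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      const version = config.env.version || 'uat' // default to the stable environment
      config.env = require(`./config/${version}.env.json`) // load env from JSON; path assumed
      return config
    }
  }
})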

Application modularity with Vue.js and local NPM packages

I'm trying to build a modular application in Vue via the vue-cli-service. The main app and the modules are separated projects living in different folders, the structure is something like this:
-- app/package.json
       /src/**
-- module1/package.json
           /src/**
-- module2/package.json
           /src/**
The idea is to have the Vue app completely agnostic about the application modules that can be there at runtime; the modules themselves are compiled with vue-cli-service build --target lib into a local moduleX/dist folder, referenced by the package.json "main" and "files" fields.
My first idea (for now just for development speed purposes) was to add the modules as local NPM packages to the app, building them with a watcher and serving the app with a watcher itself, so that any change to the dependent modules would (I think) be propagated automatically to the main app.
So the package.json of the app contains dependencies like:
...
"module1": "file:../module1",
"module2": "file:../module2",
...
These dependencies are meant to be removed at any time, or in general composed as we need; the app should just be recompiled and everything should work.
I'm trying to understand now how to dynamically load and activate the modules in the application, as I cannot use the dynamic import like this:
import(/* webpackMode: "eager" */ `module1`).then(src => {
  src.default.boot();
  resolve();
});
Because I basically don't know 'module1', 'module2', etc. in advance.
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
There's a way to accomplish this?
Juggling with package.json doesn't sound like a good idea to me - doesn't scale. What I would do:
Keep all available "modules" in package.json
Create a separate JS file (or a dedicated prop inside package.json) with all available configurations (for different clients, for example):
module.exports = {
  'default': ['module1', 'module2', 'module3'],
  'clientA': ['module1', 'module2', 'module4'],
  'clientB': ['module2', 'module3', 'module4']
}
Tap into the Vue CLI build process (the best example I found is here) and create a JS file that runs before each build (or "serve") and, using a simple template engine (for example lodash), generates a new JS file that boots the configured modules based on the value of some ENV variable. See the following (pseudo)code (remember this runs inside Node during the build):
const fs = require('fs')
const _ = require('lodash')
const modulesConfig = require(`your module config js`)

const configurationName = process.env.MY_APP_CONFIGURATION ?? 'default'
const modules = modulesConfig[configurationName]

const template = fs.readFileSync('name of template file', 'utf8')
const templateCompiled = _.template(template)
const generatedJS = templateCompiled({ modules: modules })

fs.writeFileSync('bootModules.js', generatedJS)
Write your template for bootModules.js. The simplest would be:
<% _.forEach(modules, function(module) { %>import * as <%= module %> from '<%= module %>'; <% }); %>
import bootModules.js into your app
Use MY_APP_CONFIGURATION ENV variable to switch desired module configuration - works not just during development but you can also setup different CI processes targeting same repo with just different MY_APP_CONFIGURATION values
This way you have all configurations at one place, you don't need to change package.json before every build, you have simple mechanism to switch between different module configurations and every build (bundle) contains only the modules needed....
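For illustration, a hedged sketch of what a generated bootModules.js could look like if the template is extended to also boot each module; the default.boot() shape comes from the question's dynamic-import snippet, so treat the exact interface as an assumption:

// bootModules.js - hypothetical output for MY_APP_CONFIGURATION=default
import * as module1 from 'module1'
import * as module2 from 'module2'
import * as module3 from 'module3'

// boot every configured module; the app entry point then just does `import './bootModules'`
const configuredModules = [module1, module2, module3]
configuredModules.forEach(m => m.default && m.default.boot())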
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Why not?
More than this, with JS/TS you are not restricted to using classes implementing a specific interface: you just need to define the interface (i.e. the module.exports) of your modules and respect it in the library entries (vue build lib).
EDIT: reading the comments, I think I now understand the request.
Each module should respect the following interface (in the file which is the entry of the Vue library):
export function isMyAppModule() {
  return true;
}

export function myAppInit() {
  return { /* what you need to export */ };
}
Then in your app:
// dependencies in package.json is an object, so iterate its keys
Object.keys(require("./package.json").dependencies).forEach(name => {
  const module = require(name);
  if (!module.isMyAppModule || module.isMyAppModule() !== true) return;
  const { /* the refs you need */ } = module.myAppInit();
  // use your refs as you need
});

How to use google-closure-compiler-js for a node.js app without gulp/grunt/webpack?

The docs don't have any examples of using this on its own but they do say this:
Unless you're using the Gulp or Webpack plugins, you'll need to specify code via flags. Both jsCode and externs accept an array containing objects in the form {src, path, sourceMap}. Using path, you can construct a virtual filesystem for use with ES6 or CommonJS imports—although for CommonJS, be sure to set processCommonJsModules: true.
I've created a "compile.js" file based on the docs:
const compile = require('google-closure-compiler-js').compile;

const flags = {
  jsCode: [{ path: './server/server.js' }],
  processCommonJsModules: true
};

const out = compile(flags);
console.info(out.compiledCode);
In my "./server/server.js" file, I put a console.log but it doesn't output. Not sure where to go from here...
Borrowing from icidasset/quotes.
It appears, to me, that path is not intended to be used as you are using it.
Quote:
Using path, you can construct a virtual filesystem for use with ES6 or CommonJS imports—although for CommonJS, be sure to set processCommonJsModules: true.
So instead you must expand your own sources, something webpack and gulp must be doing for you when you go that route.
const fs = require('fs');
const compile = require('google-closure-compiler-js').compile;

const files = ['./server/server.js'];

const outputs = files.map(f => {
  const out = compile({
    jsCode: [{ src: fs.readFileSync(f, 'utf8') }], // pass the expanded source, not just a path
    assumeFunctionWrapper: true,
    languageIn: 'ECMASCRIPT5'
  });
  return out;
});

Different settings for debug/local ("grunt serve") vs. dist/build ("grunt")?

I want to define some application settings, but I want to provide different values depending on whether I'm running in 'debug' mode (e.g. grunt serve), or whether the final compiled app is running (e.g. the output of grunt). That is, something like:
angular.module('myApp').factory('AppSettings', function() {
  if (DebugMode()) { // ??
    return { apiPort: 12345 };
  } else {
    return { apiPort: 8008 };
  }
});
How can I accomplish this?
The way I handle it in my apps:
move all your config data for one environment to a file: config.js, config.json,... whatever your app finds easy to read.
now modify your config file to turn it into a template using grunt config values, and generate the file with grunt-template as part of your build - for example: app.constant('myAppConfig', {bananaHammocks: <%= banana.hammocks %>}); (see the sketch after this list)
finally, add grunt-stage to switch grunt config values depending on environment: create your different config/secret/(env).json files, update your template (app.constant('myAppConfig', {bananaHammocks: <%= stg.banana.hammocks %>});), and then grunt stage:local:build or grunt stage:prod:build
I find this a good balance between complexity and features (separation between environments, runtime code not concerned with build options, ...)
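A minimal Gruntfile sketch of the grunt-template step; it assumes grunt-template is installed, and the file names and data values are hypothetical:

// Gruntfile.js (excerpt) - generate the Angular config module from a template
module.exports = function (grunt) {
  grunt.loadNpmTasks('grunt-template');

  grunt.initConfig({
    template: {
      appConfig: {
        options: {
          // values interpolated by the template, e.g. <%= banana.hammocks %>
          data: { banana: { hammocks: 42 } }
        },
        files: {
          // destination: source template containing the app.constant(...) line
          'app/scripts/config.js': ['config/config.tpl.js']
        }
      }
    }
  });
};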

JavaScript require() on client side

Is it possible to use require() (or something similar) on client side?
Example
var myClass = require('./js/myclass.js');
You should look into require.js or head.js for this.
I've been using browserify for that. It also lets me integrate Node.js modules into my client-side code.
I blogged about it here: Add node.js/CommonJS style require() to client-side JavaScript with browserify
If you want to have Node.js style require you can use something like this:
var require = (function () {
  var cache = {};

  function loadScript(url) {
    var xhr = new XMLHttpRequest(),
        fnBody;
    xhr.open('get', url, false);
    xhr.send();
    if (xhr.status === 200 && xhr.getResponseHeader('Content-Type') === 'application/x-javascript') {
      fnBody = 'var exports = {};\n' + xhr.responseText + '\nreturn exports;';
      cache[url] = (new Function(fnBody)).call({});
    }
  }

  function resolve(module) {
    //TODO resolve urls
    return module;
  }

  function require(module) {
    var url = resolve(module);
    if (!Object.prototype.hasOwnProperty.call(cache, url)) {
      loadScript(url);
    }
    return cache[url];
  }

  require.cache = cache;
  require.resolve = resolve;

  return require;
}());
Beware: this code works but is incomplete (especially url resolving) and does not implement all Node.js features (I just put this together last night).
YOU SHOULD NOT USE THIS CODE in real apps but it gives you a starting point. I tested it with this simple module and it works:
function hello() {
  console.log('Hello world!');
}

exports.hello = hello;
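A usage sketch of the shim with that module, assuming the file is served at ./js/hello.js with the Content-Type the loader checks for:

var hello = require('./js/hello.js');
hello.hello(); // logs "Hello world!"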
I asked myself the very same question. When I looked into it I found the choices overwhelming.
Fortunately I found this excellent spreadsheet that helps you choose the best loader based on your requirements:
https://spreadsheets.google.com/lv?key=tDdcrv9wNQRCNCRCflWxhYQ
Take a look at requirejs project.
I have found that in general it is recommended to preprocess scripts at compile time and bundle them in one (or very few) packages with the require being rewritten to some "lightweight shim" also at compile time.
I've Googled up the following "new" tools that should be able to do it:
http://mixu.net/gluejs/
https://github.com/jrburke/almond
https://github.com/component/builder2.js
And the already mentioned browserify should also fit quite well - http://esa-matti.suuronen.org/blog/2013/04/15/asynchronous-module-loading-with-browserify/
What are the module systems all about?
Older Stack Overflow explanation - Relation between CommonJS, AMD and RequireJS?
Detailed discussion of various module frameworks and the require() they need is in Addy Osmani - Writing Modular JavaScript With AMD, CommonJS & ES Harmony
You can create script elements in the DOM, which load the scripts.
Like so:
var myScript = document.createElement('script'); // Create new script element
myScript.type = 'text/javascript';               // Set appropriate type
myScript.src = './js/myclass.js';                // Load javascript file
document.head.appendChild(myScript);             // Append it so the browser actually fetches and runs it
Simply use Browserify, which is something like a compiler that processes your files before they go into production and packs them into bundles.
Say you have a main.js file that requires the files of your project; when you run Browserify on it, it processes everything and creates a bundle with all your files, allowing require calls to be used synchronously in the browser without extra HTTP requests and with very little overhead in performance and bundle size.
See the link for more info: http://browserify.org/
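A minimal sketch of that workflow; the file names are illustrative and it assumes Browserify is installed (npm install -g browserify):

// main.js - entry point using Node-style require
var myClass = require('./js/myclass.js');
myClass.doSomething(); // hypothetical method on the required module

// build the bundle from the command line:
//   browserify main.js -o bundle.js
// then include it in the page:
//   <script src="bundle.js"></script>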
Some answers already - but I would like to point you to YUI3 and its on-demand module loading. It works on both server (node.js) and client, too - I have a demo website using the exact same JS code running on either client or server to build the pages, but that's another topic.
YUI3: http://developer.yahoo.com/yui/3/
Videos: http://developer.yahoo.com/yui/theater/
Example:
(precondition: the basic YUI3 functions in 7k yui.js have been loaded)
YUI({
  //configuration for the loader
}).use('node', 'io', 'own-app-module1', function (Y) {
  //sandboxed application code
  //...

  //If you already have a "Y" instance you can use that instead
  //of creating a new (sandbox) Y:
  //   Y.use('moduleX', 'moduleY', function (Y) {
  //   });
  //difference to YUI().use(): uses the existing "Y"-sandbox
});
This code loads the YUI3 modules "node" and "io" and the module "own-app-module1", and then the callback function is run. A new sandbox "Y" with all the YUI3 and own-app-module1 functions is created. Nothing appears in the global namespace. The loading of the modules (.js files) is handled by the YUI3 loader. It also uses (optional, not shown here) configuration to select a -debug or -min(ified) version of the modules to load.
Here's a solution that takes a very different approach: package up all the modules into a JSON object and require modules by reading and executing the file content without additional requests.
https://github.com/STRd6/require/blob/master/main.coffee.md
STRd6/require depends on having a JSON package available at runtime. The require function is generated for that package. The package contains all the files your app could require. No further http requests are made because the package bundles all dependencies. This is as close as one can get to the Node.js style require on the client.
The structure of the package is as follows:
entryPoint: "main"
distribution:
main:
content: "alert(\"It worked!\")"
...
dependencies:
<name>: <a package>
Unlike Node, a package doesn't know its external name. It is up to the package including the dependency to name it. This provides complete encapsulation.
Given all that setup here's a function that loads a file from within a package:
loadModule = (pkg, path) ->
unless (file = pkg.distribution[path])
throw "Could not find file at #{path} in #{pkg.name}"
program = file.content
dirname = path.split(fileSeparator)[0...-1].join(fileSeparator)
module =
path: dirname
exports: {}
context =
require: generateRequireFn(pkg, module)
global: global
module: module
exports: module.exports
PACKAGE: pkg
__filename: path
__dirname: dirname
args = Object.keys(context)
values = args.map (name) -> context[name]
Function(args..., program).apply(module, values)
return module
This external context provides some variables that modules have access to. A require function is exposed to modules so they may require other modules. Additional properties such as a reference to the global object and some metadata are also exposed. Finally, we execute the program within the module and the given context.
This answer will be most helpful to those who wish to have a synchronous node.js style require statement in the browser and are not interested in remote script loading solutions.
I find the component project gives a much more streamlined workflow than other solutions (including require.js), so I'd advise checking out https://github.com/component/component. I know this is a bit of a late answer but it may be useful to someone.
Here's a lightweight way to use require and exports in your web client. It's a simple wrapper that creates a "namespace" global variable, and you wrap your CommonJS-compatible code in a "define" function like this:
namespace.lookup('org.mydomain.mymodule').define(function (exports, require) {
  var extern = require('org.other.module');

  exports.foo = function foo() { ... };
});
More docs here:
https://github.com/mckoss/namespace
The clientside-require library provides an asynchronous load() function that can be used to load any JS file or NPM module (which uses module.exports), any .css file, any .json, any .html, and any other file as text.
e.g.,
npm install clientside-require --save
<script src = '/node_modules/clientside-require/dist/bundle.js'></script>
<script>
load('color-name') // an npm module
.then(color_name=>{
console.log(color_name.blue); // outputs [0, 0, 255]
})
</script>
A really cool part of this project is that inside of any load()ed script, you can use the synchronous require() function the same way you would expect in node.js!
e.g.,
load('/path/to/functionality.js')
and inside /path/to/functionality.js:
var qs = require("qs") // an npm module

module.exports = function(name){
  return qs.stringify({
    name: name,
    time: new Date()
  })
}
That last part, implementing the synchronous require() method, is what enables it to utilize NPM packages built to run on the server.
This module was designed to implement the require functionality as closely as possible in the browser. Disclaimer: I have written this module.
Yes, it is very easy to use, but you need to load the JavaScript file in the browser with a script tag:
<script src="module.js"></script>
and then use it in a JS file like:
var module = require('./module');
I am making an app using Electron and it works as expected.
