How to get the React Native build variant in Metro? - javascript

I am developing a React Native application that includes different configurations for different possible clients, in a file such as src/config/config.js. These configurations are quite complex. The file is keyed by client name, with the options as object entries, e.g.:
export default {
  fooClient: {
    apiUrl: "https://foo.example.com/"
  },
  barClient: {
    apiUrl: "https://bar.example.com/"
  }
};
Of course, there are many other option keys.
When building the app, I know which client I am building for, because I specify an Android build variant, e.g.:
ENVFILE=.env npx react-native run-android --variant fooDebug --appIdSuffix foo
For security reasons, I don't want keys of other clients to be included in the config file though. What are my options to remove all other client configs from this file before I build the app and ship it to a client?
I thought about the following: I modify the packager so that it strips out the keys that do not correspond to the current build variant.
I now have a transformer plugin for Metro that does the following:
const upstreamTransformer = require('metro-react-native-babel-transformer');

module.exports.transform = function (src, filename, options) {
  if (typeof src === 'object') {
    // handle RN >= 0.46, where a single object is passed instead of three arguments
    ({ src, filename, options } = src);
  }
  if (filename.endsWith('config.js')) {
    console.log('Transforming ' + filename);
    // NOTE: this assumes the file body is exactly "export default <valid JSON>;"
    // (quoted keys, single trailing semicolon)
    let srcStripped = src.replace(';', '').replace('export default ', '');
    let configObj = JSON.parse(srcStripped);
    // TODO: get the build variant and strip all keys that we do not need from configObj
    return upstreamTransformer.transform({
      src: 'export default ' + JSON.stringify(configObj) + ';',
      filename: filename,
      options
    });
  } else {
    return upstreamTransformer.transform({ src, filename, options });
  }
};
But how do I know which build variant is being used?
If this seems like an XY problem, I am happy to explore alternatives to building the configuration dynamically. I cannot, however, use environment variables, since the configuration will be too complex for it to be just a list of .env keys.

You shouldn't use a Metro transform this way. It's not clean, and sooner or later it may lead to missing configuration and/or broken syntax.
What I have done, and what I suggest, is to create three different configuration files under src/config/: one file for fooClient.js, one for barClient.js, and one last file, client.js, with the common configuration. All files export default configuration objects, but inside fooClient.js and barClient.js you use the deepmerge module to merge in the client.js config:
client.js:

export default {
  commonSettingA: "...",
  commonSettings: {
    ...
  }
  ...
}
fooClient.js:

import merge from 'deepmerge';
import config from './client';

export default merge.all([
  config,
  {
    apiUrl: "https://foo.example.com/",
  }
]);
barClient.js:

import merge from 'deepmerge';
import config from './client';

export default merge.all([
  config,
  {
    apiUrl: "https://bar.example.com/",
  }
]);
Then you can use an environment variable to pass the needed configuration and create an appropriate Metro resolver; @react-native-community/cli does not pass command-line arguments to the Metro config script. You could parse process.argv yourself, but it's not worth it.
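For completeness, that manual parsing could look roughly like this (a sketch only; the fallback value is an assumption, and as said above it's not worth the trouble):

// metro.config.js - rough sketch of parsing --variant out of process.argv
const variantIndex = process.argv.indexOf('--variant');
const variant = variantIndex !== -1
  ? process.argv[variantIndex + 1] // e.g. "fooDebug"
  : 'fooDebug';                    // assumed fallback
const client = variant.replace(/(Debug|Release)$/, ''); // "fooDebug" -> "foo"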
Here is how you can create a resolver inside metro.config.js using the environment variable:
const path = require("path");

module.exports = {
  projectRoot: path.resolve(__dirname),
  resolver: {
    sourceExts: ['js', 'jsx', 'ts', 'tsx'],
    extraNodeModules: {
      // Local aliases
      "#config": path.resolve(__dirname, "src/config", `${process.env.METRO_VARIANT}Client.js`)
    }
  }
};
With this resolver in place, you have to import the configuration like this:
import config from '#config';
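For illustration, consuming code then stays identical across variants; the endpoint name below is hypothetical:

import config from '#config';

// Resolves against fooClient.js or barClient.js depending on METRO_VARIANT
fetch(`${config.apiUrl}status`);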
Then you add two package.json scripts, one for fooClient and one for barClient:
{
  ...
  "scripts": {
    "run:android:foo": "METRO_VARIANT=foo ENVFILE=.env npx react-native run-android --variant fooDebug --appIdSuffix foo",
    "run:android:bar": "METRO_VARIANT=bar ENVFILE=.env npx react-native run-android --variant barDebug --appIdSuffix bar",
    ...
  }
  ...
}
Then you just run the needed script:
yarn run:android:foo # will build with fooClient.js
yarn run:android:bar # will build with barClient.js

Can't you just add a "duplicated" parameter to the command? Like so:

METRO_VARIANT=fooDebug ENVFILE=.env npx react-native run-android --variant fooDebug --appIdSuffix foo

And read it with process.env.METRO_VARIANT.
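A sketch of how that duplicated variable could then drive the stripping inside the question's transformer (the suffix handling and the <client>Client key naming are assumptions based on the question):

// Inside the transform for config.js, keep only the current client's key
const variant = process.env.METRO_VARIANT || '';        // e.g. "fooDebug"
const client = variant.replace(/(Debug|Release)$/, ''); // -> "foo"
const key = `${client}Client`;                          // -> "fooClient"
configObj = { [key]: configObj[key] };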
You might not want to make your setup much more complex. Depending on the number of different clients, you might want to go with multiple schemas/variants per client, but that is probably way overkill.

Related

How can I switch between different environments using the new cypress.config.js in Cypress 10.x?

I used to have json files under my config folder containing variables for different environments. For example:
local.env.json would contain:
{
  "baseUrl": "localhost:8080"
}
then another one called uat.env.json would contain:
{
  "baseUrl": "https://uat.test.com"
}
and it's configured in my plugins/index.ts as:
const version = config.env.version || 'uat'; // if version is not defined, default to this stable environment
config.env = require(`../../config/${version}.env.json`); // load env from json
I will then call it in my tests with cy.visit(Cypress.env().baseUrl) and pass it on the CI with CYPRESS_VERSION=uat npx cypress run.
However, with the new Cypress 10.x version, the plugins file has been deprecated and everything relies on cypress.config.js. I can't find any example in their documentation of how this can be done (I remember they used to have a page with these scenarios but can't find it now).
It's possible to use the old plugins/index.ts in the new cypress.config.ts by importing it.
This is the simplest example (with no new config in cypress.config.ts)
import { defineConfig } from 'cypress'
import legacyConfig from './cypress/plugins/index.js'

export default defineConfig({
  e2e: {
    baseUrl: 'http://localhost:1234',
    setupNodeEvents(on, config) {
      return legacyConfig(on, config) // call legacy config fn and return result
    }
  }
})
TypeScript and module interop may give you grief about typings etc. I've not tried it in a TypeScript project, but it does work in a JavaScript project.
Alternatively, copy/paste everything from plugins/index.ts into setupNodeEvents instead of calling the legacy function. This might be better as you add more plugins in the future.
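For example, the env-switching logic from the question could be inlined like this (a sketch; note the relative path changes, since cypress.config.ts lives in the project root rather than in cypress/plugins/):

import { defineConfig } from 'cypress'

export default defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      // same logic the old plugins/index.ts performed
      const version = config.env.version || 'uat'
      config.env = require(`./config/${version}.env.json`)
      return config
    },
  },
})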

Application modularity with Vue.js and local NPM packages

I'm trying to build a modular application in Vue via the vue-cli-service. The main app and the modules are separate projects living in different folders; the structure is something like this:
app/
  package.json
  src/**
module1/
  package.json
  src/**
module2/
  package.json
  src/**
The idea is to have the Vue app completely agnostic about the application modules that can be there at runtime. The modules themselves are compiled with vue-cli-service build --target lib into a local moduleX/dist folder, referenced by the package.json "main" and "files" fields.
My first idea (for now just for development speed) was to add the modules as local NPM packages to the app, building them with a watcher and serving the app with a watcher itself, so that any change to the dependent modules would (I think) be picked up automatically by the main app.
So the package.json of the app contains dependencies like:
...
"module1": "file:../module1",
"module2": "file:../module2",
...
These dependencies are meant to be removable at any time, or in general composed as we need; the app should just be recompiled and everything should work.
I'm now trying to understand how to dynamically load and activate the modules in the application, as I cannot use dynamic import like this:
import(/* webpackMode: "eager" */ `module1`).then(src => {
  src.default.boot();
  resolve();
});
Because basically I don't know 'module1', 'module2', etc. in advance...
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Is there a way to accomplish this?
Juggling with package.json doesn't sound like a good idea to me - doesn't scale. What I would do:
Keep all available "modules" in package.json
Create a separate js file (or a dedicated prop inside package.json) with all available configurations (for different clients, for example):
module.exports = {
  'default': ['module1', 'module2', 'module3'],
  'clientA': ['module1', 'module2', 'module4'],
  'clientB': ['module2', 'module3', 'module4']
}
Tap into the VueCLI build process (the best example I found is here) and create a js file which runs before each build (or "serve") and, using a simple template engine (for example lodash), generates a new js file which boots the configured modules based on the value of some ENV variable. See the following (pseudo)code (remember this runs inside Node during the build):
const fs = require('fs')
const _ = require('lodash')

const modulesConfig = require('./modules.config.js') // your module config js
const configurationName = process.env.MY_APP_CONFIGURATION ?? 'default'
const modules = modulesConfig[configurationName]

const template = fs.readFileSync('bootModules.template', 'utf8') // name of your template file
const templateCompiled = _.template(template)
const generatedJS = templateCompiled({ modules })

fs.writeFileSync('bootModules.js', generatedJS)
Write your template for bootModules.js. Simplest would be:
<% _.forEach(modules, function(module) { %>import * as <%= module %> from '<%= module %>';<% }); %>
import bootModules.js into your app
Use the MY_APP_CONFIGURATION ENV variable to switch to the desired module configuration - this works not just during development, but you can also set up different CI processes targeting the same repo with just different MY_APP_CONFIGURATION values.
This way you have all configurations in one place, you don't need to change package.json before every build, you have a simple mechanism to switch between different module configurations, and every build (bundle) contains only the modules needed.
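For instance, with MY_APP_CONFIGURATION=clientA and the template above, the generated file would come out as:

// bootModules.js (generated at build time)
import * as module1 from 'module1';
import * as module2 from 'module2';
import * as module4 from 'module4';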
In an OOP world I would just use dependency injection retrieving classes implementing a specific interface, but in JS/TS I'm not sure it is viable.
Why not?
More than that, with JS/TS you are not restricted to classes implementing a specific interface: you just need to define the interface (i.e. the module.exports) of your modules and respect it in the library entries (vue build lib).
EDIT: after reading the comments, I think I understand the request.
Each module should respect the following interface (in the file which is the entry of the Vue library):
export function isMyAppModule() {
  return true;
}

export function myAppInit() {
  return { /* what you need to export */ };
}
Then in your app (note that package.json's dependencies is an object, so we iterate over its keys):

Object.keys(require("./package.json").dependencies).forEach(name => {
  const module = require(name);
  if (!module.isMyAppModule || module.isMyAppModule() !== true) return;
  const { /* the refs you need */ } = module.myAppInit();
  // use your refs as you need
});

Compile template files/stubs into JS variables with Laravel mix

I have a folder /resources/js/stubs. In that folder sit a few files, let's say User.stub, Controller.stub and Migration.stub. I would like to use the content of these files in my javascript by doing something like this:
import Templates from './Templates.js'
console.log(Templates.User)
// The content of User.stub is printed
I don't want to use this approach
module.exports = name => `
...
`
I could also use the backend to load the files into the view but I would prefer not to.
So then that requires preprocessing. Can I do it with Laravel mix somehow? If not what are my options, what do I need to add to my Laravel app?
partial solution
Thanks to Diogo Sgrillo for pointing out this solution in the comment.
Install raw-loader
yarn add raw-loader --dev
webpack.mix.js
Add this configuration (In my case all the files will be named with a .stub extension.):
mix.webpackConfig({
  module: {
    rules: [
      {
        test: /\.stub$/i,
        use: 'raw-loader',
      },
    ],
  },
});
Also add a separate pipe in webpack.mix.js like this:
mix.js('src/templates/index.js', 'src/templates.js')
It will compile the list of templates built in the index.js file and put them in templates.js.
src/templates/index.js
// Dynamically load all stubs using raw-loader (see webpack.mix.js)
let stubs = require.context('./', true, /\.stub$/i);

// Create an object with the filename as key and content as value
exports.stubs = stubs.keys().reduce((result, key) => {
  return {
    [key.replace(/\.\//, '').replace(/\.stub$/, '')]: stubs(key).default,
    ...result
  }
}, {});

export default { /* dont remove this export default or it will break !?? */ }
Later it can be used like this:
let templates = require('./templates.js')
console.log(templates['User.php'])
Please comment or add another answer on how to do this more smoothly. What is the difference between exports/export? I can't use import with this method, only require, and it breaks if I try to export default (or remove the export default).
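A note on that last question: exports.stubs = ... is CommonJS syntax, while export default is ES-module syntax, and webpack treats a file as one module type or the other, which is likely why mixing them here is so fragile. A sketch of an ES-module-only version (untested assumption: require.context remains usable, since webpack resolves it statically at build time):

// src/templates/index.js
const stubs = require.context('./', true, /\.stub$/i);

const templates = stubs.keys().reduce((result, key) => ({
  [key.replace(/^\.\//, '').replace(/\.stub$/, '')]: stubs(key).default,
  ...result,
}), {});

export default templates;

With this you could use import templates from './templates.js' instead of require, and read entries like templates['User.php'] as before.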

Setting environment variables in Gatsby

I used this tutorial: https://github.com/gatsbyjs/gatsby/blob/master/docs/docs/environment-variables.md
Steps I followed:
1) install dotenv#4.0.0
2) Create two files in root folder: ".env.development" and ".env.production"
3) "follow their setup instructions" (example on dotenv npm docs)
In gatsby-config.js:
const fs = require('fs');
const dotenv = require('dotenv');

const envConfig = dotenv.parse(fs.readFileSync(`.env.${process.env.NODE_ENV}`));

for (var k in envConfig) {
  process.env[k] = envConfig[k];
}
Unfortunately, when I run gatsby develop, NODE_ENV isn't set yet:
error Could not load gatsby-config
Error: ENOENT: no such file or directory, open 'E:\Front-End Projects\Gatsby\sebhewelt.com\.env.undefined'
It works when I set it manually:
dotenv.parse(fs.readFileSync(`.env.development`));
I need environment variables in gatsby-config because I put sensitive data in this file:
{
  resolve: `gatsby-source-contentful`,
  options: {
    spaceId: envConfig.CONTENTFUL_SPACE_ID,
    accessToken: envConfig.CONTENTFUL_ACCESS_TOKEN
  }
}
How to make it work?
PS: An additional question this made me think of - I know I shouldn't put passwords and tokens on GitHub, but as Netlify builds from GitHub, is there another safe way?
I had a similar issue: I created two files in the root, ".env.development" and ".env.production", but was still not able to access the env file variables - they were returning undefined in my gatsby-config.js file.
Got it working by npm installing dotenv and doing this:
1) When running gatsby develop, process.env.NODE_ENV was returning undefined, but when running gatsby build it was returning 'production', so I define it here:
let env = process.env.NODE_ENV || 'development';
2) Then I used dotenv, but specified the filepath based on process.env.NODE_ENV:
require('dotenv').config({path: `./.env.${env}`});
3) Then you can access your variables for your config:
module.exports = {
  siteMetadata: {
    title: `Gatsby Default Starter`,
  },
  plugins: [
    `gatsby-plugin-react-helmet`,
    {
      resolve: `gatsby-source-contentful`,
      options: {
        spaceId: `${process.env.CONTENTFUL_ID}`,
        accessToken: `${process.env.CONTENTFUL_TOKEN}`,
      },
    },
  ],
}
You should only use env files when you're comfortable checking those into git. For passwords/tokens/etc. add them to Netlify or whatever build tool you use through their dashboard.
These you can access in gatsby-config.js & gatsby-node.js via process.env.ENV_VARIABLE.
You can't access environment variables added this way in the browser however. For this you'll need to use .env.development & .env.production.
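For example (the variable name below is hypothetical), a var prefixed with GATSBY_ in .env.development gets inlined into browser code automatically:

// .env.development contains: GATSBY_API_URL=https://dev.example.com/api
const apiUrl = process.env.GATSBY_API_URL; // available in components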
I really dislike the .env.production file pattern: our build system sets up and uses env variables, and having extra build steps to write those into a file is weird. But Gatsby only whitelists env vars prefixed with GATSBY_, with no obvious way of adding your own prefix.
But doing that isn't so hard; you can do it by adding something like this in the gatsby-node.js file:
exports.onCreateWebpackConfig = ({ actions, getConfig }) => {
const config = getConfig();
// Allow process.env.MY_WHITELIST_PREFIX_* environment variables
const definePlugin = config.plugins.find(p => p.definitions);
for (const [k, v] of Object.entries(process.env)) {
if (k.startsWith("MY_WHITELIST_PREFIX_")) {
definePlugin.definitions[`process.env.${k}`] = JSON.stringify(v);
}
}
actions.replaceWebpackConfig(config);
};
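With that in place, a variable using the custom prefix (hypothetical name below) becomes visible to browser code without a GATSBY_ prefix or an .env file:

// run: MY_WHITELIST_PREFIX_API_URL=https://example.com gatsby build
const apiUrl = process.env.MY_WHITELIST_PREFIX_API_URL;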
After doing a few searches, I found that we can set environment variables through the Netlify website; here are the steps:
In your Netlify console, go to Settings
Choose the Build & deploy tab (found in the sidebar)
Choose the Environment sub-tab
Click Edit variables and add your credentials
Done!

Webpack: How can I create a loader for "webpack" which takes an array of dependencies?

For example, I use AMD definitions in my project, and use webpack for building. Is it possible to create a loader which will take dependencies in array format?
define(
  [
    'mySuperLoader![./path/dependency-1, ./path/dependency-2, ...]'
  ],
  function() {
    // ... some logic here
  }
)
Project example: gitHub
If you want to port the load-plugin's behavior to webpack, you need to do this:
1. Create a custom resolver
This is because mySuperLoader![./path/dependency-1, ./path/dependency-2, ...] does not point to a single file. When webpack tries to load a file, it first:
resolves the file path
loads the file content
matches and resolves all loaders
passes the file content to the loader chain
Since [./path/dependency-1, ./path/dependency-2, ...] is not a proper file path, there is some work to do. It is not even proper JSON.
So, our first goal is to turn this into mySuperLoader!some/random/file?["./path/dependency-1", "./path/dependency-2", ...]. This is usually done by creating a custom resolver:
// webpack.config.js
var customResolverPlugin = {
  apply: function (resolver) {
    resolver.plugin("resolve", function (context, request) {
      const matchLoadRequest = /^\[(.+)]$/.exec(request.path);
      if (matchLoadRequest) {
        request.query = '?' + JSON.stringify(
          matchLoadRequest[1].split(", ")
        );
        request.path = __filename;
      }
    });
  }
};
module.exports = {
  ...
  plugins: [
    {
      apply: function (compiler) {
        compiler.resolvers.normal.apply(customResolverPlugin);
      }
    }
  ]
};
Notice request.path = __filename;? We just need to give webpack an existing file so that it does not throw an error. We will generate all the content anyway. Probably not the most elegant solution, but it works.
2. Create our own load-loader (yeah!)
// loadLoader.js
const path = require("path");

function loadLoader() {
  return JSON.parse(this.request.match(/\?(.+?)$/)[1])
    .map(module =>
      `exports['${path.basename(module, '.js')}'] = require('${module}');`
    )
    .join('\n');
}

module.exports = loadLoader;
This loader parses the request's query that we rewrote with our custom resolver and creates a CommonJS module that looks like this:
exports['dependency-1'] = require('path/to/dependency-1');
exports['dependency-2'] = require('path/to/dependency-2');
3. Alias our own load-loader
// webpack.config.js
...
resolveLoader: {
  alias: {
    load: require.resolve('./loadLoader.js')
  }
},
4. Configure root
Since /path/to/dependency-1 is root-relative, we need to add the root to the webpack config:
// webpack.config.js
resolve: {
  root: '/absolute/path/to/root' // usually just __dirname
},
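With all four pieces wired up, a request like the one from the question should now resolve through the aliased loader (a sketch, using the load alias defined above):

define(
  ['load![./path/dependency-1, ./path/dependency-2]'],
  function (deps) {
    // deps['dependency-1'] and deps['dependency-2'] are available here
  }
)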
This is neither a beautiful nor an ideal solution, but should work as a makeshift until you've ported your modules.
I don't think that you should use a loader for that. Why don't you just write:
require("./path/dependency-1");
require("./path/dependency-2");
require("./path/dependency-3");
It accomplishes the same thing, is much more expressive and requires no extra code/loader/hack/configuration.
If you're still not satisfied, you might be interested in webpack contexts, which allow you to require a batch of files that match a given filter. So, if you write:
require("./template/" + name + ".jade");
webpack includes all modules that could be accessed by this expression without accessing parent directories. It's basically the same as writing:
require("./table.jade");
require("./table-row.jade");
require("./directory/folder.jade")
You can also create contexts manually like this:
var myRequire = require.context(
"./template", // search inside this directory
false, // false excludes sub-directories
/\.jade$/ // use this regex to filter files
);
var table = myRequire("./table.jade");
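The context also exposes its matches via .keys(), which makes requiring everything in bulk straightforward:

// require every .jade file the context matched
myRequire.keys().forEach(key => myRequire(key));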
