Package.json variable not being picked up - javascript

I have the following functionality that locally loads either out-of-stock or in-stock product updates depending on the command run in the terminal:
const updates = process.env.option === "oos" ? oosUpdates : insUpdates
log.info("updates: ", JSON.stringify(updates))
const result = await loadUpdateVariants(cache, updates as UpdateType[])
In my package.json I have:
"update:oos": "ts-node src/loaders/mock/update-variants.ts --option oos",
"update:ins": "ts-node src/loaders/mock/update-variants.ts --option ins"
However, when I run yarn update:oos I get the in-stock products, meaning the process.env.option === "oos" check is not working correctly.

process.env gives you access to environment variables. Command-line flags like --option oos end up in process.argv, not process.env, which is why your check never matches. If you want to use environment variables, you can set them like this:
"update:oos": "OPTION=oos ts-node src/loaders/mock/update-variants.ts",
"update:ins": "OPTION=ins ts-node src/loaders/mock/update-variants.ts"
This is the "vanilla" (without any extra dependencies) approach - but may encounter issues on Windows. If you want your command to work well cross-platform, you might consider using cross-env

Related

Fail Gatsby build if environment variable missing

I have experimented with adding environment variables to my Gatsby project using .env.development and .env.production files and it's working great.
I would like my builds to fail if one of the environment variables is missing; however, I can't see how to enable this functionality.
I have read through the Gatsby environment variables documentation, but can't see how this would work. Is this possible?
I believe it uses dotenv/webpack define plugin under the hood.
I’m sure there are other ways to do this, but with some quick tests, this approach seems to be working well for me.
In your gatsby-config.js file, you can choose to explicitly require dotenv, so you can use those environment variables in your config.
I added the following, and now the Gatsby build will fail unless the specified environment variables are present.
// Load the environment variables, per
// https://www.gatsbyjs.org/docs/environment-variables/#server-side-nodejs
require('dotenv').config({
  path: `.env.${process.env.NODE_ENV}`,
})

function checkEnv(envName) {
  if (typeof process.env[envName] === 'undefined' || process.env[envName] === '') {
    throw `Missing required environment variables: ${envName}`
  }
}

try {
  checkEnv('NODE_ENV')
  checkEnv('EXAMPLE_MISSING_ENV')
  checkEnv('EXAMPLE_API_KEY')
} catch (e) {
  throw new Error(e)
}

// The rest of the config file
I could imagine customizing this further, e.g. logging a warning for a variable with a fallback versus throwing an error for one that is required by your content sourcing plugin or theme. Hope this is helpful as a starting point!
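For example, a minimal sketch of that split (the variable names here are placeholders):

function requireEnv(envName) {
  if (!process.env[envName]) {
    throw new Error(`Missing required environment variable: ${envName}`)
  }
}

function warnEnv(envName, fallback) {
  if (!process.env[envName]) {
    console.warn(`${envName} not set, falling back to "${fallback}"`)
    process.env[envName] = fallback
  }
}

requireEnv('EXAMPLE_API_KEY')   // hard failure: stop the build
warnEnv('EXAMPLE_LOCALE', 'en') // soft failure: warn and continue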
I couldn't find a built-in solution for this in Gatsby either. You can do it manually, but it's still not that easy.
First problem: if you want to load your environment from a file while running an npm script, it cannot be loaded right away. But you can trigger a script file that loads the environment variables before your check.
Let's say build.sh in the root directory of the project:
source ./.env.development # this line will set env variables
if [ "$API_KEY" = 927349872349798 ] ; then
npm run build
fi
Another problem arises: some developers might want to run it on Windows. So it's better to use the well-known cross-env package.
npm i cross-env
Then everything is ready; add your secure-build script:
"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"format": "prettier --write \"**/*.{js,jsx,json,md}\"",
"start": "npm run develop",
"serve": "gatsby serve",
"clean": "gatsby clean",
"test": "echo \"Write tests! -> https://gatsby.dev/unit-testing\" && exit 1",
"secure-build": "cross-env-shell \"./build.sh\""
},
And run it:
npm run secure-build
This solution looks like too much to me, as we created a build.sh and installed a new package. Maybe there is a cleaner solution; I am no Gatsby guru, after all.
I added env checking to the onPreInit life cycle hook in gatsby-node.ts:
import { GatsbyNode } from "gatsby";

const envVariablesList = [
  "ENV1",
  "ENV2",
  "ENV3",
];

function envVarChecker(vars: string[]): string | undefined {
  return vars.find(
    (item) => process.env[item] === undefined || process.env[item] === ""
  );
}

export const onPreInit: GatsbyNode["onPreInit"] = () => {
  const emptyEnv = envVarChecker(envVariablesList);
  if (emptyEnv !== undefined) {
    throw new Error(`Env variable: ${emptyEnv} is empty!`);
  }
};
It fails the build almost at the very beginning (during the pre-bootstrap phase) if any of the declared variables is missing.

How to populate author, branch, commit, message in Cypress Dashboard in CI with Jenkins?

I'm trying to get the commit information details in the Cypress Dashboard. I haven't been able to accomplish it just yet, but I have made some progress...
I'll describe what I have done so far:
Installed the commit-info npm package by running the command:
npm install --save @cypress/commit-info
Imported the plugin in the plugins/index.js file like so:
const { commitInfo } = require('@cypress/commit-info');

module.exports = on => {
  on('file:preprocessor', file => {
    commitInfo().then(console.log);
  });
};
Now I get all the information (author, branch, commit & message) in the terminal! :)
However, I still don't have the information details linked to my Cypress Dashboard.
What're the next steps? The documentation is not clear to me...
In our case we run everything inside a Docker container. We copy our code into the container but do not copy the .git directory; it's large, time-consuming to copy, and we don't need it. @cypress/commit-info assumes there is a .git directory, so since there isn't one, it doesn't work.
We overcame this by setting the values cypress expects explicitly in the cypress run command in our Jenkinsfile:
def commitMessage = sh(script:"git log --format=%B -n 1 ${env.GIT_COMMIT}", returnStdout:true).trim()
def commitAuthor = sh(script:"git log --format='%an' -n 1 ${env.GIT_COMMIT}", returnStdout:true).trim()
def commitEmail = sh(script:"git log --format='%ae' -n 1 ${env.GIT_COMMIT}", returnStdout:true).trim()
def cypressVars = "COMMIT_INFO_BRANCH=${env.GIT_BRANCH} COMMIT_INFO_SHA=${env.GIT_COMMIT} COMMIT_INFO_REMOTE=${env.GIT_URL} COMMIT_INFO_MESSAGE=\"${commitMessage}\" COMMIT_INFO_AUTHOR=\"${commitAuthor}\" COMMIT_INFO_EMAIL=${commitEmail}"
// call cypress however you do and include cypressVars as part of the command
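For example, the last step might look something like this (a sketch; the record key and the exact cypress invocation are placeholders for however you already run it):

sh "${cypressVars} npx cypress run --record --key ${env.CYPRESS_RECORD_KEY}"

Since Cypress reads the COMMIT_INFO_* environment variables at run time, prefixing the command this way should be enough for the Dashboard to pick them up.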

Bazel: How do I use the nodejs_binary rule to do "npm run start"

How do I use the nodejs_binary rule to do a standard npm run start? I am able to run a typical node project using this rule. However, I want to run the start script in package.json. So far I have the following in my build file:
load("#build_bazel_rules_nodejs//:defs.bzl", "nodejs_binary")
nodejs_binary(
name = "app",
data = [":app_files"],
node="#nodejs//:bin/npm",
entry_point = "workspace_name/src/server.js",
node_modules = "#npm_deps//:node_modules",
args=["start"]
)
This does not start the server; somehow the npm command is not running properly. It prints the command's usage, as if the invocation were incomplete.
I am currently able to do this within the WORKSPACE:
bazel run @nodejs//:bin/yarn (runs yarn install and installs all node_modules)
bazel run @nodejs//:bin/npm start (this starts the server)
In my package.json I have:
{
  "scripts": {
    "start": "babel-node src/server.js",
    ...
  }
  ...
}
How do I get this to work with the nodejs_binary rule, and subsequently node_image?
I changed from using npm to using yarn; workspace_name/src/server.js is called now, but then I had a different set of problems: babel-node was not found.
I modified the rule a bit. After careful study, I realised that there is a dependency on babel-node that is not satisfied at the time yarn run start is called. The following worked after I had run bazel run @nodejs//:bin/yarn before running the rule:
nodejs_binary(
    name = "app",
    args = ["start"],
    data = [
        ":app_files",
        "@//:node_modules",
    ],
    entry_point = "workspace_name/src/server.js",
    node = "@nodejs//:bin/yarn",
    node_modules = "@npm_deps//:node_modules",
)
It appears that "@//:node_modules" solves the babel-node dependency issue. So the rule above does not work on its own; it needs me to run bazel run @nodejs//:bin/yarn first (much like an npm/yarn install, to create the node_modules that make the babel-node dependency available when npm/yarn start is run).
So my problem is that I do not want to have to manually run bazel run @nodejs//:bin/yarn before executing my rule. How do I avoid that?
I suppose it would work if I stopped depending on babel-node, but then I would have to change my code to not use ES6 syntax (and that is a hassle). Is there a way I can do this with a genrule, or something similar?
What I ended up doing was making a babel nodejs_binary rule, then using it to compile my source files in a genrule:
# Make babel binary
nodejs_binary(
    name = "babel",
    entry_point = "npm_deps/node_modules/babel-cli/bin/babel",
    install_source_map_support = False,
    node_modules = "@npm_deps//:node_modules",
)
# Compile source files with babel
genrule(
    name = "compiled_src",
    srcs = [
        ":src_files",
    ],
    outs = ["src"],
    cmd = "$(location babel) src --out-dir $@",
    tools = [":babel"],
)
Note that in this case src in cmd = "$(location babel) src --out-dir $@" is a folder in the :src_files filegroup.
filegroup(
    name = "src_files",
    srcs = glob([
        "src/**/*",
        ...
    ]),
)
After this it was unnecessary to use npm start; I just used the default node. I could simply do:
nodejs_binary(
    name = "app",
    data = [":compiled_src"],
    entry_point = "workspace_name/src/server.js",
    node_modules = "@npm_deps//:node_modules",
)

Test two different npm package versions at the same time

When I create an npm package, I sometimes need to remain backward compatible with older versions of a dependency.
If the new version has a new API, I may write the code in this pattern:
import pkg from 'some-pkg';

const isNewVersion = typeof pkg.newVersionApi !== 'undefined';

if (isNewVersion) {
  pkg.newVersionApi();
} else {
  pkg.oldVersionApi(); // backward compatible api
}
With this pattern, when I write tests, I can only test the code path for the installed version. The other version's code path can't be tested.
For a real example: React v16 has the new Portal API. Before Portals were released, v15 had the unstable_renderSubtreeIntoContainer API for a similar feature.
So the code for React would look like:
import ReactDOM from 'react-dom';

const isV16 = typeof ReactDOM.createPortal !== 'undefined';

if (isV16) {
  ReactDOM.createPortal(...);
} else {
  ReactDOM.unstable_renderSubtreeIntoContainer(...);
}
So I want to ask: is there any method to test against different dependency versions?
Currently, the only method I can think of is to install the other version and test again. But that only works locally; it can't run on CI, and the coverage can't be counted together.
I think this is not only a React testing issue; it can come up in Node.js testing as well. Any suggestions are welcome.
Updated
This question may be related to installing two versions of a dependency in npm, but as far as I know, installing two versions of the same dependency is currently not workable.
Here is a possible solution; I'm not sure it will work exactly as you expect, but it should give you a direction to move forward.
package.json
{
  "name": "express-demo",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "cookie-parser": "~1.4.3",
    "debug": "~2.6.3",
    "express": "~4.15.2",
    "jade": "~1.11.0",
    "morgan": "~1.8.1",
    "serve-favicon": "~2.4.2",
    "webpack": "^3.8.1",
    "webpack-dev-middleware": "^1.12.0",
    "webpack-hot-middleware": "^2.20.0"
  },
  "customDependencies": {
    "body-parser": [
      "",
      "1.18.1",
      "1.18.0"
    ]
  }
}
Note that in the above package.json file I have added a new key, customDependencies, which I will use for installing multiple versions. Here I am using the body-parser package for the demo. Next you need a file that can read this key and install the deps.
install-deps.js
const { spawnSync } = require('child_process');

const customDependencies = require('./package.json').customDependencies;

spawnSync('mkdir', ['./node_modules/.tmp']);

for (var dependency in customDependencies) {
  // Install each version and stash it in a temp folder so the next
  // install does not overwrite it.
  customDependencies[dependency].forEach((version) => {
    console.log(`Installing ${dependency}@${version}`);
    if (version) {
      spawnSync('npm', ['install', `${dependency}@${version}`]);
      spawnSync('mv', [`./node_modules/${dependency}`, `./node_modules/.tmp/${dependency}@${version}`]);
    } else {
      spawnSync('npm', ['install', `${dependency}`]);
      spawnSync('mv', [`./node_modules/${dependency}`, `./node_modules/.tmp/${dependency}`]);
    }
  });
  // Move the stashed copies back into node_modules under versioned names.
  customDependencies[dependency].forEach((version) => {
    console.log(`Moving ${dependency}@${version}`);
    if (version) {
      spawnSync('mv', [`./node_modules/.tmp/${dependency}@${version}`, `./node_modules/${dependency}@${version}`]);
    } else {
      spawnSync('mv', [`./node_modules/.tmp/${dependency}`, `./node_modules/${dependency}`]);
    }
  });
}

spawnSync('rm', ['-rf', './node_modules/.tmp']);
console.log(`Installing Deps finished.`);
Here I am installing the deps one by one into a temp folder and, once installed, moving them into the ./node_modules folder.
Once everything is installed, you can check the versions like below:
index.js
var bodyParser = require('body-parser/package.json');
var bodyParser1181 = require('body-parser@1.18.1/package.json');
var bodyParser1180 = require('body-parser@1.18.0/package.json');

console.log(bodyParser.version);
console.log(bodyParser1181.version);
console.log(bodyParser1180.version);
Hope, this will serve your purpose.
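To avoid running the installer by hand, you could expose it as an npm script and invoke it right after the normal install (a sketch; the script file name matches the one above):

"scripts": {
  "start": "node ./bin/www",
  "install-custom-deps": "node install-deps.js"
}

npm install && npm run install-custom-deps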
Create 3 separate projects (folders with package.json) and a shared folder:
A shared folder containing the test module (my-test). Export a function to run the test;
A client project importing my-test and dependency v1. Export a function that calls the test function in my-test.
A client project importing my-test and dependency v2. Export a function that calls the test function in my-test.
A master project that imports both client projects. Run each exported function.
You're going to have to run them separately. Create a separate project folder for each dependency version, e.g. React10, React11, React12. Each will have its own package.json pinned to the correct version. When you run the integration and/or versioned tests, you'll run your standard unit tests across each version; it may also be advisable to add any version-specific unit tests to that folder.
Creating a makefile would make your life easier when running your full testing suite, and if you do this you can easily integrate it into CI (an npm-only sketch of the same idea follows below).
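If you'd rather not introduce make, the same orchestration can live in npm scripts (a sketch; the folder names follow the example above):

"scripts": {
  "test:react10": "npm --prefix React10 test",
  "test:react11": "npm --prefix React11 test",
  "test:react12": "npm --prefix React12 test",
  "test:all": "npm run test:react10 && npm run test:react11 && npm run test:react12"
}

A single npm run test:all then runs the suite against every pinned version, which is also a convenient entry point for CI.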
In addition to the other suggestions, you can try the approach described at Testing multiple versions of a module dependency.
Here's an example of using this approach to test against multiple webpack versions:
npm install --save-dev module-alias
npm install --save-dev webpack-v4@npm:webpack@4.46.0
npm install --save-dev webpack-v5@npm:webpack@5.45.1
The module-alias package handles the magic of switching between package versions while still supporting normal require('webpack') (or whatever your module is) calls.
The other installs will create two versions of your dependency, each with a distinct directory name within your local node_modules/.
Then, within your test code, you can set up the dependency alias via:
const path = require('path');

require('module-alias').addAlias(
  'webpack',
  path.resolve('node_modules', 'webpack-v4'),
);
// require('webpack') will now pull in webpack-v4
You'd do the same thing for 'webpack-v5' in a different test harness.
If any of your sub-dependencies have a hardcoded require('webpack') anywhere in their own code, this will ensure that they also pull in the correct webpack version.
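Putting it together, a sketch of one version-specific test file (assuming a Jest-style runner; the file and test names are illustrative):

// webpack-v4.test.js
const path = require('path');
require('module-alias').addAlias(
  'webpack',
  path.resolve('node_modules', 'webpack-v4'),
);

// Both of these now resolve inside node_modules/webpack-v4,
// including any require('webpack') made by sub-dependencies.
const webpack = require('webpack');
const { version } = require('webpack/package.json');

test('runs against the aliased webpack 4', () => {
  expect(version.startsWith('4.')).toBe(true);
});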

How do I load my script into the node.js REPL?

I have a script foo.js that contains some functions I want to play with in the REPL.
Is there a way to have node execute my script and then jump into a REPL with all the declared globals, like I can with python -i foo.py or ghci foo.hs?
There is still nothing built-in that provides the exact functionality you describe. However, an alternative to using require is to use the .load command within the REPL, like so:
.load foo.js
It loads the file line by line, just as if you had typed it into the REPL. Unlike require, this pollutes the REPL history with the commands you loaded. However, it has the advantage of being repeatable, because it is not cached the way require is.
Which is better for you will depend on your use case.
Edit: It has limited applicability because it does not work in strict mode, but three years later I have learned that if your script does not have 'use strict', you can use eval to load your script without polluting the REPL history:
var fs = require('fs');
eval(fs.readFileSync('foo.js').toString())
I always use this command:
node -i -e "$(< yourScript.js)"
It works exactly as in Python, without any packages.
I made Vorpal.js, which handles this problem by turning your node app into an interactive CLI. It supports a REPL extension, which drops you into a REPL within the context of your running app.
var vorpal = require('vorpal')();
var repl = require('vorpal-repl');

vorpal
  .delimiter('myapp>')
  .use(repl)
  .show()
  .parse(process.argv);
Then you can run the app and it will drop into a REPL.
$ node myapp.js repl
myapp> repl:
Another way is to define those functions as globals:
global.helloWorld = function() { console.log("Hello World"); }
Then preload the file in the REPL as:
node -r ./file.js
Then the function helloWorld can be accessed directly in the REPL.
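A slightly fuller sketch of the same idea (the file and function names are just examples):

// file.js -- anything attached to `global` survives into the REPL
global.helloWorld = function () { console.log('Hello World'); };
global.add = (a, b) => a + b;

$ node -r ./file.js
> helloWorld()
Hello World
> add(2, 3)
5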
Here's a bash function version of George's answer:
noderepl() {
  FILE_CONTENTS="$(< $1 )"
  node -i -e "$FILE_CONTENTS"
}
If you put this in your ~/.bash_profile you can use it like an alias, i.e.:
noderepl foo.js
I created replpad since I got tired of reloading the script repeatedly.
Simply install it via: npm install -g replpad
Then use it by running: replpad
If you want it to watch all files in the current directory and all subdirectories, and pipe them into the repl when they change, run: replpad .
Check out the videos on the site to get a better idea of how it works and learn about some other nice features it has, like these:

- access core module docs in the repl via the dox() function that is added to every core function, i.e. fs.readdir.dox()
- access user module readmes in the repl via the dox() function that is added to every module installed via npm, i.e. marked.dox()
- access a function's highlighted source code, info on where the function was defined (file, line number) and function comments and/or jsdocs where possible via the src property that is added to every function, i.e. express.logger.src
- scriptie-talkie support (see the .talk command)
- adds commands and keyboard shortcuts
- vim key bindings
- key map support
- parens matching via the match token plugin
- appends code entered in the repl back to the file via a keyboard shortcut or the .append command
See: https://github.com/thlorenz/replpad
Why not load the file into an interactive node repl?
node -h
  -e, --eval script     evaluate script
  -i, --interactive     always enter the REPL even if stdin does not appear to be a terminal
node -e 'var client = require("./build/main/index.js"); console.log("Use `client` in repl")' -i
Then you can add it to the package.json scripts:
"repl": "node -e 'var client = require(\"./build/main/index.js\"); console.log(\"Use `client` in repl\")' -i",
Tested using node v8.1.2.
Currently you can't do that directly, but you can do mylib = require('./foo.js') in the REPL. Remember that methods are exported, not declared as globals.
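For example, given a foo.js that exports the functions you want to play with (a sketch):

// foo.js
function greet(name) { return `Hello, ${name}!`; }
module.exports = { greet };

$ node
> const mylib = require('./foo.js')
> mylib.greet('world')
'Hello, world!'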
replpad is cool, but for a quick and easy way to load a file into node, import its variables, and start a repl, you can add the following code to the end of your .js file:
if (require.main === module) {
  (function() {
    var _context = require('repl').start({ prompt: '$> ' }).context;
    var scope = require('lexical-scope')(require('fs').readFileSync(__filename));
    for (var name in scope.locals[''])
      _context[scope.locals[''][name]] = eval(scope.locals[''][name]);
    for (name in scope.globals.exported)
      _context[scope.globals.exported[name]] = eval(scope.globals.exported[name]);
  })();
}
Now if your file is src.js, running node src.js will start node, load the file, start a REPL, and copy all the objects declared as var at the top level, as well as any exported globals.
The if (require.main === module) ensures that this code will not be executed if src.js is included through a require statement. In fact, you can add any code you want to be executed only when you are running src.js standalone, for debugging purposes, inside the if statement.
Another suggestion that I do not see here: try this little bit of code
#!/usr/bin/env node
'use strict';
const repl = require('repl');
const cli = repl.start({ replMode: repl.REPL_MODE_STRICT });
cli.context.foo = require('./foo'); // injects it into the repl
Then you can simply run this script, and it will include foo as a variable.
Old answer
type test.js|node -i
This will open the node REPL and type in all the lines from test.js into the REPL, but for some reason node will quit after the file ends.
Another problem is that functions will not be hoisted.
Better answer
node -e "require('repl').start({useGlobal:true});" -r ./test2.js
Then all globals declared without var within test2.js will be available in the REPL.
I'm not sure why a var a in global scope is not available; presumably because a top-level var in a preloaded file is scoped to that module rather than attached to the REPL's global object.
There is also the official Node.js REPL, which supports async methods as well (when started with the flag shown below):
console.js
const repl = require('repl')
const mongoose = require('mongoose')

const run = async () => {
  await mongoose.connect(process.env.DB_URL, {
    useNewUrlParser: true,
    useUnifiedTopology: true
  })

  const r = repl.start(`(${process.env.NODE_ENV}) ⚡️ `)
  r.context.User = require('./src/models/user.model')
  r.context.mongoose = mongoose
  console.log(`Ready 🚀`)
}

run()
Start the console:
NODE_OPTIONS=--experimental-repl-await node console.js
The User model is exposed to the console:
await User.find({})
