While building my Angular app, I need to read the app version, which lives in my package.json file, and copy it to another file which is not a JSON file.
My package.json file:
{
"name": "my app",
"author": "ME",
"version": "0.1.3",
"scripts": {
"ng": "ng",
"start": "ng serve --public-host --port 4222 http://localhost:4222/",
"build": "ng build",
"sync-version": "sync-json -v --property version --source package.json projects/cockpit-tools/package.json && sync-json -v --property version --source package.json projects/cockpit-tools-demo/src/assets/configuration/properties.json",
"sync-custom": "shx node -p -e \"require('./package.json').version\"",
"copy-version":"MY_COMMAND"
...
}
My target file is a properties file which looks like this, and it is located in the same directory as package.json (this file is used at another level of automation, which is why I can't change it):
app-infos.properties
project.version=0.0.0
project.author=ME
My goal is to write a command-line script that I can run to synchronize/copy the version from package.json to the project.version property inside my target file.
I've tried an npm library called sync-version, but that works only when the target is a JSON file.
I've also tried to do it with a shell command line via the shx package; that didn't work as expected either.
I've also tried some Node.js tricks (require ...), but the problem persists.
I need to do this in the simplest possible way, without being obliged to install Linux tools or similar (jq or other), since this will run in CI/CD contexts, so it should be as native and as generic as possible.
Suggestions?
A classic DevOps situation :)
CI/CD usually happens on Linux environments.
You will probably have plain shell/bash.
Try extracting the version into a variable:
myVersion=`grep '"version"' package.json | awk -F '"' '{print $4}'`
then replacing it in app-infos.properties:
sed -i "s/0.0.0/$myVersion/" app-infos.properties
We could make it more generic by always replacing the current version instead of 0.0.0,
but I think you should keep a placeholder like 0.0.0.
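If you would rather not depend on grep/awk/sed being present on the build agent, Node itself can do the replacement, since it is already installed for the Angular build. A minimal sketch (the file name sync-version.js and the copy-version wiring are assumptions, not something taken from your project):

// sync-version.js -- run via: "copy-version": "node sync-version.js"
const fs = require('fs');

// take the version straight from package.json
const { version } = require('./package.json');

// rewrite only the project.version line in the properties file
const file = 'app-infos.properties';
const props = fs.readFileSync(file, 'utf8');
const updated = props.replace(/^project\.version=.*$/m, `project.version=${version}`);
fs.writeFileSync(file, updated);
console.log(`project.version set to ${version} in ${file}`);

This behaves the same on Linux CI runners and on Windows agents, and needs no extra npm packages.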
How do I set environment variables from within package.json, to be used with npm start-like commands?
Here's what I currently have in my package.json:
{
...
"scripts": {
"help": "tagove help",
"start": "tagove start"
}
...
}
I want to set environment variables (like NODE_ENV) in the start script while still being able to start the app with just one command, npm start.
Set the environment variable in the script command:
...
"scripts": {
"start": "node app.js",
"test": "NODE_ENV=test mocha --reporter spec"
},
...
Then use process.env.NODE_ENV in your app.
Note: This is for Mac & Linux only. For Windows refer to the comments.
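For example, a minimal sketch of reading that variable inside the app (the file name app.js is just illustrative):

// app.js
const env = process.env.NODE_ENV || 'development'; // falls back when the variable is not set
if (env === 'test') {
  console.log('running with the test configuration');
}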
Just use NPM package cross-env. Super easy. Works on Windows, Linux, and all environments. Notice that you don't use && to move to the next task. You just set the env and then start the next task. Credit to @mikekidder for the suggestion in one of the comments here.
From documentation:
{
"scripts": {
"build": "cross-env NODE_ENV=production OTHERFLAG=myValue webpack --config build/webpack.config.js"
}
}
Notice that if you want to set multiple global vars, you just state them in succession, followed by your command to be executed.
Ultimately, the command that is executed (using spawn) is:
webpack --config build/webpack.config.js
The NODE_ENV environment variable will be set by cross-env
I just wanted to add my two cents here for future Node explorers. On my Ubuntu 14.04, NODE_ENV=test didn't work; I had to use export NODE_ENV=test, after which NODE_ENV=test started working too. Weird.
On Windows, as has been said, you have to use set NODE_ENV=test, but for a cross-platform solution the cross-env library didn't seem to do the trick, and do you really need a library to do this:
export NODE_ENV=test || set NODE_ENV=test&& yadda yadda
The vertical bars are needed, as otherwise Windows would crash on the unrecognized export NODE_ENV command. I don't know about trailing spaces, but just to be sure I removed them too.
Because I often find myself working with multiple environment variables, I find it useful to keep them in a separate .env file (make sure to ignore this from your source control). Then (in Linux) prepend export $(cat .env | xargs) && in your script command before starting your app.
Example .env file:
VAR_A=Hello World
VAR_B=format the .env file like this with new vars separated by a line break
Example index.js:
console.log('Test', process.env.VAR_A, process.env.VAR_B);
Example package.json:
{
...
"scripts": {
"start": "node index.js",
"env-linux": "export $(cat .env | xargs) && env",
"start-linux": "export $(cat .env | xargs) && npm start",
"env-windows": "(for /F \"tokens=*\" %i in (.env) do set %i)",
"start-windows": "(for /F \"tokens=*\" %i in (.env) do set %i) && npm start",
}
...
}
Unfortunately I can't seem to set the environment variables by calling a script from a script -- like "start-windows": "npm run env-windows && npm start" -- so there is some redundancy in the scripts.
For a test you can see the env variables by running npm run env-linux or npm run env-windows, and test that they make it into your app by running npm run start-linux or npm run start-windows.
Try this on Windows by replacing YOURENV:
{
...
"scripts": {
"help": "set NODE_ENV=YOURENV && tagove help",
"start": "set NODE_ENV=YOURENV && tagove start"
}
...
}
@Luke's answer was almost the one I needed! Thanks.
As the selected answer is very straightforward (and correct) but old, I would like to offer an alternative: importing variables from a separate .env file when running your scripts, fixing some limitations of Luke's answer.
Try this:
::: .env file :::
# This way, you CAN use comments in your .env files
NODE_PATH="src/"
# You can also have extra/empty lines in it
SASS_PATH="node_modules:src/styles"
Then, in your package.json, create a script that sets the variables, and run it before the scripts that need them:
::: package.json :::
scripts: {
"set-env": "export $(cat .env | grep \"^[^#;]\" |xargs)",
"storybook": "npm run set-env && start-storybook -s public"
}
Some observations:
The regular expression in the grep'ed cat command will clear the comments and empty lines.
The && don't need to be "glued" to npm run set-env, as it would be required if you were setting the variables in the same command.
If you are using yarn, you may see a warning; you can either change it to yarn set-env or use npm run set-env --scripts-prepend-node-path && instead.
Different environments
Another advantage when using it is that you can have different environment variables.
scripts: {
"set-env:production": "export $(cat .production.env | grep \"^[^#;]\" |xargs)",
"set-env:development": "export $(cat .env | grep \"^[^#;]\" |xargs)",
}
Please, remember not to add .env files to your git repository when you have keys, passwords or sensitive/personal data in them!
UPDATE: This solution may break in npm v7 due to npm RFC 21
CAVEAT: no idea if this works with yarn
npm (and yarn) passes a lot of data from package.json into scripts as environment variables. Use npm run env to see them all. This is documented in https://docs.npmjs.com/misc/scripts#environment and applies not only to "lifecycle" scripts like prepublish but also to any script executed by npm run.
You can access these inside code (e.g. process.env.npm_package_config_port in JS) but they're already available to the shell running the scripts so you can also access them as $npm_... expansions in the "scripts" (unix syntax, might not work on windows?).
The "config" section seems intended for this use:
"name": "myproject",
...
"config": {
"port": "8010"
},
"scripts": {
"start": "node server.js $npm_package_config_port",
"test": "wait-on http://localhost:$npm_package_config_port/ && node test.js http://localhost:$npm_package_config_port/"
}
An important quality of these "config" fields is that users can override them without modifying package.json!
$ npm run start
> myproject@0.0.0 start /home/cben/mydir
> node server.js $npm_package_config_port
Serving on localhost:8010
$ npm config set myproject:port 8020
$ git diff package.json # no change!
$ cat ~/.npmrc
myproject:port=8020
$ npm run start
> myproject@0.0.0 start /home/cben/mydir
> node server.js $npm_package_config_port
Serving on localhost:8020
See npm config and yarn config docs.
It appears that yarn reads ~/.npmrc so npm config set affects both, but yarn config set writes to ~/.yarnrc, so only yarn will see it :-(
For a larger set of environment variables or when you want to reuse them you can use env-cmd.
As a plus, the .env file would also work with direnv.
./.env file:
# This is a comment
ENV1=THANKS
ENV2=FOR ALL
ENV3=THE FISH
./package.json:
{
"scripts": {
"test": "env-cmd mocha -R spec"
}
}
This will work in Windows console:
"scripts": {
"setAndStart": "set TMP=test&& node index.js",
"otherScriptCmd": "echo %TMP%"
}
npm run setAndStart
output:
test
See this answer for details.
Suddenly I found that actionhero is using the following code, which solved my problem: just pass --NODE_ENV=production as an option in the start script command.
if(argv['NODE_ENV'] != null){
api.env = argv['NODE_ENV'];
} else if(process.env.NODE_ENV != null){
api.env = process.env.NODE_ENV;
}
I would really appreciate being able to accept the answer of someone who knows a better way to set environment variables in package.json or an init script or similar, where the app is bootstrapped by someone else.
Use Git Bash on Windows. Git Bash processes commands differently than cmd.
Most Windows command prompts will choke when you set environment variables with NODE_ENV=production like that. (The exception is Bash on Windows, which uses native Bash.) Similarly, there's a difference in how windows and POSIX commands utilize environment variables. With POSIX, you use: $ENV_VAR and on windows you use %ENV_VAR%. - cross-env doc
{
...
"scripts": {
"help": "tagove help",
"start": "env NODE_ENV=production tagove start"
}
...
}
Use the dotenv package to declare the env variables.
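A minimal sketch of that approach, assuming dotenv is installed and a .env file sits next to package.json:

// index.js -- load .env into process.env before anything else reads it
require('dotenv').config();
console.log(process.env.NODE_ENV); // the value now comes from the .env file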
For single environment variable
"scripts": {
"start": "set NODE_ENV=production&& node server.js"
}
For multiple environment variables
"scripts": {
"start": "set NODE_ENV=production&& set PORT=8000&& node server.js"
}
When the NODE_ENV environment variable is set to 'production' all devDependencies in your package.json file will be completely ignored when running npm install. You can also enforce this with a --production flag:
npm install --production
For setting NODE_ENV you can use any of these methods
method 1: set NODE_ENV for all node apps
Windows :
set NODE_ENV=production
Linux, macOS or other unix based system :
export NODE_ENV=production
This sets NODE_ENV for the current shell session, so any apps started after this statement will have NODE_ENV set to production.
method 2: set NODE_ENV for current app
NODE_ENV=production node app.js
This will set NODE_ENV for the current app only. This helps when we want to test our apps on different environments.
method 3: create .env file and use it
This uses the idea explained here; refer to this post for a more detailed explanation.
Basically, you create a .env file and run some bash scripts to load its values into the environment.
To avoid writing a bash script, the env-cmd package can be used to load the environment variables defined in the .env file.
env-cmd .env node app.js
method 4: Use cross-env package
This package allows environment variables to be set in one way for every platform.
After installing it with npm, you can just add it to your deployment script in package.json as follows:
"build:deploy": "cross-env NODE_ENV=production webpack"
{
...
"scripts": {
"start": "ENV NODE_ENV=production someapp --options"
}
...
}
Most elegant and portable solution:
package.json:
"scripts": {
"serve": "export NODE_PRESERVE_SYMLINKS_MAIN=1 && vue-cli-service serve"
},
On Windows, create export.cmd and put it somewhere in your %PATH%:
@echo off
set %*
If you:
Are currently using Windows;
Have git bash installed;
Don't want to use set ENV=... in your package.json, which would make it runnable only on Windows dev machines;
Then you can set the script shell of node from cmd to git bash and write linux-style env setting statements in package.json for it to work on both Windows/Linux/Mac.
$ npm config set script-shell "C:\\Program Files\\git\\bin\\bash.exe"
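After switching the script shell, plain POSIX-style assignments in package.json work on that Windows machine as well; for example (script and app names are illustrative):

"scripts": {
  "start": "NODE_ENV=production node app.js"
}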
Although this doesn't directly answer the question, I'd like to share an idea on top of the other answers. From what I gathered, each of them adds some level of complexity to achieve cross-platform independence.
In my scenario, all I originally wanted was a variable to control whether or not to secure the server with JWT authentication (for development purposes).
After reading the answers I decided simply to create 2 different files, with authentication turned on and off respectively.
"scripts": {
"dev": "nodemon --debug index_auth.js",
"devna": "nodemon --debug index_no_auth.js",
}
The files are simply wrappers that call the original index.js file (which I renamed to appbootstrapper.js):
//index_no_auth.js authentication turned off
const bootstrapper = require('./appbootstrapper');
bootstrapper(false);
//index_auth.js authentication turned on
const bootstrapper = require('./appbootstrapper');
bootstrapper(true);
class AppBootStrapper {
init(useauth) {
//real initialization
}
}
Perhaps this can help someone else
Running a node.js script from package.json with multiple environment variables:
package.json file:
"scripts": {
"do-nothing": "set NODE_ENV=prod4 && set LOCAL_RUN=true && node ./x.js",
},
The x.js file can be something like:
let env = process.env.NODE_ENV;
let isLocal = process.env.LOCAL_RUN;
console.log("ENV" , env);
console.log("isLocal", isLocal);
You should not set ENV variables in package.json. actionhero uses NODE_ENV to allow you to change configuration options which are loaded from the files in ./config. Check out the redis config file, and see how NODE_ENV is used to change database options when NODE_ENV=test.
If you want to use other ENV variables to set things (perhaps the HTTP port), you still don't need to change anything in package.json. For example, if you set PORT=1234 in the environment and want to use that as the HTTP port when NODE_ENV=production, just reference it in the relevant config file, i.e.:
// in config/servers/web.js
exports.production = {
servers: {
web: function(api){
return {
port: process.env.PORT
}
}
}
}
In addition to using cross-env as documented above: to set a few environment variables within a package.json run script, if your script runs Node.js you can have Node pre-require dotenv/config:
{
scripts: {
"eg:js": "node -r dotenv/config your-script.js",
"eg:ts": "ts-node -r dotenv/config your-script.ts",
"test": "ts-node -r dotenv/config -C 'console.log(process.env.PATH)'",
}
}
This will cause your node interpreter to require dotenv/config, which will itself read the .env file in the present working directory from which node was called.
The .env format is lax or liberal:
# Comments are permitted
FOO=123
BAR=${FOO}
BAZ=Basingstoke Round About
#Blank lines are no problem
Note: in order to set multiple environment variables, the script should go like this:
"scripts": {
"start": "set NODE_ENV=production&& set MONGO_USER=your_DB_USER_NAME&& set MONGO_PASSWORD=DB_PASSWORD&& set MONGO_DEFAULT_DATABASE=DB_NAME&& node app.js",
},
For several days I have been trying to build a node module which will be used by a React application.
Today my module is packaged as one big JavaScript file built with webpack.
Part of this node module is optional and is only needed if certain features are used (activated through config). To avoid loading unnecessary parts of the code, I first tried chunking and dynamic import as explained by webpack, but this did not work: chunks are created and packaged in the dist/ folder, but I never succeeded in running the dynamic import in the node module when it is called from the React app. The application complains that the chunk cannot be loaded. This makes sense to me, as this webpack feature was built more to dynamically load parts of an application than as an internal mechanism for a node module dependency to load code on demand (but I could be wrong).
I looked at other projects like Babel or react-router, where the module is split into several packages using lerna and yarn. So I tried to build the library with lerna, with different packages:
my-library
core/
src/
index.js
another-package/
src/
index.js
I want the index.js file from core to call a method from index.js of another-package only if needed, and only if the node module @my-library/another-package is installed. But I never found a way to do it.
Is it possible to achieve this with ES5/ES6 using webpack/lerna, or did I take the wrong approach?
UPDATE October 27th
After several tests, I was able to use Aram's solution with plain HTML/JS (https://github.com/PixelDuck/lerna-webpack/blob/main/a-react-app/src/client/test.html), but the solution does not work with a React app packaged with webpack: https://github.com/PixelDuck/lerna-webpack/blob/main/a-react-app/src/client/App.js.
The code is available there: https://github.com/PixelDuck/lerna-webpack.
Open a terminal to my-lerna-library and run
yarn install
yarn link:all, this will create a symbolic link for each package
yarn dev, this will create the bundle and listen for changes
then open a new terminal to folder a-react-app:
yarn install
yarn link "my-lerna-library"
yarn link "#my-lerna-library/another-package"
`yarn dev``
a page will be open on http://0.0.0.0:3000 and you will see that module #my-lerna-library/another-package is not found.
If you open http://0.0.0.0:5000/test.html the plainJS test, everything is looking fine.
It seems that the issue is on the webpack side for the react app because when debugging the application i can see that the core package is looking to a library name my_lerna_library__WEBPACK_IMPORTED_MODULE_3__ which is not the one used by webpack when loading the other package _my_lerna_library_another_package__WEBPACK_IMPORTED_MODULE_2___default
__webpack_require__.r(__webpack_exports__);
/* harmony import */ var react__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! react */ "./node_modules/react/index.js");
/* harmony import */ var _App_css__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ./App.css */ "./src/client/App.css");
/* harmony import */ var _App_css__WEBPACK_IMPORTED_MODULE_1___default = /*#__PURE__*/__webpack_require__.n(_App_css__WEBPACK_IMPORTED_MODULE_1__);
/* harmony import */ var _my_lerna_library_another_package__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(/*! @my-lerna-library/another-package */ "../my-lerna-library/packages/another-package/dist/index.bundle.js");
/* harmony import */ var _my_lerna_library_another_package__WEBPACK_IMPORTED_MODULE_2___default = /*#__PURE__*/__webpack_require__.n(_my_lerna_library_another_package__WEBPACK_IMPORTED_MODULE_2__);
/* harmony import */ var my_lerna_library__WEBPACK_IMPORTED_MODULE_3__ = __webpack_require__(/*! my-lerna-library */ "../my-lerna-library/packages/core/dist/index.bundle.js");
/* harmony import */ var my_lerna_library__WEBPACK_IMPORTED_MODULE_3___default = /*#__PURE__*/__webpack_require__.n(my_lerna_library__WEBPACK_IMPORTED_MODULE_3__);
In your my-lerna-library packages:
(1) Replace the another-package package.json file with this code:
{
"name": "#my-lerna-library/another-package",
"private": true,
"version": "1.0.0",
"scripts": {
"dev": "webpack --watch --devtool inline-source-map --mode development",
"build": "webpack --mode production"
},
"dependencies": {
"#my-lerna-library/core": "1.0.0"
}
}
and
(2) Replace the core package.json file with this code:
{
"name": "#my-lerna-library/core",
"private": true,
"version": "1.0.0",
"scripts": {
"dev": "webpack --watch --devtool inline-source-map --mode development",
"build": "webpack --mode production"
}
}
and in the my-lerna-library package.json file, replace workspaces like this:
"workspaces": {
"packages": [
"packages/**"
]
},
Then run the command "yarn run dev". I think this will work for you!
Finally I was able to achieve what I wanted.
Code here: https://github.com/PixelDuck/lerna-webpack
The idea was to set the another-package webpack config to output the library as a global, and in the core package to look for this global variable with:
function isMyLernaLibraryAnotherPackageDefined() {
return typeof myLernaLibraryAnotherPackage !== 'undefined';
}
function testFromAnotherPackage() {
if (isMyLernaLibraryAnotherPackageDefined())
return new myLernaLibraryAnotherPackage.AnotherClass().test();
}
Now in the React app, if I add import '@my-lerna-library/another-package'; then the message and SVG are displayed.
If I comment out this line, the module is not found and nothing is displayed.
If you need a package with optional dependencies you can create a core package together with several feature packages (which are not even listed in peerDependencies). You may explicitly list those feature packages in optionalDependencies.
Also, your core package should be designed so that it does not necessarily require the feature packages but can work correctly with them if they're installed, for example like this: https://stackoverflow.com/a/50841764/14451484.
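As a rough sketch of what that can look like (package and function names below are placeholders, not taken from the question), the core package lists the feature package under optionalDependencies and guards the require:

// core package.json (fragment):
//   "optionalDependencies": { "@my-library/another-package": "^1.0.0" }

// core/src/index.js
let feature = null;
try {
  // may be absent: packages in optionalDependencies are allowed to fail to install
  feature = require('@my-library/another-package');
} catch (err) {
  // feature package not installed; continue without it
}

function runFeatureIfAvailable() {
  return feature ? feature.doSomething() : null;
}

module.exports = { runFeatureIfAvailable };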
The dev build works fine, with my react app on port 3000 and server on port 8080. The front end is able to make requests to the backend and get the response.
But when deploying the production build, the application starts on port 5000 and the static files are rendered. But the front end can't access localhost:8080/api/:id because that port isn't 'on'.
Any help will be appreciated.
You can use dotenv to define custom config variables and cross-env package to define an environment for your react application. First install these packages.
yarn add dotenv && yarn add cross-env --dev
or
npm i dotenv && npm i cross-env --save-dev
You need to create two separate .env config files; name them .env.development and .env.production. Your env file may contain something like this:
.env.development
API_URL="localhost:8080"
.env.production
API_URL="localhost:5000"
Now in your React app's main.js file (or whatever you have named it), just import the dotenv package as follows:
// other imports here
require('dotenv').config({
path: process.env.NODE_ENV === 'production' ? '/path/to/.env.production' : '/path/to/.env.development'
})
Now, in your package.json, change the start and build scripts like this:
{
...
"scripts": {
"start": "cross-env NODE_ENV=development react-scripts start",
"build": "cross-env NODE_ENV=production react-scripts build",
...
}
...
}
Finally, wherever you have been using the API URL:
Example
axios.post(`${process.env.API_URL}/your/path`) // or any other way to join URL and path
.then((response) => {
console.log(response);
})
.catch((error) => {
console.log(error);
});
Hope it helps to solve your issue.
From all your routes, remove localhost:8080 and make the request directly to /api/..., because everything is running under one port, which is 5000. You might need to rebuild your front end after making the changes.
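For example (a sketch; the exact path depends on your routes):

// before: hard-coded host only works while the API sits on its own port
// axios.get(`http://localhost:8080/api/${id}`)

// after: relative URL, served by the same Express app on port 5000
axios.get(`/api/${id}`).then((response) => console.log(response.data));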
I am trying out the modern approach to building JavaScript applications without Grunt or Gulp. I am creating my build utilities using just the scripts keyword in package.json.
It works great, but I ran into a challenge. Is there a good way to create separate vendor.js and app.js bundles without making every dependency explicit in the browserify command (or alternatively passing a list of deps to the browserify command)?
Something better than:
"dependencies": {
"react": "latest",
"react-dom": "latest",
"redux": "latest",
"d3": "latest"
},
"devDependencies": {
"browserify": "latest"
},
"scripts": {
"vendor": "browserify -r react -r react-dom -r redux -r d3 > vendor.js",
"app": "browserify -x react -x react-dom -x redux -x d3 ./app/main.js > app.js"
}
Preferably, I would recycle the information stored in the dependencies keyword. Obviously, I do not want to bundle devDependencies or dependencies not used in my code (even though the latter can be prevented by good maintenance of the requirements).
Yes, it is possible. Whether the solution is very elegant, I'll leave up to you to decide. Basically it boils down to something like the following (incomplete, browserify not yet called) snippet:
"scripts": {
"init": "npm ls -json --depth 0 | jq .dependencies | jq keys[]",
"vendor": "npm run --silent init | sed 's/\\(.*\\)/-r \\1/g' | xargs"
},
The init script is used to extract the dependencies. The vendor script calls this script, and converts it to the input parameters for browserify.
Note 1: I'm using jq to extract information from the dependencies tree.
Note 2: construction of the argument list can also be done in the init script. You will have to provide an environment variable to distinguish between the -r and -x options.
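To complete the snippet, the generated list can be piped into browserify through xargs. One possible wiring, as an untested sketch (it assumes jq is installed and reuses the vendor.js/app.js names from the question):

"scripts": {
  "init": "npm ls --json --depth 0 | jq -r '.dependencies | keys[]'",
  "vendor": "npm run --silent init | sed 's/\\(.*\\)/-r \\1/g' | xargs browserify -o vendor.js",
  "app": "npm run --silent init | sed 's/\\(.*\\)/-x \\1/g' | xargs browserify ./app/main.js -o app.js"
}

Using jq -r outputs the dependency names without quotes, so the -r/-x flags receive plain package names.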