I'm working on a Windows Store App (JavaScript/HTML/CSS) that will be deployed directly to devices in our enterprise.
I want to keep the data sources (URLs to RESTful web APIs) as part of the configuration rather than built into the app itself, so that I can set them during deployment (e.g. to set test URLs and prod URLs).
More generally I want to store text variables in config that is external to the app and can be pulled in by the app somehow.
I thought I could set some environment variables or something, but it seems Windows Store Apps can't read them.
Any ideas?
You could certainly make an HTTP request from the app on startup to retrieve a configuration file, but that of course assumes connectivity which may or may not work in your scenario. For a Store-acquired app, this is really the only choice.
In your scenario, however, you'll be doing side-loading through a PowerShell script, correct? (This is implied by installing directly to devices.) In that case, the PowerShell script runs in full trust and has access to the file system during the process. This means the script can easily deploy a configuration file into the app's local appdata folder, which the app then picks up when it runs. The app package should also contain a default configuration file that it copies into that appdata folder if such a file doesn't exist on startup.
The documentation for the Add-AppxPackage cmdlet that does the install is here: https://technet.microsoft.com/en-us/library/hh856048.aspx.
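To illustrate, here is a minimal sketch of what the app-side startup logic could look like, assuming a hypothetical config.json that is shipped in the package as a default and optionally overwritten in local appdata by the deployment script:

// Runs at app startup (e.g. in default.js); "config.json" is a hypothetical name.
var appData = Windows.Storage.ApplicationData.current;
var installedLocation = Windows.ApplicationModel.Package.current.installedLocation;

function loadConfigAsync() {
    // Prefer a config deployed to local appdata by the install script.
    return appData.localFolder.tryGetItemAsync("config.json").then(function (file) {
        if (file) {
            return file;
        }
        // Otherwise copy the default config from the package into appdata.
        return installedLocation.getFileAsync("config.json").then(function (defaultFile) {
            return defaultFile.copyAsync(appData.localFolder);
        });
    }).then(function (file) {
        return Windows.Storage.FileIO.readTextAsync(file);
    }).then(function (text) {
        return JSON.parse(text);
    });
}

loadConfigAsync().then(function (config) {
    // e.g. config.apiBaseUrl now points at the test or prod REST endpoint
});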
Another option you might be able to use is to build different versions of your packages for test and production deployment. It is possible to configure the build process in Visual Studio to selectively bring in different versions of a file depending on your build target (e.g. Debug or Release). I have a blog post that describes this technique at http://www.kraigbrockschmidt.com/2014/02/25/differentiate-debug-release-builds-javascript/. This would allow you to package different versions of a configuration file into the package, which you'd then read from the package install location at runtime, or copy to appdata if you wanted to make changes at runtime.
I mention this method for building different packages because it doesn't require you to do anything other than change the build target. It accomplishes what you would do with #ifdef preprocessor directives in other languages, which aren't available for JavaScript.
My aim is to find a way to dynamically pull in environment-specific config values for various 'tracks' of my React app: development, staging and production. The solution most often prescribed involves the use of environment variables, which I don't find great because:
Some of my config values are sensitive data like API secret keys, database passwords, etc., and I'd ideally not keep these in plain text either locally or on a CI/CD system
Having to manually set env vars is error-prone and doesn't scale well (it's a big project with more than 20 config-related key-value pairs to set). It's also difficult to document which env vars need to be set, so it's not a convenient solution for a multi-collaborator team: everyone has to keep track of the list and copy-paste the values for shared API keys onto their local machine (or worse, hard-code or check them into the source code)
I have tried the following 2 general approaches:
Use node-config - it looks promising as it's light, flexible, and extensible (it allows defining base values in default.js and overriding them with development.js, staging.js, production.js, or with custom env variables; see the sketch after this list). Most importantly, we can store secrets in a remote service (e.g. AWS/GCP Secrets Manager, envkey, etc.). This solution works well for my Node backend, but so far not for the frontend app built on React
Use dotenv (or dotenv-safe, which lets you document the structure of the .env file in a separate .env.example that is checked into source control). This is not my favored approach, as dotenv discourages using multiple .env files for the environments our project needs. Secondly, I'd likely still have to find another way to feed the env variables into my CI/CD system. Redefining the env vars on the [remote] build system feels like doing the work twice - the first time being in the .env files used for local development.
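For illustration, a minimal sketch of the node-config layering mentioned above (the keys and values are hypothetical; config/{NODE_ENV}.js overrides config/default.js):

// config/default.js: base values shared by all environments
module.exports = {
  api: { baseUrl: 'http://localhost:4000' },
  db: { password: null } // overridden per environment or via a secrets manager
};

// config/production.js: overrides only what differs
// module.exports = { api: { baseUrl: 'https://api.example.com' } };

// anywhere in the Node backend:
// const config = require('config');
// const baseUrl = config.get('api.baseUrl');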
Both approaches yield a familiar problem: TypeError: fs.readFileSync is not a function. According to this related question, the underlying issue is that the 'fs' module is not designed to work in the browser (both dotenv and node-config are low-level modules that use 'fs' under the hood). If we cannot use fs (or rather, modules that rely on it) on the client side: how do scalable, production-grade React projects typically manage config values in a sane way? I know hashicorp/vault exists, but it seems like overkill as we'd likely have to set up our own infrastructure.
I also wonder if there are any open-source tools out there that solve this common problem...
Neither of the two solutions offered above really met my requirements: first, because I'm using a create-react-app project, I don't have much control over the webpack configuration. Secondly, I'd much prefer not to keep .env files locally (let alone in plain text).
Luckily, I came across https://doppler.com/, a universal secrets management solution that meets my needs as described in the OP:
it's a cloud-based secrets store + manager that comes with a CLI, which allows us to use the same env secrets across the entire pipeline (local development, CI/CD and production)
projects come loaded with development, staging and production environments, which makes it easy to switch between different flavors of the app
Because Doppler works by injecting environment variables into the runtime, I can run it like so, with yarn:
doppler run -- yarn start
For server environments that need the env vars injected into a bundled app first (e.g. the Firebase emulator), do a 'doppler-injected' build:
doppler run -- yarn build
And then run the emulator as usual:
firebase emulators:start
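For completeness, here is a minimal sketch of how the injected variables are consumed in the React code; the variable names are hypothetical, and with create-react-app only variables prefixed with REACT_APP_ are exposed (and they are inlined at build time, which is why the 'doppler run -- yarn build' step above matters):

// src/config.js: centralizes access to the injected variables
const config = {
  apiUrl: process.env.REACT_APP_API_URL,
  apiKey: process.env.REACT_APP_API_KEY,
};

export default config;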
Using separate dotenv files, for example .env.dev and .env.qa, would be my current approach to a scalable React project. I am assuming you are using webpack and are facing issues fetching the files with fs, since fs is a Node server-side module.
To resolve the TypeError, in your webpack config (which runs in Node) you can use fs to resolve paths from the current working directory:
const fs = require('fs');
const path = require('path');
const reactAppDirectory = fs.realpathSync(process.cwd());
const resolveReactApp = (relativePath) => path.resolve(reactAppDirectory, relativePath);
You need to copy all your assets and build output into a separate public folder, include a web.config there, and resolve its path using the above resolveReactApp('public') in the contentBase key of the devServer section of your webpack config.
You can pass the environment variables using webpack's DefinePlugin or EnvironmentPlugin:
new webpack.DefinePlugin({
  'process.env.NODE_ENV': JSON.stringify(process.env.NODE_ENV),
  'process.env.DEBUG': JSON.stringify(process.env.DEBUG)
});

new webpack.EnvironmentPlugin(['NODE_ENV', 'DEV']);
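As a rough sketch of tying the per-environment dotenv files to DefinePlugin (assuming dotenv is installed and hypothetical .env.dev / .env.qa files at the project root):

// webpack.config.js
const webpack = require('webpack');
const dotenv = require('dotenv');

module.exports = (env = {}) => {
  // Select the file for the requested environment, e.g. `webpack --env target=qa`.
  const target = env.target || 'dev';
  const parsed = dotenv.config({ path: `.env.${target}` }).parsed || {};

  return {
    // ...entry, output, devServer, etc.
    plugins: [
      // Expose each key from the chosen .env file as process.env.KEY in the bundle.
      new webpack.DefinePlugin(
        Object.fromEntries(
          Object.entries(parsed).map(([key, value]) => [
            `process.env.${key}`,
            JSON.stringify(value),
          ])
        )
      ),
    ],
  };
};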
To store secrets, I would suggest checking out Docker secrets and using a containerized architecture for deployment; further, when the team expands, it'll be easier to get the project set up as well. Here's a quick intro on how to use Docker with React, and more info on switching from environment variables to Docker secrets for a better system architecture.
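If you go the Docker secrets route, here is a minimal sketch of reading one on the Node side (the secret name db_password is hypothetical; Docker mounts secrets as files under /run/secrets):

// server-side only: read a Docker secret mounted at /run/secrets/<name>
const fs = require('fs');

function readSecret(name) {
  return fs.readFileSync(`/run/secrets/${name}`, 'utf8').trim();
}

const dbPassword = readSecret('db_password');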
I'm building an SPA with Vue, serving it with Node.js, and wrapping it in a Docker container.
Here is the problem: I'm trying to stick to the twelve-factor app methodology, which says configuration should be kept in env variables.
Vue.js provides configs for different environments in the config folder. But according to the twelve-factor app, config should not live in files grouped by environment.
In a twelve-factor app, env vars are granular controls, each fully orthogonal to other env vars. They are never grouped together as “environments”, but instead are independently managed for each deploy.
So how can I access nodejs environment variables in VueJS app?
EDIT:
Thanks for the answers.
The whole idea is to change, for example, the API URL at run time by providing a different env variable. If I commit the config file with the API URL, I would have to rebuild the container on every commit and deploy it just for this small change.
I could also have an API access key that differs between dev and prod.
I'm looking for the best possible way to do this kind of thing in an SPA.
SPA applications nowadays usually go through a build step. This means compiling all of your files into (close to) one dist file plus an index.html that can be served statically. This creates a clear separation between front-end (VueJS) and backend (NodeJS). The index.html and js files themselves are nonetheless still static files.
This is usually what you want, since you can scale server and client independently: serve the static files, say, through S3 + CDN and run your Node.js server independently.
I think what you want is a way to pass runtime configuration to the client. I wouldn't get too caught up on the details of actually sharing the env vars per se.
In your case, I see two possible solutions:
1) Implement an API to expose whitelisted env vars from your server - you can think of this as a /config endpoint (sketched below)
2) Render the index.html dynamically via Node.js with something like EJS, with the env vars prepopulated - you'll have more coupling between frontend and server, but you could extend this to much more than env vars and, say, prepopulate the frontend with prefetched data.
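Here is a minimal sketch of option 1, assuming Express (any Node framework works) and hypothetical variable names:

// server.js
const express = require('express');
const app = express();

// Whitelist the env vars that are safe to expose to the browser.
const CLIENT_CONFIG_KEYS = ['API_URL', 'FEATURE_FLAGS'];

app.get('/config', (req, res) => {
  const config = {};
  for (const key of CLIENT_CONFIG_KEYS) {
    config[key] = process.env[key];
  }
  res.json(config);
});

app.use(express.static('dist')); // the built Vue app
app.listen(process.env.PORT || 3000);

// On the Vue side, fetch it once at startup:
// const config = await fetch('/config').then((r) => r.json());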
Regardless of how you do it, you can consider this runtime configuration for the frontend. It should not be fixed at build time, since otherwise you may expose sensitive data in static files, and it is not always guaranteed that you have all the data at that point.
The way you access the env variables should be the same no matter which OS you are using. The way you set them is different, though. On Windows I would set an environment variable like so:
set PASSWORD=somepassword
And then in the code I can access this variable by doing the following
var pw = process.env.PASSWORD;
You should be able to use this the same way in VueJS.
EDIT:
If you use docker-compose you can set the endpoint on the fly using environment variables too. Then whenever you run docker-compose up, the endpoint will be updated with the current value of your environment variable. In your shell, update the API endpoint to whatever you want it to be:
set API_URL=http://my-api-endpoint
Then in the docker-compose.yml you would access this value like so
version: '3'
services:
  app:
    image: 'myapp'
    environment:
      - API_URL=${API_URL}
You would still access the variable in your code using process.env.API_URL as I mentioned in my example above.
Our devops team will use https://github.com/tinou98/vue-12factor.
However, it's really a big question mark why Vue.js, as a mature frontend/SPA framework, doesn't support this out of the box.
We have been building React apps for more than 6 years with built-in support for externalizing an env.js file (create-react-app).
I want to make a portable app using HTML, CSS, JS and similar languages that doesn't need any installation and can be accessed via a browser.
The app should be able to access the file system and create, write and delete files.
The required files will be on the local machine.
I have tried
Applets, but the performance is too inconsistent and depends on the browser.
I have also tried using Electron, but the end result needs installation (correct me if I am wrong).
I am open to all suggestions
Electron apps can be portable. Copy the folder produced by the Electron build onto a pen drive and execute the main .exe file from there. Everything should work.
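For what it's worth, here is a minimal sketch of producing such a folder with the electron-packager package (the app and folder names are hypothetical):

// build-portable.js: run with `node build-portable.js`
const packager = require('electron-packager');

packager({
  dir: '.',              // the Electron project to package
  name: 'MyPortableApp',
  platform: 'win32',
  arch: 'x64',
  out: 'release',        // the resulting folder can be copied to a pen drive
  overwrite: true,
}).then((paths) => {
  console.log('Portable build written to:', paths.join(', '));
});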
I have one code base for both Web and NodeWebkit (NW) application.
I use the following stack:
- React
- Hapi
- Sequelize
- Windows environment
The web version of the application uses MySQL, while the NW version uses SQLite. It all works fine. I have a config file that builds the application for whichever target I need (web or NW).
The problem I face now is how to deploy the NW application. The idea is to provide the NW application to a client, who will open it by clicking an icon.
Since I use Node for the NW version, and the application uses many modules stored in node_modules, I face the challenge of how to pack it all up.
My idea is to make a Windows installer. The user will click it, the installer will extract all files to the destination, and it will also create an icon on the user's desktop to run the app.
The problem is the Windows file name length limitation. Inside node_modules there are many subdirectories that simply violate it. I can't even copy the node_modules folder; I can't even delete it. Well, sure, I can copy it if I zip it... or manually remove the long folders.
I have not yet started working on the installer, but I suspect I will hit a wall with this approach.
Does anyone have an idea how to make this deployment?
How can I integrate NPM3 in NW?
My plan now is to make a Windows installer. That installer will install the application files normally. The node_modules folder will be zipped beforehand and placed inside the installer, which will then simply unzip it to the destination folder.
I will post my progress here.
Some update here.
The main issue here was the depth of node_modules. I have many modules in node_modules, and after some thinking I realized there is a simple distinction: some modules are server-side modules, while others are used by React.
And since webpack already creates huge bundle files in which all of those front-end modules are included, I simply do not need them at all.
So I removed all front-end modules (Babel modules, react-*) and left only the server-side ones (Hapi, Sequelize...). A miracle happened: the application ran, and startup was much faster.
I am going to use Inno Setup to make a manifest file, and it should be good to go.
I am still not out of the danger zone, as a developer might need a server-side module with a deeply nested dependency tree. But I will think about that if it happens.
More to follow...
Actually, in Node.js you can do the following:
1. Create another folder inside your project folder, for example "server_modules"
2. In that folder, create another package.json file and install any modules needed for the server there
3. All these modules will be accessible like normal node_modules using require('module_name'), and you can delete the "server_modules" folder when you package your desktop version if you don't need it
Note: this approach is used by some developers to achieve microservices in Node.js, but it is useful in your case too.
I have several Chrome Apps that share various assets (CSS, JavaScript, and the like), but it seems that all of the constituent files are required to be in the app folder. I don't want to put these files on a server, because I want the app to be entirely self-contained. I tried OS X aliases, but the Chrome system didn't recognize them in <script> elements.
Obviously, I don't want to maintain multiple copies of these files, as some of them change often during development.
Any ideas short of writing a preprocessor that's run every time a file changes? It would use a file called something like files.json that lists the assets not already in the folder or one of its subfolders.
We currently have experimental support for "shared modules", where one extension or app can depend on a set of others. The dependencies just provide files that can be loaded in the apps/extensions that depend upon them; they cannot have any permissions or features (like a background page) of their own. At install time for an app/extension that depends on shared modules, we will automatically download and install any missing dependencies from the Chrome Web Store, as well as remove them later if you uninstall all apps/extensions that depend on them.
Right now the feature is only available in the dev channel of Chrome, but we intend to fully support it once we've had a chance to get developer feedback. If you want to try it out, grab a copy of the Chrome dev channel (or Canary).
In the manifest.json for an extension that is just a bundle of files you want to share:
{
  ...
  "export": {
    "resources": [ "foo.js", "bar.js" ]
  },
  ...
}
In the manifest.json for an extension/app that wants to depend on the above:
{
  ...
  "import": [{ "id": "<id of dependency goes here>" }],
  ...
}
See the test data files in this codereview for more examples:
https://codereview.chromium.org/13971005
(Sorry, we don't have good documentation for this yet; we will eventually.)
Give it a try and send some feedback to extensions-dev@chromium.org or apps-dev@chromium.org.
Normally such things are done via a package manager. You can use Bower to add a local git dependency. Or, if you simply want to copy files into your folder every time the target files change, you can use the Grunt task runner with a watch task (a sketch follows below).
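A minimal sketch of that Grunt setup, assuming grunt-contrib-copy and grunt-contrib-watch; the shared source folder and destination are hypothetical:

// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    copy: {
      shared: {
        expand: true,
        cwd: '../shared/',   // the common CSS/JS assets
        src: ['**/*'],
        dest: 'lib/',        // inside this app's folder
      },
    },
    watch: {
      shared: {
        files: ['../shared/**/*'],
        tasks: ['copy:shared'],
      },
    },
  });

  grunt.loadNpmTasks('grunt-contrib-copy');
  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.registerTask('default', ['copy:shared', 'watch']);
};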
When I said "writing a preprocessor" I overstated what would need to be done. jusio's solution I guess works, but I did something even simpler, with this script run from inside BBEdit:
#! /bin/sh
cd /Users/marc/Documents/Dropbox/dev/chrome
rsync -vrt lib NoteTree
open '/Applications/Chrome Apps.localized/Default nnlinebecgjceggljgcnfploamgnjjhl.app'
This copies the changed files and then invokes the Chrome App. If it's already running, which it usually is during development, I just right-click and choose Reload App. It's a very quick edit-and-test cycle.
(Explanation: /Users/marc/Documents/Dropbox/dev/chrome is the parent folder for my development, subfolder lib contains the common files, and NoteTree is the app I'm currently working on.)