How to keep temporary files between multiple runs in nodejs? - javascript

I have a small node module which generates files.
As it is really slow and will produce the same result for the same input I would like to keep the last compilation together with a control hash.
The question is now where do I have to place the temporary files so it can be easily accessed?
The cache should also work if the main node application which depends on my module restarts.

I'll collect all my comments into an answer.
As far as I know, there is no npm standard for where a module should put its temp files. The best place to put them can depend on how file permissions are configured, what operating system you're running, what permissions the host app runs under, the type of hosting environment, etc.
The logical options are as follows:
In the OS temp directory
In a temp sub-directory below the module directory.
In a configurable directory that the user of the module can specify either via a config argument or via an environment variable.
You can find out where the OS temp directory is with os.tmpdir().
If you use a temp sub-directory below the module, keep in mind that multiple processes can be using the module. If you're putting files in a location that may be shared by multiple processes, you need to generate unique names when the files are supposed to be separate per process, or use appropriate locking when the files are supposed to be shared among processes.
And, don't forget about cleanup maintenance so there's no build-up of temporary files over time.

When does the Webpack bundle occur?

I am learning ES6 modules at this current time. It took me so long to finally understand and be proficient in closures, IIFEs and scope using one script that I was almost upset to find ES6 modules bring about a different, more manageable way to organise modular code into various different scripts and then for a bundler like Webpack to bundle it all back into one (or only a few) scripts.
I get the usual cross-origin error when I use script type="module" and try to run modules from my local file system, which doesn't happen when I run a normal script that simply specifies a src!
Wherever I look, the solution is to use a local host to get round this, which I have done! But at what point does Webpack do its bundling? Is it when I run it on the command line, or when the script is loaded into the browser?
If I install Webpack via npm in my project and set up the configuration, does this mean I wouldn't have to use a local host, because my distribution code at runtime is now in one script file, so it doesn't have to import scripts that aren't on the same URL?
I know it’s only on my local file system, but I cannot request scripts in the same folder when using ES6 modules due to the cross origin policy as I could if just specifying a script src without using modules.
Each time I run Webpack from the command line, it bundles together the latest code taken from the entry point (specified during my configuration).
This means I can use the ES6 module syntax on my entry file without having to use a live server as I never intend to load this JavaScript entry file into the browser. It’s simply to be the controller of all other script exports by importing what I need from them.
Then this script will be the target of Webpack's bundling (the entry point), meaning the only script loaded into the browser is the bundled script, which has no other imports. This avoids the need to run a live server when testing ES6 modules!
If not using Webpack, I would have to run a server because my main script would be importing other scripts and unless they have the same URL (they don’t even have one yet in my file system) then I will get a cross origin error. As soon as I run my local server 127.0.0.1 then it would work. I prefer testing with Webpack.

How to set up dynamic environment configurations in React

My aim is to find a way to dynamically pull in environment-specific config values for various 'tracks' of my React app: development, staging and production. The solution most often prescribed involves the use of environment variables, which I don't find great because:
Some of my config values are sensitive data such as API secret keys, database passwords, etc., and I'd ideally not keep these in plain text, either locally or on a CI/CD system
Having to manually set env vars is error-prone and doesn't scale well (it's a big project with more than 20 config-related key-value pairs to set). It's also difficult to document which env vars need to be set, so it's not a convenient solution for a multi-collaborator team: everyone has to keep track of the list and copy-paste the values for shared API keys, etc. onto their local machines (or worse, hard-code/check them into the source code)
I have tried the following 2 general approaches:
Use node-config - it looks promising as it's light, flexible, and extensible (it allows defining base values in default.js and overriding them with development.js, staging.js, production.js, or with custom env variables). Most importantly, we can store secrets in a remote service (e.g. AWS/GCP Secrets Manager, envkey, etc.). This solution works well for my Node backend, but so far not for the frontend app built on React
Use dotenv (or dotenv-safe, which lets you document the structure of the .env file in a companion .env.example that is checked into source control). This is not my favored approach, as dotenv discourages using multiple .env files for the different environments a project needs. Secondly, I'd likely still have to find another way to feed the env variables into my CI/CD system. Redefining the env vars on the [remote] build system feels like doing the work twice - the first time being in the .env files used for local development.
Both approaches yield a familiar problem: TypeError: fs.readFileSync is not a function. According to this related question, it appears that the underlying issue is that the 'fs' module is not designed to work on the browser (both dotenv and node-config are low level modules that use 'fs' under the hood). If we cannot use fs (or rather, modules that rely on it) on the client side: how do scalable/production-grade React projects typically manage config values in a sane way? I know hashicorp/vault exists but it seems a bit of an overkill as we'd likely have to set up our own infrastructure.
I also wonder if there's any open-source tools out there to solve this common problem...
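For reference, the layering that node-config performs can be sketched by hand like this. This is a simplified illustration of the idea, not the library itself; the environment names and values are made up:

```javascript
// Hand-rolled sketch of node-config-style layering (illustrative only,
// not the actual library): base values from a default layer are
// overridden by the layer matching the current environment.
const defaults = { api: { baseUrl: 'http://localhost:3000' }, timeoutMs: 5000 };
const overrides = {
  staging:    { api: { baseUrl: 'https://staging.example.com' } },
  production: { api: { baseUrl: 'https://api.example.com' } },
};

function loadConfig(env) {
  const layer = overrides[env] || {};
  // merge the top level, then merge nested objects one level deep
  const merged = { ...defaults, ...layer };
  merged.api = { ...defaults.api, ...(layer.api || {}) };
  return merged;
}
```

The real library does a deeper merge and reads the layers from files named after NODE_ENV, but the principle is the same: unset keys fall through to the defaults.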
Neither of the two solutions offered above really met my requirements: first, because I'm using a create-react-app project, I don't have much control over the webpack configuration. Secondly, I'd much prefer not to keep .env files locally (let alone in plain text).
Luckily, I came across https://doppler.com/, a universal secrets-management solution that meets the needs described in the OP:
it's a cloud-based secrets store + manager that comes with a CLI, which allows us to use the same env secrets across the entire pipeline (local development, CICD and production)
projects come loaded with development, staging and production environments, which makes it easy to switch between different flavors of the app
Because Doppler works by injecting environment variables into the runtime, I can run it like so, with yarn:
doppler run -- yarn start
For server environments that need the env vars injected into a bundled app (e.g. the Firebase emulator), first do a 'doppler-injected' build:
doppler run -- yarn build
And then run the emulator as usual:
firebase emulators:start
Using separate dotenv files, for example .env.dev and .env.qa, would be my current approach to a scalable React project. I am assuming you are using webpack and are facing issues fetching the files with fs, since fs is a server-side Node module.
To resolve the TypeError issue, in your webpack config you can use fs to access the current working directory:
const fs = require('fs');
const path = require('path');
// Resolve the app root (realpathSync follows symlinks) and build paths relative to it
const reactAppDirectory = fs.realpathSync(process.cwd());
const resolveReactApp = (relativePath) => path.resolve(reactAppDirectory, relativePath);
You need to copy all your assets and build output into a separate public folder, have a web.config, and resolve its path using the above resolveReactApp('public') in the contentBase key of the devServer section of your webpack config.
You can pass the environment variables using webpack's DefinePlugin or EnvironmentPlugin (the two below are equivalent):
// DefinePlugin: each value must be stringified explicitly
new webpack.DefinePlugin({
'process.env.NODE_ENV': JSON.stringify(process.env.NODE_ENV),
'process.env.DEBUG': JSON.stringify(process.env.DEBUG)
});
// EnvironmentPlugin: shorthand that stringifies the listed variables for you
new webpack.EnvironmentPlugin(['NODE_ENV', 'DEBUG']);
To store secrets, I would suggest checking out Docker secrets and using a containerized architecture for deployment; when the team expands, it'll also be easier to get the project set up. Here's a quick intro on how to use Docker with React, and more info on switching from environment variables to Docker secrets for a better system architecture.

Why are files copied from Asar readonly?

I have an Electron app, and when I build it for distribution, the actual app code and build folder end up in an app.asar file. At runtime, I have to copy certain files from the app.asar onto the user's computer, wherever the user chooses, and then modify them from the code.
The problem is that whenever a file is copied, it becomes read-only, and thus I cannot write to it. Any way to handle this?
I'm running into this issue as well. I think the problem is that only some of the fs methods are patched to work with asar. According to the docs:
With special patches in Electron, Node APIs like fs.readFile and require treat asar archives as virtual directories, and the files in it as normal files in the filesystem.
Therefore, I think the solution is to manually copy the content of files from asar using fs.readFile and then to dump that into the file you want. I will try this today and hopefully post an update with some code.

How to externalize properties in a Windows Store App

I'm working on a Windows Store App (JavaScript/HTML/CSS) that will be deployed directly to devices in our enterprise.
I want to keep the datasources (urls to Restful web APIs) as part of the configuration rather than built into the app itself so that I can set them during deployment (e.g. to set test urls and prod urls).
More generally I want to store text variables in config that is external to the app and can be pulled in by the app somehow.
I thought I could set some environment variables or something but Windows Store Apps can't read them it seems.
Any ideas?
You could certainly make an HTTP request from the app on startup to retrieve a configuration file, but that of course assumes connectivity which may or may not work in your scenario. For a Store-acquired app, this is really the only choice.
In your scenario, however, you'll be doing side-loading through a PowerShell script, correct? (This is implied by installing directly to devices.) In that case, the PowerShell script runs in full trust and has access to the file system during the process. This means the script can easily deploy a configuration file into the app's local appdata folder, which the app then picks up when it runs. The app package should also contain a default configuration file that it copies into that appdata folder if such a file doesn't exist on startup.
The documentation for the Add-AppxPackage cmdlet that performs the install is here: https://technet.microsoft.com/en-us/library/hh856048.aspx.
Another option you might be able to use is to build different versions of your packages for test and production deployment. It is possible to configure the build process in Visual Studio to selectively bring in different versions of a file depending on your build target (e.g. Debug or Release). I have a blog post that describes this technique at http://www.kraigbrockschmidt.com/2014/02/25/differentiate-debug-release-builds-javascript/. This would allow you to package different versions of a configuration file, which you'd then read from the package install location at runtime, or copy to appdata if you wanted to make changes at runtime.
I mention this method for building different packages because it's something that doesn't need you to do anything other than change the build target. It accomplishes what you would do with #ifdef precompiler directives in other languages, which aren't available for JavaScript.

What is a preferred way for installing Karma in Angular/Django project?

I am just starting on integrating AngularJS into my Django project.
After I installed Karma for testing by following the tutorial, I got a bunch of Node.js modules installed in my root project folder.
Should I check all of these files from the node_modules folder into my repo? Or should I ignore them with .gitignore?
Are there alternatives to installing Karma to root or is it required?
I have found that you need to install a particular node module in a folder that encompasses all files that will use it. This is most easily accomplished by putting all node modules in the root folder of your website. This is by design of node's creator, though I'm not sure if he wants it that way or just does not want to change it. Either way, there is no way around this.
As for karma, as it is a node module, it needs to be in a folder that includes all files that will use it; therefore, if your entire website uses it, you're better off putting it in the website's root folder.
Of course, as node is open source, you could go in & change this requirement of node modules so they can be installed anywhere, maybe with a pointer from a file that uses it to that node module.
Only you and your team (and your users) can determine whether to commit or ignore these files, but in general with node_modules: if your users need them, ship them; if only your developers need them, either install them individually on each developer's machine or make another branch for development work. Node also has a way to separate development modules from release modules (devDependencies in package.json), so you could look into that.
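For example, a typical setup records Karma as a development-only dependency in package.json (the versions below are illustrative) and lists node_modules/ in .gitignore, so each developer restores the modules with npm install rather than committing them:

```json
{
  "devDependencies": {
    "karma": "^6.4.0",
    "karma-jasmine": "^5.1.0"
  }
}
```

With this layout, a production install (npm install --production) skips the devDependencies entirely, which is the built-in separation of development and release modules mentioned above.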
