Modify config file during release - javascript

Current situation
We have many clients using our client application software, and each client needs to connect to a different REST endpoint; the base URL is always different.
Currently we're using a config.json file which we manipulate during release. A simple example:
config.json
{
  "endpoint": "http://localhost/api"
}
During startup of our application we're doing an HTTP call to get this file. For further API calls we're using the endpoint provided by the config.json file.
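The startup lookup looks roughly like this (a simplified sketch; startApp stands in for our actual bootstrap code):
// fetch the per-client configuration before the app starts
fetch('/config.json')
  .then(response => response.json())
  .then(config => {
    // all further API calls use config.endpoint as their base URL
    startApp(config.endpoint);
  });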
Desired outcome
What we really want is for this to become part of our application instead of doing the HTTP call. We're using webpack to build our application.
In our dataservice layer we want to do something as follows:
import config from './config';
// use config.endpoint;
config.js
export default {
  endpoint: "http://localhost/api"
};
We can override the config.js file during build. But since we have many clients (around 30) we don't want to build for each client. We just want one build, and to modify the config.js file during release with the correct configuration.
Basically we want webpack to ignore the file during build and copy the file to the output folder + inject it in index.html.
I've done some research and I'm not sure how to solve this issue. Maybe the initial HTTP call isn't that bad after all?
Edit: the endpoint is just an example; we have more client-specific configuration defined in our client app.

OK, this was easier than expected. I simply added a new entry for the config file.
entry: {
  config: "./src/config.ts",
  app: "./src/main.ts"
}
In the UglifyJsPlugin I added an exclude for the config file.
new webpack.optimize.UglifyJsPlugin({
  compress: {
    warnings: false
  },
  exclude: /(config).+/i,
  sourceMap: true
}),
The output is a "readable" config file.
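One caveat: because config is now its own entry, I don't import the module directly from app code (webpack would inline a second copy into the app bundle). A sketch of the pattern I mean, with window.appConfig as an illustrative name, assuming index.html loads config.js before app.js:
// src/config.ts: built as its own entry, stays readable after the build
window.appConfig = {
  endpoint: "http://localhost/api"
};

// dataservice: read the global instead of importing the module
const endpoint = window.appConfig.endpoint;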

Related

Next.js - best way to serve static JS from a node module's "dist" folder

I'm working with an application that uses Tesseract (OCR) to read text from images.
I would like to take some JS files from node_modules/tesseract.js/dist and make them downloadable in the browser.
I know I can just copy the files to ./public and Next.js will serve them statically from there, but then if I update my version of Tesseract, I may need to update those files as well. So maintenance becomes a problem.
My 1st thought is to customize my webpack config to copy the files from node_modules/tesseract.js/dist to ./public/tesseract (or something like that). That would make sure the files get re-copied if I update Tesseract. But I'm not particularly savvy with webpack and haven't figured out how to do that yet.
My 2nd thought was to "proxy" the retrieval of the JS file's content and just make the content available as a "page" in next.js (but this seems super hacktastic).
I feel like this is something that shouldn't be so complicated ... but I've not been able to figure it out myself yet.
Thanks in advance!
Yup agreed, updating your server to serve a node_modules path sounds a bit dangerous.
I personally would just copy over these files with webpack like you mentioned.
Here are the Next.js docs on how to set up a custom webpack config.
next.config.js
const path = require("path");
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  webpack: (config) => {
    // append the CopyPlugin to copy the files to your public dir;
    // note: `to` is resolved against webpack's output path, so an
    // absolute path is used to land in the project's public folder
    config.plugins.push(
      new CopyPlugin({
        patterns: [
          { from: "node_modules/tesseract.js/dist", to: path.resolve(__dirname, "public") },
        ],
      })
    );
    // Important: return the modified config
    return config;
  },
};
I purposefully didn't include the public/tesseract path; I'm not sure whether CopyPlugin will automatically create missing directories if they don't exist at build time.
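As a usage sketch (not from the question): once the files are in public/, you can point tesseract.js at the locally served copies instead of its CDN defaults. The option name below is from tesseract.js v2's createWorker, and the exact file name depends on the version you copied:
import { createWorker } from "tesseract.js";

// point the worker script at the locally served copy
const worker = createWorker({
  workerPath: "/worker.min.js",
});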

Set up webpack to pull JS file from local rather than via HTTP

webpack.config.js pulls remote js for Module Federation.
plugins: [
  new ModuleFederationPlugin({
    remotes: {
      'mfe1': "mfe1@http://xxxxxxxxxx.com/remoteEntry.js"
    }
  })
],
How can I use a local JS file in remotes, or in addition to remotes? I have a simple React library in another folder, with a ./dist/browser/remote-entry.js file in it. I cannot publish it to npm, so I'm trying to load it from local. Would it be something like:
plugins: [
  new ModuleFederationPlugin({
    remotes: {
      'mfe1': "../../myproject/dist/browser/remoteEntry.js"
    }
  })
],
The remotes entry is supposed to be a URL that is accessible at run-time, not build-time. If it were only needed at build-time, that would imply the remoteEntry gets bundled, which defeats the purpose of Webpack Module Federation (WMF for short).
You say:
webpack.config.js pulls remote js for Module Federation.
But I'm not sure what that is supposed to mean. Webpack does not "pull" the remote files at all. It tells the final build where to look, so that when your code (i.e. bundle.js) actually executes, it knows from where to load modules dynamically.
This means that, in order for WMF to work, you still need to serve the file from your web server.
You primarily have two choices:
If you don't want dynamic loading of modules, just build your project without WMF.
If you do want dynamic loading, then you need to tell webpack the remote's URL. Ideally, you can get the actual server address from process.env, which you can provide via dotenv (or through many other means):
webpack.config.js
// ...
module.exports = {
  // ...
  plugins: [
    new ModuleFederationPlugin({
      remotes: {
        'mfe1': `mfe1@${process.env.REMOTE_HOST || 'http://localhost:8080'}/remoteEntry.js`
      }
    })
  ],
  // ...
};
.env
REMOTE_HOST=http://xxxxxxxxxx.com
Also, you might want to consider this article on how to deploy a WMF build.
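If the remote isn't deployed anywhere yet (as in your local-folder scenario), one way to satisfy the run-time requirement is to serve the other project's dist folder over HTTP during development. A sketch using Express (the path, port, and file name are assumptions based on your question):
// serve-remote.js: minimal static server for the locally built remote
const express = require("express");
const path = require("path");

const app = express();
app.use(express.static(path.resolve(__dirname, "../../myproject/dist/browser")));
app.listen(8080, () =>
  console.log("remote entry served at http://localhost:8080/remote-entry.js")
);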

How to import a different bundle for server and client using Rollup in Sapper?

I'm creating a tool which launches a server and fetches content from the server and displays it in the browser. I'm trying to integrate it with frontend frameworks. One of those frameworks is Sapper/Svelte. The problem is that my bundle contains imports to built-in modules which are not needed by the browser, and also not resolved by the browser, which in turn throws an error.
I think what I need to do is make my tool isomorphic and split it into two bundles: one for the server (server.js), and one for the browser (client.js) which doesn't contain the imports to built-in modules. I have a good idea of how I can split the code, using code splitting in Rollup, but what I don't know is how I tell Sapper to use server.js for the server and client.js for the client.
How can I bundle my module so when it's consumed by other applications it knows which one to use for the server and which one to use for the browser? Is this something I can do in my module or do I have to also configure this in the framework it's being used in?
I discovered that @rollup/plugin-node-resolve has a flag to instruct Rollup to use an alternative bundle specified in the browser property in the module's package.json.
As Sapper is configured to create a bundle for both the client and the server, it already has this flag in its rollup.config.js.
Sapper
// rollup.config.js
export default {
  client: {
    // ...
    plugins: [
      // ...
      resolve({
        browser: true, // <-- flag
        dedupe: ['svelte'],
        exclude: ['node_modules/**']
      })
      // ...
No changes needed here.
Your NPM Module
You need to create two bundles. One for the server and one for the browser. I felt it was easier to create two different entry points in Rollup for this. It might be possible to use the same entry point and use conditional logic to output a different bundle (something I'm not familiar with).
// rollup.config.js
export default [
  {
    input: 'src/server.js',
    output: {
      file: 'dist/server.js',
      format: 'cjs'
    }
  },
  {
    input: 'src/browser.js',
    output: {
      file: 'dist/browser.js',
      format: 'cjs'
    }
  }
];
Now we add the path to the browser-specific bundle in package.json.
// package.json
{
  "main": "dist/server.js",
  // ...
  "browser": "dist/browser.js"
}
Setting this up means that when Sapper starts it will use a different bundle for the server and the client. When you create the separate bundles you need to structure them so that they work independently of each other. In my case, I isolated all server functionality to the server-specific bundle and excluded all dependencies like http, fs and path from the browser-specific bundle.
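To illustrate, the two entry points can expose the same API with different implementations (fetchContent is a hypothetical example, not taken from my actual module):
// src/server.js: Node implementation, free to use built-ins
import http from 'http';

export function fetchContent(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      let body = '';
      res.on('data', chunk => (body += chunk));
      res.on('end', () => resolve(body));
    }).on('error', reject);
  });
}

// src/browser.js: same API, no Node built-ins
export function fetchContent(url) {
  return fetch(url).then(res => res.text());
}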

Webpack externals help needed

I have created an application (using Vue and webpack) that needs two URLs to access REST services. These URLs are different per deployment, so I would really like to have an external configuration file (maybe JSON) that stores those URLs.
I did some research and came upon webpack externals. I added externals to my configuration file:
module.exports = {
  // ...
  externals: {
    tryout: path.resolve(__dirname, "./test.js")
  }
}
test.js is really simple:

export default { url1: 'https://bla', url2: 'https://bla2' }
From the Vue file where I need this config, I call:
import Tryout from 'tryout'
console.log(Tryout)
This causes my application to show a blank screen. There are no errors in the browser console nor in the npm console.
I'm not sure what is happening. A couple of questions:
I am using the dev server and I'm not sure where to place the test.js file. I placed it at path.resolve(__dirname, "./test.js"), so it is in my build directory...
Should I load it in my index.html like an external library?
Does anybody have a step-by-step example? I couldn't find one online.
This should be simple, but I have no clue, since I am not getting any error or message...
Any help would be really appreciated!
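For context, webpack externals don't load a file for you: a string value maps the import specifier to a global that must already exist at runtime, typically provided by a script tag. A minimal sketch of that pattern (the global name Tryout is illustrative):
// webpack.config.js: map the import to a runtime global
module.exports = {
  // ...
  externals: {
    // `import Tryout from 'tryout'` resolves to window.Tryout at runtime
    tryout: 'Tryout'
  }
};

// test.js: loaded via <script src="test.js"></script> in index.html,
// before the app bundle, so the global exists when the bundle runs
window.Tryout = { url1: 'https://bla', url2: 'https://bla2' };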

JavaScript Single-Page-App - how do I "distribute" a JSON?

I'm currently using the Aurelia framework and want to load a big JSON file into my application.
The problem is, I can't figure out how to get the JSON file to appear in the "dist" folder that Chrome sees, so that the script is able to find it.
In short, I want to do this:
var request = new XMLHttpRequest();
var jsonString = request.open("GET", "file://../core/data/5e-SRD-Monsters.json", false);
...and yes, the path is correct, but the folder "data" and its content won't appear in Chrome's debug sources.
Do I have to include the json via gulp somehow?
Your main question is "how to get the json file to appear in the 'dist' folder". As also mentioned in the comments, that is simply a matter of including it in the build.
For a skeleton jspm project do the following:
Open the ~/build/export.js file
Include the file, or folder, containing the .json file in the first 'list' section
This looks something like:
module.exports = {
  'list': [
    'index.html',
    'config.js',
    'favicon.ico',
    'LICENSE',
    'jspm_packages/npm/bluebird@3.4.1/js/browser/bluebird.min.js',
    'jspm_packages/system.js',
    'jspm_packages/system-polyfills.js',
    'jspm_packages/system-csp-production.js',
    'styles/styles.css',
    'core/data/5e-SRD-Monsters.json'
  ],
  // ...
};
Here is an example of where to put it.
Important: Considering you're talking about a 'dist' folder, I am assuming you use the skeleton with jspm. The process is totally different when you're building an Aurelia app with the CLI, the webpack skeleton, or the starter kit.
Now you've got the .json file in the dist folder. But using XMLHttpRequest to load a file:// URL isn't exactly the recommended approach. As also mentioned in the comments, you should ideally load it with an HTTP request, not a file request.
Let's take that advice. All you need to do is add the aurelia-fetch-client to your project, and then you can simply do something like this:
import { HttpClient } from 'aurelia-fetch-client';

export class App {
  get_stuff() {
    let http = new HttpClient();
    // assuming the file is in /dist/core/data/,
    // read it through a promise + a relative path;
    // fetch resolves to a Response, so unwrap it with .json()
    http.fetch('./core/data/5e-SRD-Monsters.json')
      .then(response => response.json())
      .then(data => console.log(data));
  }
}
Now this uses HTTP rather than file://, which eliminates any permission issues that might occur, such as lack of direct access to the filesystem.
