Referencing Tailwind config variables (CommonJS) on the client - javascript

Tailwind's documentation states that you can reference configuration values inside your JavaScript code via the resolveConfig helper. So, for example, the code on the client would look something like this:
import resolveConfig from 'tailwindcss/resolveConfig'
import tailwindConfig from './tailwind.config.js'
const fullConfig = resolveConfig(tailwindConfig)
fullConfig.theme.width[4] // => '1rem'
This answer explains how this can be done with Vite (I'm using Webpack). We're processing Tailwind via the PostCSS Webpack loader, like this:
{
  loader: 'postcss-loader',
  options: {
    postcssOptions: {
      plugins: [
        require('tailwindcss')
      ]
    }
  }
}
This approach requires importing your Tailwind configuration file on the client side. However, my Tailwind configuration file lives outside my client source folder and is written in CommonJS syntax (e.g., I'm exporting it via module.exports). Attempts to import it as an ECMAScript module therefore fail (in other words, we export our configuration using module.exports rather than export default).
Does anyone know a fix for this? For example, is it possible to load a CommonJS Tailwind config file on the client, or would I have to convert my entire build process to ESM?
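Concretely, the kind of setup I'm hoping is possible looks something like this (a sketch only; the alias name and relative path are placeholders, and it assumes webpack's usual CommonJS interop, which exposes module.exports as the default import):

// webpack.config.js (sketch)
const path = require('path');

module.exports = {
  resolve: {
    alias: {
      // point a stable specifier at the CommonJS config living outside the client source tree
      'tailwind-config': path.resolve(__dirname, '../tailwind.config.js'),
    },
  },
};

// client code (sketch)
import resolveConfig from 'tailwindcss/resolveConfig';
import tailwindConfig from 'tailwind-config'; // module.exports arrives as the default import

const fullConfig = resolveConfig(tailwindConfig);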

Related

How can I change in Rollup.js an import module of a dependency package to a local file replacing that module?

I have a JavaScript project that must be bundled using Rollup.js which depends on a package A which in turn depends on a package B:
"mypackage" ---import--> "A" ----import----> "B"
Let's say that my package imports a function "connect" from package A, which in turn imports a "connect" function exported by module B. Something like:
//mypackage index.js
import { connect } from 'A'
//A index.js
import { connect } from 'B'
//B index.js
export function connect() {}
Since my package requires a bundled version of package B (let's say "B.bundle.js"), how can I configure Rollup.js so that every dependency of my project that requires B (A in this case) uses my local bundled version instead (i.e. B.bundle.js, which of course exports the "connect" function too)?
When Rollup.js creates the bundled version of my project, I would like to achieve something like the following:
//A index.js after being processed by Rollup
import { connect } from './src/B.bundle.js'
Is something like this possible with Rollup or with a plugin? Sorry for the question, but I'm new to Rollup and bundling in general.
I solved this issue using some configuration in my package's package.json and the Rollup plugin @rollup/plugin-node-resolve.
In the package.json of my package I added the browser field, which specifies how modules should be resolved when my package is used in a browser context. From the npm docs on the browser field of package.json:
If your module is meant to be used client-side the browser field should be used instead of the main field. This is helpful to hint users that it might rely on primitives that aren't available in Node.js modules. (e.g. window)
So, considering the example provided in the original question, the npm package contains something like this:
{
  "name": "mypackage",
  "version": "1.5.1",
  "description": "A brand new package",
  "main": "index.js",
  "browser": {
    "B": "./B.bundle.js"
  }
}
This means that when mypackage is used in the browser context, imports of the module B will load from the file located at "./B.bundle.js".
Now, with Rollup I need to specify that the bundle I am creating is intended for the browser context. The plugin that handles imports of node modules is @rollup/plugin-node-resolve. There is an option in this plugin that specifies that the context is the browser. From the plugin documentation on the browser option:
If true, instructs the plugin to use the browser module resolutions in package.json and adds 'browser' to exportConditions if it is not present so browser conditionals in exports are applied. If false, any browser properties in package files will be ignored. Alternatively, a value of 'browser' can be added to both the mainFields and exportConditions options, however this option takes precedence over mainFields.
So in my Rollup config file I have something like:
// rollup.config.js
import commonjs from "@rollup/plugin-commonjs";
import resolve from "@rollup/plugin-node-resolve";
import nodePolyfills from "rollup-plugin-node-polyfills";

export default {
  input: "index.js",
  output: {
    file: "dist/mypackage.bundle.js",
    format: "es",
  },
  plugins: [
    nodePolyfills(),
    resolve({
      browser: true, // <- tells Rollup to use browser module resolution
    }),
    commonjs(),
  ],
};
The answer from @vrajpaljhala seems feasible for the purpose, but IMHO @rollup/plugin-replace is a somewhat tricky and raw approach to the problem, because it involves direct replacement of strings surrounded by "". We may face some very hard-to-discover errors if the package name we are going to replace is a common word that can appear as a string in the code outside the context of an import statement.
We had the exact same requirement, and we resolved it using the @rollup/plugin-replace package.
Basically, our project uses a monorepo structure, but we are not using any tools like Lerna or workspaces to manage it. So we have two packages which are interdependent: our ui-kit package uses the icons package for different icons, and the icons package uses the Icon component from the ui-kit package. We just replaced the imports of ui-kit with a local path using the following:
import replace from '@rollup/plugin-replace';
...
...
export default {
  ...
  ...
  plugins: [
    replace({
      'ui-kit': JSON.stringify('../../../../src/components/Icon'),
      delimiters: ['"', '"'],
    }),
  ]
  ...
  ...
}
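For completeness, a narrower way to express the same rewrite (not part of the original answers, just a sketch) is @rollup/plugin-alias, which only touches module specifiers rather than every quoted string, so the accidental-match risk described above does not apply:

// rollup.config.js (sketch; the replacement path is the same placeholder as above)
import alias from '@rollup/plugin-alias';

export default {
  // ...
  plugins: [
    alias({
      entries: [
        // only import specifiers matching 'ui-kit' are rewritten
        { find: 'ui-kit', replacement: '../../../../src/components/Icon' },
      ],
    }),
  ],
};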

TypeScript module system config vs Webpack library type

I'm working on my first TypeScript library (it's actually even my first JavaScript library) which is used in the front-end. In essence, it should expose a function which receives a DOM element and adds another DOM element to it as a child.
I would like to use Webpack to bundle the library, and while configuring it and TypeScript I stumbled across module systems, which are confusing me.
In tsconfig.json I can define which module system should be used in the compiled code, if I understand correctly:
{
  "compilerOptions": {
    "module": "es6"
  }
}
And in the webpack.config.js I am able to set a desired target for my library using output.library.type, where I can again specify a module system:
module.exports = {
  output: {
    library: {
      name: 'my-lib',
      type: 'umd',
    }
  }
};
I only need my library to be installable via npm/yarn:
$ yarn add my-lib
And consumable via an import statement like this:
import { myFunc } from 'my-lib';
So far so good, with these settings it seems to do what I want. But I don't understand what I am doing here. Hence the questions: What is the difference between the two module system configuration options (the one in the TypeScript config and the one in the Webpack config)? And what settings are appropriate for my use case?
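Restating the two snippets side by side, with a note on which stage each option seems to apply to (a sketch only; the annotations are my reading of the tooling, not anything authoritative):

// tsconfig.json (sketch): "module" controls the module syntax the TypeScript
// compiler emits for each compiled file, i.e. what webpack receives as input.
// { "compilerOptions": { "module": "es6" } }

// webpack.config.js (sketch): output.library.type controls the module format
// of the final bundle that consumers of the published package load.
module.exports = {
  output: {
    library: {
      name: 'my-lib',
      type: 'umd', // format of the published bundle, independent of the tsc setting
    },
  },
};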

What's the recommended way to import libraries inside webpack.config.js?

I have always used require calls inside webpack.config.js to import libraries for my build workflow. However, recently I worked on a project where import is used.
Generally, it works fine, but it seems that some supporting tools that load webpack.config.js cannot import it when import is used inside it, failing with the error:
Cannot use import statement outside a module.
As a test, I changed the config to run through Babel and it worked, though it looks like I would be fine just using good old require, and the official documentation still uses it.
Should I roll back to require to avoid the caveats, or can I use import without problems if I pass the config through Babel and rename it to webpack.config.babel.js?
After some research, I arrived at a solution: transpile the webpack config before passing it to the library that failed to load it (eslint-import-resolver-webpack).
So I don't load webpack.config.js directly, but via the Babel transpiler:
const path = require('path');
const babel = require('@babel/core'); // transformFileSync lives in @babel/core

const webpackConfigTranspiled = babel.transformFileSync(
  path.resolve(__dirname, './webpack.config.js'),
).code;
Then pass the transpiled content to the library/tool:
settings: {
  'import/resolver': {
    webpack: webpackConfigTranspiled,
  },
  'css-modules': {
    basePath: 'src',
  },
}

Dynamic loading of external modules in webpack fails

I am trying to set up the following architecture: a core React application that gets built with some basic functionality, and the ability to load additional React components at runtime. These additional React components can be loaded on-demand, and they are not available at build time for the core application (so they cannot be included in the bundles for the core application, and must be built separately). After researching for some time, I came across Webpack Externals, which seemed like a good fit. I am now building my modules separately using the following webpack.config.js:
const path = require('path');
const fs = require('fs');

process.env.BABEL_ENV = 'production';
process.env.NODE_ENV = 'production';

const appDirectory = fs.realpathSync(process.cwd());
const resolveApp = relativePath => path.resolve(appDirectory, relativePath);

module.exports = {
  entry: './src/MyModule.jsx',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'MyModule.js',
    library: 'MyModule',
    libraryTarget: 'umd'
  },
  externals: {
    "react": "react",
    "semantic-ui-react": "semantic-ui-react"
  },
  module: {
    rules: [
      {
        test: /\.(js|jsx|mjs)$/,
        include: resolveApp('src'),
        loader: require.resolve('babel-loader'),
        options: {
          compact: true,
        },
      }
    ]
  },
  resolve: {
    extensions: ['.wasm', '.mjs', '.js', '.json', '.jsx']
  }
};
I took a look at the generated MyModule.js file, and it looks correct to me.
Now, in my core app, I am importing the module as follows:
let myComponent = React.lazy(() => import(componentName + '.js'));
where componentName is a variable that matches the name of my module, in this case "MyModule". The name is not known at build time, and the file is not present in the src folder at build time. To avoid errors from webpack when building this code with an unknown import, I have added the following to my webpack.config.js for the core project:
module.exports = {
  externals: function (context, request, callback /* (err, result) */) {
    if (request === './MyModule.js') {
      callback(null, "umd " + request);
    } else {
      callback();
    }
  }
}
I have confirmed that the function in externals gets called during the build, and the if condition is matched for this module. The build succeeds, and I am able to run my core application.
Then, to test dynamic loading, I drop MyModule.js into the static/js folder where the bundles for my core app live, and then I navigate to the page in my core app that requests MyModule via let myComponent = React.lazy(() => import(componentName + '.js'));
I see a runtime error in the console on the import line,
TypeError: undefined is not a function
at Array.map (<anonymous>)
at webpackAsyncContext
My guess is it's failing to find the module. I don't understand where it is looking for the module, or how to get more information to debug.
It turns out that I was making a couple of incorrect assumptions about webpack and dynamic loading.
I was having issues with two things: the kind of module I was loading, and the way I was loading it.
Dynamic importing is not yet a standard ES feature; it is due to be standardized in ES2020. This dynamic import will only return a module if the object you are attempting to load is an ES6 module (i.e. something that contains an export statement). If you attempt to load something packaged as a CommonJS, AMD, or UMD module, the import will succeed, but you will get an empty object. Webpack does not appear to support creating bundles in ES6 format; it can create a variety of module types, and in my config file above I was actually creating UMD modules (configured via the libraryTarget setting).
I had issues with the import statement itself because I was using it within an app bundled by Webpack. Webpack reinterprets the standard ES import statement. Within a standard webpack config (including the one you get from CRA), webpack uses this statement as a split point for bundles, so even modules that are dynamically imported are expected to be there at webpack build time (and the build process will fail if they are not available). I had tried to use webpack externals to tell webpack to load the modules dynamically, which allowed the build to succeed without the modules being there. However, the app still used Webpack's import function instead of the standard JS import function at runtime. I confirmed this by attempting to run import('modulename') from the browser console and getting a different result than my app, which was bundled with webpack.
To solve problem #2, you can tell Webpack to not reinterpret the ES dynamic import by adding some annotation to the import statement.
import(/*webpackIgnore: true*/ 'path/to/module.js');
This will both prevent Webpack from attempting to find and bundle the dynamically imported module at build time, and attempting to import it at runtime. This will make behavior in the app match behavior in the browser console.
Problem #1 was a bit more difficult to solve. As I mentioned above, importing a non-ES6 module will return an empty object (if you await the promise or use .then()). However, as it turns out, the file itself does load and the code gets executed. You can export the module in the "window" format using Webpack, and then load it as follows.
await import(/*webpackIgnore: true*/`path/to/module.js`);
let myModule = window['module'].default;
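For reference, the "window" format mentioned above corresponds to a webpack output section along these lines (a sketch only; the entry path and library name are reused from the earlier config, and the key you read off window matches whatever you set as library):

// webpack.config.js for the dynamically loaded module (sketch)
const path = require('path');

module.exports = {
  entry: './src/MyModule.jsx',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'MyModule.js',
    library: 'MyModule',      // the exports object ends up on window.MyModule
    libraryTarget: 'window',  // attach the exports object to the global window
  },
  // ...loaders and externals as in the earlier config
};

With that naming, the window['module'] lookup in the snippet above becomes window['MyModule'], and the React component sits on its .default property.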
Another potential solution that avoids using the window object is building the module using a system capable of producing ES6 modules (so, not Webpack). I ended up using Rollup to create an ES6 module that pulled all dependencies into a single file, and ran the output through Babel. This produced a module that loaded successfully via a dynamic ES import. The following was my rollup.config.js (note that I included all external node modules needed in my module - this bloated the module size but is a requirement for my specific application - yours will likely differ and you will need to configure rollup to exclude the modules)
// node-resolve will resolve all the node dependencies
import commonjs from 'rollup-plugin-commonjs';
import resolve from 'rollup-plugin-node-resolve';
import babel from 'rollup-plugin-babel';
import replace from 'rollup-plugin-replace';

export default {
  input: 'src/myModule.jsx',
  output: {
    file: 'dist/bundle.js',
    format: 'esm'
  },
  plugins: [
    resolve(),
    babel({
      exclude: 'node_modules/**'
    }),
    commonjs({
      include: 'node_modules/**',
      namedExports: {
        'node_modules/react/index.js': ['Children', 'Component', 'PropTypes', 'PureComponent', 'React', 'createElement', 'createRef', 'isValidElement', 'cloneElement', 'Fragment'],
        'node_modules/react-dom/index.js': ['render', 'createElement', 'findDOMNode', 'createPortal'],
        'node_modules/react-is/index.js': ['isForwardRef']
      }
    }),
    replace({
      'process.env.NODE_ENV': JSON.stringify('production')
    })
  ]
}
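To connect this back to the loading code earlier in the thread, the ES module that Rollup emits can then be pulled in with the un-intercepted dynamic import; a sketch, assuming the bundle is copied to the same static/js folder mentioned earlier and that myModule.jsx default-exports a React component:

import React from 'react';

// componentName is only known at runtime; the URL below is a placeholder
const componentName = 'MyModule';
const MyComponent = React.lazy(() =>
  import(/* webpackIgnore: true */ `/static/js/${componentName}.js`)
);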

Adding typescript import prevents compiling to a single js file

I am using grunt-typescript to generate a single js file from a set of ts files. This works fine until I add an import statement to one of the ts files.
Example grunt-typescript config
typescript: {
  server: {
    src: ["./ts/file1.ts", "./ts/file2.ts"],
    dest: "./js/out.js",
    options: {
      module: 'amd', // or commonjs
      target: 'es5', // or es3
      basePath: '',
      sourcemap: false,
      declaration: false,
      ignoreError: false
    }
  }
}
If I add an import statement to the top of file2.ts e.g.
import PG = require("pg");
Then I get errors that code in File1.ts can't find types defined in File2.ts, and I get an unexpected File2.js generated in the /ts directory, ignoring the dest parameter. The import seems to cause File2.ts to be compiled entirely separately.
Is this to be expected with import, or how can I fix this to produce the expected single js file without compile errors?
As soon as you import an AMD module, or export from outside of any internal module, your file will be compiled as an AMD module. AMD and single-file compilation are inherently different modes of working and don't like to be mixed. To read up on internal vs external modules check out this TypeScript wiki page.
You technically can still import AMD modules using the standard JavaScript method, but it's awkward. For example, using the require.d.ts file from DefinitelyTyped:
/// <reference path="require.d.ts"/>

require(["somemodule"], (SomeModule: namespace.SomeModule) => {
  // async code called after the module is retrieved
});
Without the import keyword TypeScript does nothing about the require and leaves you on your own.
Alternatively, I would recommend going full AMD. If any of your libraries are AMD, it's easier to work with, and you can still compile down to a single file when it's time to release.
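For the single-file part, the plain tsc equivalent of that workflow is the outFile option combined with AMD modules (a sketch of compiler options only; grunt-typescript may spell these slightly differently, and the concatenated output still needs an AMD loader such as RequireJS at runtime):

{
  "compilerOptions": {
    "module": "amd",
    "target": "es5",
    "outFile": "./js/out.js"
  }
}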
