Keep Flow types post webpack build - javascript

I want to add Flow to the project I am currently working on. Everything works really well. However, I couldn't find a way to keep the types after the build. I am using a monorepo structure with a lot of NPM modules, and I would like to get an error if a module's interface or its exported functions/classes/types change.
Any ideas/guidance is highly welcomed!
Thanks!

Webpack bundles JS files into a single output file; it has no way to preserve Flow types in the output bundle.
If you want to preserve the Flow types for use alongside this bundle, the current best practice is to ship your original source code as .js.flow files. This blog post elaborates on the approach, but the short version is: you use flow-copy-source to emit a set of .js.flow files that mirror your original source files.
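For example, a minimal package.json setup along those lines might look like this (the src and lib directory names are placeholders, and the version range is only illustrative):

```json
{
  "scripts": {
    "build": "webpack && npm run build:flow",
    "build:flow": "flow-copy-source src lib"
  },
  "devDependencies": {
    "flow-copy-source": "^2.0.0"
  }
}
```

Each compiled lib/Foo.js then gets a sibling lib/Foo.js.flow, which Flow consults when consumers import that module.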

If you insist on consumers using your compiled bundle instead of the source files, you'll need to include a .js.flow file that provides all of the external type interfaces. Here's the interface file for Immutable.js as an example.
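As a rough sketch (the names below are invented purely for illustration), such a hand-written interface file shipped next to the bundle might look like:

```js
// @flow
// bundle.js.flow – hand-written interface for the compiled bundle.js

export type Options = { url: string };

declare export class Client {
  send(message: string): Promise<void>;
}

declare export function createClient(options: Options): Client;
```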
Unless your library has some sort of build complexity that requires the distribution of its compiled assets, I would just rely on the consumers of your lib to compile and strip types on their own.

Related

Typescript in `file://` environment

I recently picked up TypeScript for a personal project. Since the project is designed to be run locally (explicitly over file://), I can't use import/export features due to CORS restrictions. I'm aware of another similarly worded question, but it lacks the specific context of my use case, so I pose these questions:
How does one tell TypeScript that all (or at least certain) scripts are loaded into the HTML via <script type="text/javascript" src="./source.js">?
Does TypeScript's tsc build projects with this in mind? Does it also edit existing HTML files to take this into account? If not, are there tools to automate this process?
I don't want to bundle them the way webpack or tsc-bundle does, since a secondary objective of this project is to keep all .js files just as human-readable as the .ts files.
Building with tsc -p tsconfig.json, configured with "target": "ES2015" and "module": "None", only outputs the corresponding .js files and doesn't update any of the HTML's <script> includes. I am currently maintaining the HTML file by manually inserting and juggling any new modules that emerge over the course of development.
My current load order in index.html is as follows:
index.js handles UI controls and loads first.
The remaining pseudo-module .js files load in between; since these only define classes and don't perform any operations, I figured it's safe to load them here.
main.js handles all the code from javascript "modules" and loads last.
My main concern is that my in-between modules might load out of order due to human error.
Edit: Running a local web server is out of the question too, since the project is aimed at an audience with limited technical knowledge; index.html should be the only file they need to open in their browser.
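For reference, a minimal sketch of my current setup (moduleA.js and moduleB.js stand in for the in-between pseudo-modules):

```json
// tsconfig.json
{
  "compilerOptions": {
    "target": "ES2015",
    "module": "None"
  }
}
```

```html
<!-- index.html: scripts loaded in the order described above -->
<script type="text/javascript" src="./index.js"></script>
<script type="text/javascript" src="./moduleA.js"></script>
<script type="text/javascript" src="./moduleB.js"></script>
<script type="text/javascript" src="./main.js"></script>
```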

Webpack Library Output

How does Webpack know which files to include in a library build? How does it know which files should or should not be included, such as miscellaneous files like images, examples, documentation, etc.? If it automatically includes them, how do we make Webpack ignore them?
Webpack scans the JS files themselves, starting at your entry point(s) and recursively following each referenced file, to determine what to build. It won't include other files such as examples or documentation unless you are, for some reason, importing or requiring them from your JavaScript.
Things like CSS/LESS/SASS and images are handled by specific loaders, which generally also only process referenced files.
TL;DR: If it isn't explicitly included somewhere, it probably isn't in the build.
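As a minimal illustration (the entry, output, and library names below are placeholders), only files reachable from entry via import/require end up in the bundle:

```js
// webpack.config.js – a rough sketch of a library build
const path = require('path');

module.exports = {
  entry: './src/index.js',            // webpack starts scanning here
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'my-library.js',
    library: 'MyLibrary',             // hypothetical exported name
    libraryTarget: 'umd'
  },
  module: {
    rules: [
      // assets are pulled in only when something reachable from `entry` references them
      { test: /\.css$/, use: ['style-loader', 'css-loader'] }
    ]
  }
};
```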

Haxe -> Javascript target for CommonJs (NodeJs) style output

Haxe's JavaScript target outputs everything from a Haxe compilation into a single file. This is great for building applications. For a general-purpose library, however, it would be nice if it output a *.js file per *.hx file within my compiled sources.
The intent of this is so that I can create a NodeJs module where the consumer of the library only needs to require() the particular files that they would like to use. Is this currently possible using the Haxe compiler on its own, or via an external tool?
There is the hxgenjs library, which can generate one JS file per Haxe class/enum:
https://github.com/kevinresol/hxgenjs
I see two different questions here:
How do I output a Haxe module as a Node.js module?
How do I build each module into a separate output file?
As for #1, there is the @:expose metadata, which should help.
As for #2, you can use --each and --next in your build *.hxml file. This way you can specify several targets at once (and they will be built at once too). Unfortunately there is no way to use a wildcard, so you will have to list all your entry points (module roots) manually.
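A rough sketch of such an .hxml file (the module names are placeholders); the arguments before --each are shared by every --next section:

```hxml
# shared options, applied to every compilation below
-cp src
--each

# one JS output per entry point, separated by --next
--js out/ModuleA.js
-main ModuleA
--next
--js out/ModuleB.js
-main ModuleB
```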

What is the best practice to consolidate multiple Javascript files?

For CSS, I can use SASS to import one CSS file into another and produce a single CSS file. What is the equivalent method for JavaScript files?
You might want to check out Closure Compiler (which is a Google product).
You would probably want the Closure Compiler Application form of the product.
A sample workflow would look something like this:
Create a list of your JS files and their paths
Run the command to compile and concatenate the files: java -jar compiler.jar --js path_to_file1.js --js path_to_file2.js (etc.) --js_output_file compiled.js
Closure Compiler also has a related project, Closure Stylesheets, that does the same thing for stylesheets.
This approach, of course, means that there's a pre-compilation step. Depending on your backend, there are also libraries that do the compilation when the page is built. For example, for JSP there's Granule, a tag library that creates the compiled JS and CSS files at page-build time.
There's a third possibility: modularization. Since you gave the example of being able to import CSS files in SASS, the analogue for JavaScript is a module library, following either the CommonJS standard or (the one I prefer) the AMD (asynchronous module definition) pattern, which I have personally used with RequireJS. RequireJS also comes with a nice optimizer tool that will bundle up (minify, compress, concatenate, etc.) all the files your application requires.
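For illustration, a minimal pair of AMD modules in the RequireJS style (file and module names are placeholders); the r.js optimizer can then trace these dependencies and concatenate them into one file:

```js
// math.js – an AMD module with no dependencies
define(function () {
  return {
    add: function (a, b) { return a + b; }
  };
});

// app.js – a separate file that depends on math.js; RequireJS loads it first
define(['math'], function (math) {
  console.log(math.add(1, 2)); // 3
});
```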
UPDATE
Since you mentioned in the comments that you are using Django (it might be useful to update the question with this info too), see if this answer helps as well.
You could use Minify, which allows you to minify and combine JavaScript files. It also works with CSS.

How can I convert a multi-file node.js app to a single file?

If I have a node.js application that is filled with many require statements, how can I compile this into a single .js file? I'd have to manually resolve the require statements and ensure that the classes are loaded in the correct order. Is there some tool that does this?
Let me clarify.
The code that is being run on node.js is not node specific. The only thing I'm doing that doesn't have a direct browser equivalent is using require, which is why I'm asking. It is not using any of the node libraries.
You can use webpack with target: 'node'; it will inline all required modules and emit everything as a single, standalone Node.js file.
https://webpack.js.org/configuration/target/#root
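A minimal configuration along those lines (entry and output names are placeholders) might look like:

```js
// webpack.config.js – bundle a Node.js app into one standalone file
const path = require('path');

module.exports = {
  target: 'node',               // keep Node built-ins (fs, path, ...) as runtime requires
  mode: 'production',
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js',
    libraryTarget: 'commonjs2'  // re-export the entry module's exports
  }
};
```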
2021 edit: There are now other solutions you could investigate, namely:
https://esbuild.github.io
https://github.com/huozhi/bunchee
Or try the following:
npm i -g @vercel/ncc
ncc build app.ts -o dist
See more detail here: https://stackoverflow.com/a/65317389/1979406
If you want to send common code to the browser, I would personally recommend something like Brequire or RequireJS, which can "compile" your Node.js source into asynchronously loading code whilst maintaining the order.
As for an actual compiler down to a single file, you might get away with the one for RequireJS, but I would not trust it with large, highly complex projects with many edge cases.
It shouldn't be too hard to write a manifest file, similar to the package.json that npm uses, stating the order in which the files should appear in your package. That way it's your responsibility to make sure everything is combined in the correct order; you can then write a simple Node application that reads the manifest and uses file I/O to create your compiled script.
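A bare-bones sketch of that idea (the manifest name bundle-order.json is made up for illustration):

```js
// build.js – concatenate files in the order listed in a manifest
const fs = require('fs');

const manifest = JSON.parse(fs.readFileSync('bundle-order.json', 'utf8'));
const bundle = manifest.files
  .map(function (file) { return fs.readFileSync(file, 'utf8'); })
  .join('\n;\n'); // semicolon guards against files that don't end with one

fs.writeFileSync('compiled.js', bundle);
```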
Automatically generating the order in which files should be packaged requires building up a dependency tree and doing lots of file parsing. It should be possible but it will probably crash on circular dependencies. I don't know of any libraries out there to do this for you.
Do NOT use requireJS if you value your sanity. I've seen it used in a largish project and it was an absolute disaster ... maybe the worst technical choice made at that company. RequireJS is designed to run in-browser and to asynchronously and recursively load JS dependencies. That is a TERRIBLE idea. Browsers suck at loading lots and lots of little files over the network; every single doc on web performance will tell you this. So you'll very very quickly end up needing a solution to smash your JS files together ... at which point, what's the point of having an in-browser dependency resolution mechanism?
And even though your production site will be smashed into a single JS file, with requireJS your code must constantly assume that any dependency might or might not be loaded yet; in a complex project, this leads to thousands of async load barriers wrapping every interaction point between modules. At my last company, we had some places where the closure stack was 12+ levels deep.
All that "if loaded yet" logic makes your code more complex and harder to work with. It also bloats the code, increasing the number of bytes sent to the client. Plus, the client has to load the requireJS library itself, which burns another 14.4k. The size alone should tell you something about the level of feature creep in the requireJS project. For comparison, the entire underscore.js toolkit is only 4k.
What you want is a compile-time step for smashing JS together, not a heavyweight framework that will run in the browser....
You should check out https://github.com/substack/node-browserify
Browserify does exactly what you are asking for: it combines multiple NPM modules into a single JS file for distribution to the browser. The consolidated code is functionally identical to the original code, and the overhead is low (approx. 4k + 140 bytes per additional file, including the "require('file')" line). If you are picky, you can cut out most of that 4k, which provides wrappers to emulate common node.js globals in the browser (e.g. "process.nextTick()").
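For reference, typical Browserify usage from the command line (the entry and output file names are placeholders):

```
npm install -g browserify
browserify main.js -o bundle.js
```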
