CoffeeScript and ES6 in one project - migration in practice - javascript

I have a fairly large project (the production bundle is around 400 kB) written in CoffeeScript, and I have no idea how to plan a migration to ES6. I know there are tools like decaffeinate, but I am not sure whether it really works in business practice.
I suppose I can use ES6 and CoffeeScript in one project, but is it possible to write components in CoffeeScript that import and use code written in ES6 (and vice versa) and have it all work in production?
Can this migration be done step by step, or is there no option other than doing everything in one release?
How does webpack work here (which loader is the proper one)? What is the sequence: does it first convert ES6 (or CoffeeScript) to plain JS and then resolve all imports, or does it resolve the imports first and then convert?
Finally, are there any best practices for having code written in both CoffeeScript and ES6 in a similar situation?

I'm the main person working on decaffeinate recently. It has been used on large production codebases and currently has no known bugs, so it's likely stable enough for you to use it. Still, you should convert your codebase a small piece at a time rather than all at once.
You can configure webpack to allow both CoffeeScript and JavaScript in the same project by specifying coffee-loader for .coffee files and babel-loader for .js files in your module.rules (or module.loaders for webpack 1).
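A minimal sketch of such a config, using the webpack 2+ `module.rules` syntax (entry path is illustrative, and it assumes `coffee-loader` and `babel-loader` are installed as devDependencies):

```javascript
// webpack.config.js - a sketch of a mixed CoffeeScript/JS setup
const config = {
  entry: './src/index.coffee', // hypothetical entry point
  module: {
    rules: [
      // .coffee files go through the CoffeeScript compiler
      { test: /\.coffee$/, use: 'coffee-loader' },
      // .js files go through Babel
      { test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' },
    ],
  },
  resolve: {
    // Lets require('./foo') find either foo.js or foo.coffee
    extensions: ['.js', '.coffee'],
  },
};

module.exports = config;
```

With `resolve.extensions` set this way, modules can be converted one at a time without updating any of the require paths that point at them.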
If you use require syntax, importing code between CoffeeScript and JavaScript should just work without any problems. If you use JS import/export syntax, you may need to use require('./foo').default in some cases when requiring JavaScript from within a CoffeeScript file. But in most cases interop should just work even with import/export syntax.
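To illustrate why `.default` is sometimes needed: Babel compiles `export default` into a plain module object, roughly like the simplified sketch below (real Babel output defines `__esModule` via `Object.defineProperty`, and the names here are made up):

```javascript
// Roughly what Babel emits for a file containing `export default class Foo {}`
const fooModule = { __esModule: true, default: class Foo {} };

// CommonJS require (which compiled CoffeeScript uses) returns the whole
// module object, so the default export has to be unwrapped by hand:
const Foo = fooModule.default; // in CoffeeScript: Foo = require('./foo').default
const instance = new Foo();
```

Tools that implement ESM/CommonJS interop check the `__esModule` flag to decide whether unwrapping is needed, which is why interop usually "just works" between two Babel-compiled files but can need the explicit `.default` from CoffeeScript.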
It's probably a good idea to convert one or two files first and make sure ESLint, Babel, and the rest of your JavaScript config are set up the way you want them. From there, convert the code one or two files at a time, increasing the number of files you convert at once as you get more comfortable and depending on your needs.
One approach is to convert CoffeeScript files to JS slowly over time, converting and cleaning up the ones you touch. The problem I've seen with this approach is that it can be a very long time before you move off of CoffeeScript. It depends on your situation, but generally I would recommend running decaffeinate in larger and larger batches without focusing too much on manual cleanup, then once the codebase is 100% JavaScript, manually clean up files as you work with them.
A while ago, I wrote up some thoughts on how to think about different conversion strategies, which you might find useful:
https://github.com/decaffeinate/decaffeinate/blob/master/docs/conversion-guide.md#converting-a-whole-project

Import, Require? How to Mash Javascript Files Together?

This is vague - I apologize in advance, I am trying to be as succinct as I can with my limited understanding, while exposing my potentially incorrect assumptions.
I have a website that's literally one huge HTML file. It runs scripts defined inline in a <script> tag.
My goal is to move all the scripts into individual .js files and pull them into index.html (or into one another where required). I am familiar with the usage of Node's require and I am familiar with import, as used in Angular. I stress usage because I don't really know how they work.
Assumption 1: I cannot use require - it is purely for Node.js. The reason I bring it up is that I am pretty sure I have seen require in AngularJS 1.5 code, so this assumption makes me uncomfortable. I am guessing that code was stitched together by webpack, Gulp, or similar.
Assumption 2: I should use import, but import only works with a URL address: if my .js is hosted on a server or CDN, it will be pulled in. BUT I cannot give local pathing (on the server) here - index.html will NOT automatically pull in the dependencies while being served. I need npm/webpack/other tooling to pre-compile my index.html if I want the deps pulled in on the server.
Assumption 3: Pre-compiling into a single, ready-to-go HTML file is the desired way to build things, because the file can be served quickly (assuming it's ready to go). I make this assumption based on the recent trend of serving Markdown from CDNs, the appearance of the JAMstack, and the number of sites using Jekyll and the like (though admittedly those are traditional Jekyll sites).
Assumption 4: I also want to move to TypeScript eventually, but I assume that changes nothing, since I will just pull in TS to compile it down to .js and then use whatever solution I used above.
Question: If it's clear what I am trying to do and what confuses me, is it a decent solution to look into npm/webpack to stitch my .js files together? What would prepare them for being stitched together using import/export? If so, is there an example of how this is usually done?
As you mentioned, require cannot be used for your purposes, since it is a part of CommonJS and NodeJS's module system. More info on require can be found here: What is this Javascript "require"?
import is ES2015-standard JS syntax. But the ES2015 standard and everything above it has very limited browser support. You can read more about it here: Import Reference
However, you can still write code in the latest standard (thereby enabling the use of import/export, etc.) and then transpile it so it can run in the browser. In order to do this, you need a transpiler. You can refer to Babel, which is one of the most popular transpilers: https://babeljs.io/
For your exact purpose, you need a module bundler. Webpack, Rollup, etc. are some popular module bundlers. They can automatically identify all the JS files referenced through import, combine them, transpile the code so it can run in the browser (they use a transpiler for this), and produce a single JS file (or multiple files, based on your configuration).
You can refer to the getting-started guide of webpack: https://webpack.js.org/guides/getting-started/
or RollupJS: https://rollupjs.org/guide/en#quick-start
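To make the bundling idea above concrete, here is a toy illustration of what a bundler produces: each source file becomes a function, and a tiny runtime resolves the `require` calls between them. Real webpack output is far more elaborate, and the module names here are invented:

```javascript
// Each "file" becomes a function keyed by its path
const modules = {
  './greet.js': (module) => {
    module.exports.greet = (name) => `Hello, ${name}!`;
  },
  './index.js': (module, require) => {
    // In the original source this would be: import { greet } from './greet.js'
    const { greet } = require('./greet.js');
    module.exports.message = greet('world');
  },
};

// A minimal runtime: look up the module function and run it
function runtimeRequire(id) {
  const module = { exports: {} };
  modules[id](module, runtimeRequire);
  return module.exports;
}

const entry = runtimeRequire('./index.js');
// entry.message === 'Hello, world!'
```

The point is that after bundling, the browser loads one plain script with no import statements left in it, which is why no native module support is required.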

Using .js file extensions whilst writing scripts that return jsx

I started using React as my UI framework of choice. Going through the documentation, I noticed that when using the create-react-app script to spin up a new React boilerplate, they used the .js file extension on scripts that were returning JSX code. When I asked my buddy, he told me that you should use the .jsx extension on scripts that return this kind of code.
I'm a bit lost here: if both work, wouldn't it just be better to go with the .js extension, since at the end of the day it's JavaScript we're writing?
Please help me with what is considered best practice. (I have a feeling it would be to use the .jsx extension on these types of scripts, but I'd love to hear the community's view on this.)
Thanks in advance :)
It's really up to personal preference.
My personal opinion is that JavaScript is JavaScript, and I just use .js. Others prefer to use .jsx for files that contain JSX and .js for things like utilities, in order to differentiate. The only important thing is to be consistent in whatever you choose (at least within one project).
In general, it doesn't matter.
The only time it might actually matter is based on your build pipeline. Some build pipelines may be configured to compile .js and .jsx with slightly different rules, but that would be based on your application (things like create-react-app don't care).
At the end of the day, you could use a .cookiemonster extension and it'd work just fine (as long as your build pipeline is configured to handle it).
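As a sketch of why the extension is purely a build-pipeline concern, here is what the relevant part of a webpack config might look like (loader choice and extension list are illustrative):

```javascript
// The rule's `test` regex, not the language, decides which files Babel sees
const config = {
  module: {
    rules: [
      { test: /\.(js|jsx)$/, exclude: /node_modules/, use: 'babel-loader' },
    ],
  },
  resolve: {
    // Lets `import App from './App'` resolve App.js or App.jsx alike
    extensions: ['.js', '.jsx'],
  },
};

module.exports = config;
```

Add `.cookiemonster` to the `test` regex and the extension list, and it would indeed build just fine.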
Actually, it doesn't matter; it's up to you to decide. I prefer to use .jsx when I return a mix of HTML and JS, and .js only when I'm writing plain JS or ES6.
I recommend reading the discussion in this issue on GitHub.

Webpack multi ES version builds

Hello, you smart people of Stack Overflow. I have a challenge with a webpack build I am trying to set up.
What I am trying to achieve
I am trying to create a build that generates two versions of my JS files: one that uses ES 6/7+, which will be served to the latest browsers, and one that is transpiled back to ES 5.
The reason is that I want to minimize the code sent to the "decent" browsers and leverage any performance optimizations that come from using pure ES 6/7, not to mention shipping fewer polyfills.
The problem
Currently I have achieved my goal by running two parallel builds: one that transpiles the code with only a few Babel transforms for the ES 6/7+ version, and a second that uses the es2015 preset to transpile back to ES 5. This generally works as intended. However, the full build that produces both versions is SUUUUUPER slow. We are talking 5+ minutes, which is far from ideal.
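For reference, the two-parallel-builds setup described above can be expressed as a single webpack multi-compiler config (an array of configs). This is a sketch of the setup as described, not a fix for its slowness; paths and preset names are illustrative:

```javascript
// Build the same entry twice with different Babel presets
const makeConfig = (suffix, presets) => ({
  entry: './src/index.js', // hypothetical entry point
  output: { filename: `bundle.${suffix}.js` },
  module: {
    rules: [{
      test: /\.js$/,
      exclude: /node_modules/,
      use: { loader: 'babel-loader', options: { presets } },
    }],
  },
});

const configs = [
  makeConfig('modern', []),      // few/no transforms for ES 6/7+ browsers
  makeConfig('es5', ['es2015']), // full transpile for older browsers
];

// Exporting an array makes webpack run both builds from one invocation
module.exports = configs;
```

The catch, as the question notes, is that every non-JS asset still gets processed once per config, which is exactly where the wasted time goes.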
One problem I see right away with this setup is that a lot of code gets reparsed that doesn't need to be, such as CSS (Stylus), images, SVGs, fonts, etc. All of these assets really only need to be processed once, as they will be exactly the same for each version of the JS files. So in reality I only need to transform the JS part of the code, which is why I created the rebabel-webpack-plugin module, which just takes the emitted files, transpiles them again, and saves the code under another name. This works fine, but it feels hacky, and it kind of circumvents the original webpack build flow, with some possible drawbacks as a consequence:
I don't really think that SourceMaps will be working properly, as it is not transpiling the original code.
Adding polyfills with a babel-preset-env babel-polyfill combo could be challenging and/or hacky
Bundle analyzers could be messed up
Unable to do a shared code file for regenerated files
I have not verified these drawbacks, but they are what I can see of immediate possible issues.
I have tried running a subsequent build on the emitted files, which works, but it is essentially the same as using rebabel-webpack-plugin, only with a more awkward setup (collect the list of emitted files -> set up a new config -> run a new build).
The actual question
So my question for you bright minds out there boils down to: how does one achieve a proper "re-targeting" of JS files using webpack, without having to run two parallel builds?

How do I build and validate a plain JavaScript-based code base?

My front end is an Angular 1.x project with tons of files. I basically need to validate it and find any errors in any of the files - specifically, errors that can break the page. In compiled/statically typed languages like Java, this is very easy, as the compiler tells you exactly what's wrong. However, since JS is interpreted and dynamically typed, I can't figure out a way to "build" these files and find errors the way I would for compiled languages. Going to every single page in the browser after every change is neither practical nor scalable.
I am also not using TypeScript or ES6, and migrating to either is not possible at the moment. Tools like ESLint and JSHint have not been very successful either, since they only surface minor errors within a single file, while a lot of important code is spread over several files. Although my code is already all ES5, I thought about concatenating all the JS files into one and running Babel on it, but I haven't been sure how to manage dependencies during the concatenation (such as the order in which to concatenate the files).
This can't be the only project that uses vanilla JS and needs to be validated for errors. Does anyone have ideas on how I should go about accomplishing this task?
I highly recommend writing tests using Jasmine and Karma. I've found that the two of these integrate really well with Angular, and test-driven development is widely regarded as one of the best development styles.
With all of this being said, I understand that's not exactly what you're looking for, because you want more of a "compiler"-like solution. The closest thing you can get to that in JS, in my opinion, is a linter, and when combined with tests this approach is rather good at finding errors in JS code.

How to write modular client-side Javascript?

I am in the process of writing a heavy JavaScript app, which will ultimately be used by injecting one script into clients' websites.
As of now I am writing all the modules in one JS file; however, I am quickly finding that to be ineffective, as it feels very messy and cluttered, and I feel the modules should all be in separate files.
My question is: what is a good approach to managing this? Should I write all the app's modules in separate files and then compile them into one on the server?
If it matters, I am using Node.js for my server.
First point: don't try to code everything in one file. Most big JavaScript applications contain dozens of files.
Use some kind of makefile to concatenate your JS (and CSS) files, and after that use a minifier (I use Google Closure Compiler). To help with debugging, my deployment scripts always build two versions in parallel: one non-concatenated/minified and one concatenated/minified. The uncompressed version enables development and testing on-site without any deployment step.
This means that, as for all big application development, you need some kind of deployment toolchain to orchestrate the operations. This may be based on shell scripts, Maven, Ant, etc.
Second: use classes (without overdoing it - JavaScript isn't really OOP) and namespaces to clearly isolate your functions.
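A sketch of the namespace pattern this describes: one global object per app, with each file attaching its own module inside an IIFE so locals don't leak. All names here are illustrative:

```javascript
// Shared across files: create the app namespace exactly once
var MyApp = MyApp || {};

// One file = one module attached to the namespace
(function (ns) {
  'use strict';

  function Greeter(name) {
    this.name = name;
  }
  Greeter.prototype.greet = function () {
    return 'Hello, ' + this.name + '!';
  };

  // Expose only what other files need; helpers stay private to the IIFE
  ns.Greeter = Greeter;
})(MyApp);

// In another file: new MyApp.Greeter('world').greet()
```

After concatenation the order of these file-level IIFEs only matters when one module uses another at load time, which is what makes the pattern pair well with the makefile-based concatenation described above.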
Yes, keep all your files logically separate, and then minify and combine them as a publish step or on the fly when serving them. Scott Hanselman wrote a very good blog post on why you should do this here.
