Minify JavaScript during GitHub Pages build? - javascript

I have a static website through GitHub Pages, built on Jekyll-Bootstrap. My little website includes a lot of JavaScript, and for maintainability I would like all of the JavaScript to remain human-readable in the GitHub repo.
But for the end-user of my website, I would prefer to minify the JavaScript.
Is there some way to build a hook into the GitHub Pages build process to minify/uglify JavaScript, so that the end user can download smaller files?

The GitHub Pages build service cannot run anything other than Jekyll in safe mode and a small number of whitelisted plugins. This is done for security.
Your best option is to use an alternative service to build your site and push the result back to GitHub. The source for the site would reside in the master branch and the compiled source in gh-pages.
A suitable service for doing so would be one of many CI services, such as Travis CI. These are typically used to run software test suites on every push to a repo, but can be used to build your website and push the result back to you.
The Jekyll docs have a guide for testing builds on Travis, though pushing the output isn't covered there. You'll need a script hooked into the after_success directive of the Travis config file; I use one on a site I maintain, and a rough sketch follows below.
To authenticate the push, the script will need access to your GitHub personal access token. You can't put this straight into the deploy script, as it's a secret; see the Travis docs on encrypting environment variables.
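For illustration only, the deploy step could look something like the sketch below. The file name, the terser call, and the GH_TOKEN/GH_REPO environment variables are assumptions you'd adapt to your own setup, not the exact script I use.
#!/bin/bash
# deploy.sh - run by Travis from after_success once the build passes.
# Assumes GH_TOKEN (a GitHub personal access token) and GH_REPO ("user/repo")
# are supplied as encrypted environment variables, and that Node is available.
set -e
bundle exec jekyll build                 # generates the site into _site
npx terser _site/assets/app.js -o _site/assets/app.js --compress --mangle
cd _site
git init
git config user.name "Travis CI"
git config user.email "builds@travis-ci.org"
git add .
git commit -m "Deploy built site"
git push --force "https://${GH_TOKEN}@github.com/${GH_REPO}.git" HEAD:gh-pages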

If you are using GitHub to generate the site and display it, there is no option to do this, because GitHub is strict about what it will process, for security.
A workaround is to do your compiling and processing locally, then push the resulting output to gh-pages, which is happy to simply host static pages.
You can still use GitHub to host the project; you just don't use GitHub to compile it.
Your dev process might be:
Check that your local copy and master match.
Do your work in dev mode.
Build for production.
Use Grunt or another tool to minify/uglify/etc. the _site production files, outputting to a separate dist (distribution) folder.
Push the contents of the dist folder to your gh-pages branch (a rough sketch follows this list).
Commit changes to the project files back to master.
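Something like the following sketch, where the grunt dist task and the dist/ folder are illustrative names rather than anything standard:
#!/bin/bash
# Rough local build-and-publish loop; task and folder names are placeholders.
# Assumes dist/ is gitignored so it survives the branch switch.
set -e
git checkout master
git pull origin master          # make sure local and remote master match
jekyll build                    # production build into _site
grunt dist                      # hypothetical Grunt task: minify _site into dist/
git checkout gh-pages
cp -R dist/. .                  # copy the minified build onto the gh-pages branch
git add -A
git commit -m "Publish minified build"
git push origin gh-pages
git checkout master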
I am probably not making much sense, but perhaps this discussion might help more: https://gist.github.com/cobyism/4730490
Have fun!

You can try my own minifier, https://github.com/Mendeo/jekyll-minifier. It is written purely in Liquid, so you do not need to install any additional gems, and it is fully compatible with GitHub Pages.

My approach to this is a GitHub Action that:
checks out the main branch
performs the minification/purging etc
pushes the changes to a gh-pages branch
Then you just need to point Github Pages at the gh-pages branch rather than the main branch.
You'd need to choose appropriate CLI tools to perform the minifying/purging in the virtual machine that the action spins up. There are lots of options here. I'd suggest using packages that can be installed through npm, so that Node is the only thing you have to install on the VM. For example:
PurgeCSS: Removes unneeded CSS
terser: Minifies JS
csso-cli: Minifies CSS
html-minifier: Minifies HTML
This is relatively straightforward with a Github action that looks a bit like this:
# A GitHub Action that minifies html/css/js and pushes it to a new branch
name: purge-and-minify

# Run on pushes to `main` branch
on:
  push:
    branches:
      - 'main'

jobs:
  checkout-minify-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Install CLI tools
      - uses: actions/setup-node@v3
        with:
          node-version: '16'
      - run: npm install -g terser
      - run: npm install -g csso-cli
      - run: npm install -g html-minifier

      # Use CLI tools to minify, overwriting existing files
      - run: for i in ./js/*.js; do terser $i --compress -o $i; done
      - run: for i in ./css/*.css; do csso $i -o $i; done
      - run: for i in ./html/*.html; do html-minifier [--your-options-here] $i -o $i; done

      # Push changes to `gh-pages` branch
      - run: |
          git config user.name github-username
          git config user.email github-username@user.noreply.github.com
          git commit -am 'Automated minify of ${{ github.sha }}'
          git push --force -u origin main:gh-pages
Here is a working example of a similar process in a Github project of mine.

Netlify is an alternative to GitHub Pages that integrates with GitHub (even private repos) and similarly publishes the output of Jekyll (or other static site generators). There are some limits on the free tier, but most individual users are unlikely to run into them.
You should be able to add one of the Jekyll minifier plugins listed here to accomplish your goals. Here are plugin installation instructions.
Please add a comment if this worked for you! I'd love to hear how it went.

Related

How to create files using JavaScript in GitHub? [duplicate]

I made a website using Node.js as the server. As far as I know, a Node.js app is started by typing commands in a terminal, so I'm not sure whether GitHub Pages supports Node.js hosting. So what should I do?
GitHub Pages hosts only static HTML pages. No server-side technology is supported, so Node.js applications won't run on GitHub Pages. There are lots of hosting providers, as listed on the Node.js wiki.
AppFog seems to be the most economical, as it provides free hosting for projects with 2GB of RAM (which is pretty good if you ask me).
As stated here, AppFog removed their free plan for new users.
If you want to host static pages on GitHub, then read this guide. If you plan on using Jekyll, then this guide will be very helpful.
We, the JavaScript lovers, don't have to use Ruby (Jekyll or Octopress) to generate static pages for GitHub Pages; we can use Node.js and Harp, for example.
These are the steps, in summary:
Create a New Repository
Clone the Repository
git clone https://github.com/your-github-user-name/your-github-user-name.github.io.git
Initialize a Harp app (locally):
harp init _harp
make sure to name the folder with an underscore at the beginning; when you deploy to GitHub Pages, you don’t want your source files to be served.
Compile your Harp app
harp compile _harp ./
Deploy to GitHub
git add -A
git commit -a -m "First Harp + Pages commit"
git push origin master
And this is a cool tutorial with details about nice stuff like layouts, partials, Jade and Less.
I was able to set up GitHub Actions to automatically commit the results of a Node build command (yarn build in my case, but it should work with npm too) to the gh-pages branch whenever a new commit is pushed to master.
While not completely ideal, as I'd like to avoid committing the built files, this seems to be the only way to publish to GitHub Pages at the moment, and it should work for any frontend Node.js app (or app built with a frontend framework like React or Vue) that can be served as static files.
I based my workflow on this guide for a different React library, and had to make the following changes to get it to work for me:
Updated the "setup node" step to use the version found here, since the one from the sample I was basing it on threw errors because it could not find the correct action.
Removed the line containing yarn export, because that command does not exist and doesn't seem to add anything helpful (you may also want to change the build line above it to suit your needs).
I also added an env directive to the yarn build step so that I can include the SHA hash of the commit that generated the build inside my app, but this is optional.
Here is my full GitHub Action:
name: github pages

on:
  push:
    branches:
      - master

jobs:
  deploy:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2

      - name: Setup Node
        uses: actions/setup-node@v2-beta
        with:
          node-version: '12'

      - name: Get yarn cache
        id: yarn-cache
        run: echo "::set-output name=dir::$(yarn cache dir)"

      - name: Cache dependencies
        uses: actions/cache@v2
        with:
          path: ${{ steps.yarn-cache.outputs.dir }}
          key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
          restore-keys: |
            ${{ runner.os }}-yarn-

      - run: yarn install --frozen-lockfile

      - run: yarn build
        env:
          REACT_APP_GIT_SHA: ${{ github.SHA }}

      - name: Deploy
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./build
Alternative solution
The docs for Next.js also provide instructions for setting up with Vercel, which appears to be a hosting service for Node.js apps similar to GitHub Pages. I have not tried this, though, and so cannot speak to how well it works.
No, you cannot publish it on GitHub Pages. Try Heroku or something like that. You can only deploy static sites on GitHub Pages; you can't deploy a server there.
No,
GitHub allows hosting only static websites (HTML, CSS, JavaScript).
Dynamic websites (with databases, servers, and so on) can't be hosted as a GitHub Page.
And a Node.js app is server-based, so we can't host it on GitHub Pages.
You can try Heroku or OpenShift to host your website.
Yep, as most answers say, GitHub Pages only serves HTML, CSS and front-end JS.
But you can use a JS framework like Gatsby, which is mainly known for generating purely static files; it gathers the data at compile time.
Then use that generated folder as the directory of the site.
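As a rough sketch of that flow, assuming a standard Gatsby project (where gatsby build writes its static output to public/) and the gh-pages npm package for publishing:
# Build the static site; Gatsby writes the result to public/
npx gatsby build
# Publish the generated folder to the gh-pages branch
npx gh-pages -d public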
I would like to add that it IS very much possible, as I am doing it right now. Here's how I'm doing it:
(I'm going to assume you have a package and/or directory ready to publish.)
In the root of your package.json, add
"homepage": "https://{pages-endpoint}/{repo}",
Where the pages-endpoint is the blah.github.io endpoint you specified in the Settings -> Pages portion of your repository, and repo is the name of your repository.
Then make sure you run npm install --global gh-pages --save-dev. You need --global to ensure the bin file is on your PATH, and --save-dev should add it as a dependency in your package.json.
After that, just npm run build && gh-pages -d build. The -d specifies your output build directory. The standard is build, but mine was public. If it's different, just change it.
Lastly, make sure in the Settings -> Pages section, you select gh-pages as the branch to host and leave the directory as / (root). Once it's built, your site should be available at your github.io endpoint.
Happy Dev-ing!
These are very simple steps to push your Node.js application from local to GitHub.
Steps:
First create a new repository on GitHub
Open Git CMD installed to your system (Install GitHub Desktop)
Clone the repository to your system with the command: git clone repo-url
Now copy all your application files into this cloned directory if they're not already there
Get everything ready to commit: git add -A
Commit the tracked changes and prepares them to be pushed to a remote repository: git commit -a -m "First Commit"
Push the changes in your local repository to GitHub: git push origin master

How to deploy an Angular 7 project directly to my webserver, so that I don't have to migrate my dist folder manually after every build?

Currently, I am deploying my Angular 7 project by using FileZilla to migrate my local files from the ./dist folder to the server's public_html. This is quite a tedious job to carry out on a daily basis, so I want to deploy my code directly to the server: when I run ng build --prod, the compiled files should be migrated to the server automatically. Can anybody here help me solve this problem?
To get rid of this problem I have tried lots of steps:
I was using Bitbucket Pipelines to run my builds, but it turned out to be costly and I couldn't get it to work; it took several hours and never gave me any output.
I also tried using a new Git repository; it's a decent approach, but it didn't solve the problem, because every time I deploy my code locally a new folder is created when I execute the command.
So, I want help deploying this code directly to the server that is going to host my application. Thank you all, and I hope you can provide good tips regarding this problem.
You can do it in a simple way: create a basic bash/sh script (or Windows executable) and use rsync to do this automatically:
deploy.sh:
#!/bin/bash
ng build --prod
rsync -arvt ./dist remoteuser@remotehost:/var/www/remotedirectory
To avoid entering a login and password every time, add your RSA public key to the remote machine (trusted host). You can combine this solution with Bitbucket Pipelines; when the free plan runs out, I run this script manually from my developer machine.
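Setting up the key is a one-time step; here is a sketch assuming OpenSSH on the developer machine (remoteuser and remotehost are placeholders):
# Generate a key pair if you don't already have one
ssh-keygen -t rsa -b 4096
# Copy the public key to the remote machine's authorized_keys
ssh-copy-id remoteuser@remotehost
# From now on, rsync/ssh to that host won't prompt for a password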
bitbucket-pipelines.yml:
image: mycustomimage:latest

pipelines:
  default:
    - step:
        name: Build and deploy to production
        caches:
          - node
        deployment: production
        script:
          - npm install
          - npm install -g @angular/cli
          - ng config -g cli.warnings.versionMismatch false
          - ./deploy.sh
Instead of a simple script you can use some more complex solution like Capistrano, Shipit or some other more advanced tool. All depends on your needs...
The simplest solution is always the best :)

Yarn: Procedure for redeploying JavaScript dependencies to Production Server (usage of `yarn.lock` file)

I've read the documentation on Yarn, and I know the lock file is supposed to be committed to VC. See this, which explains at a high level why the lock file is necessary, and this, which lists a bunch of commands without much explanation of what they actually do!
I've also read a lot of questions on StackOverflow which asks about whether the lock file should be committed to VC.
However, all the documentation and SO threads seems to overlook the detail that I want to know, which is the following; What is the correct procedure (the correct bunch of commands to run) for:
Updating the yarn.lock file when I need to (i.e. in the development environment where I want to pull the latest minor versions and update the lock file to reflect this)
For keeping my lock file in sync with other developers to ensure that they are developing/testing from the exact same dependency versions, and
For updating/re-synching the node_modules directory on the production server (i.e. to ensure that the production server isn't running on a different/breaking version of dependent packages)
I ask partly because in the past while doing a git pull on the server, I've faced messages telling me that the yarn.lock file has been updated independently of the development/VC process. As far as I'm concerned, this should never be allowed to happen.
The following information is based on what we use daily at Orange; it may not be the only valid approach.
1 ) Updating yarn.lock
yarn upgrade [package | package@tag | package@version | @scope/]... [--ignore-engines] [--pattern]
This command updates dependencies to their latest version based on the version range specified in the package.json file. The yarn.lock file will be recreated as well.
source : https://yarnpkg.com/en/docs/cli/upgrade
2) Dependency between developers
What I suggest is to create a script that checks that you have the currently recommended versions, with the help of:
yarn check
Verifies that versions of the package dependencies in the current project’s package.json match those in yarn’s lock file.
source : https://yarnpkg.com/en/docs/cli/check
3) Updating the production server
The same as 2): a git hook script should help you run yarn check to verify that the package.json versions are correct and, if they aren't, launch a yarn update.
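A minimal sketch of such a hook, assuming classic Yarn (v1) and a post-merge hook on the production server (the hook path and messages are illustrative):
#!/bin/sh
# .git/hooks/post-merge - keep node_modules in sync with yarn.lock after git pull
if ! yarn check --verify-tree >/dev/null 2>&1; then
  echo "Dependencies out of sync with yarn.lock - reinstalling"
  yarn install --frozen-lockfile
fi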
Honestly, this is a matter of opinion/preference. I have seen a few strategies:
Using yarn upgrade
Manually bumping the version in package.json before running yarn
Like Fabien mentioned: use yarn check
You can use Yarn offline mirrors, where you commit caches of your npm packages into version control. (See this Medium article; a short setup sketch follows the list below.)
Plus there are lots of upsides when using yarn --offline:
Builds are faster because you don't have to fetch packages from the npm registry.
Your builds will fail if you do not have the right dependencies.
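A minimal sketch of setting up an offline mirror with classic Yarn; the mirror path is an arbitrary example:
# Keep a tarball copy of every installed package inside the repo
yarn config set yarn-offline-mirror ./npm-packages-offline-cache
yarn config set yarn-offline-mirror-pruning true
# Populate the mirror (commit the resulting tarballs to version control)
yarn install
# On CI or the production server, install without touching the registry
yarn install --offline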

Can't get Grunt to run

I'm a little confused as to why I can't get my Gruntfile.js to run, here's the rub:
I installed grunt globally using npm. It lives in my /usr/local/bin/ directory.
Previously, I'd installed node.js using homebrew, then grunt with npm. Other issues led me to uninstall node via homebrew & reinstall node directly from the disk image node provides.
In my web project's index, there's a Gruntfile.js script that rebuilds my Jekyll site every time live-reload updates. When I run grunt, I get an error telling me the path to grunt isn't valid.
What I'm trying to wrap my head around:
Why isn't /usr/local/bin/grunt a valid path? Grunt exists at that location. My guess was that running grunt locally, from within my website's index, would fix things.
There's a node_modules folder there, and everything was working fine before, after all. I found this link, and tried running \grunt to bypass any bash alias, but that had no effect.
Any advice/suggestions are much appreciated! I feel like an imbecile using things, breaking things & not understanding why/how. Eager to finish my project, get a paycheck & finally have time to learn the ins and outs of terminal, bash & popular package managers so I don't run into these sorts of problems...
After discussion with the OP, I found this is a Node.js environment issue. After install, do something, uninstall, reinstall another way, do something again; somehow, when npm install -g XXX is executed, the symbolic link is created pointing to one place, but the package is installed somewhere else. That's why the OP sees /usr/local/bin/grunt but cannot run it.
I've recommended that the OP clean up all the Node.js stuff, make a clean environment, and start again from the beginning.
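For anyone hitting the same thing, a rough diagnostic and cleanup sequence might look like the following (the paths are typical macOS/Homebrew locations and may differ on your machine):
# See where the shell resolves grunt and where npm puts global binaries
which grunt
npm prefix -g
ls -l /usr/local/bin/grunt       # a dangling symlink here is the usual culprit

# Remove the broken global install and reinstall the CLI
npm uninstall -g grunt grunt-cli
npm install -g grunt-cli
# grunt itself should then live locally in the project
npm install grunt --save-dev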

npm overhead - how to handle this?

When installing anything via npm, it downloads dozens of files that aren't needed. Usually I am looking for a library's final build, a *.min.js file or something like that, but the rest is useless.
How do you handle all these useless files? Do you remove them by hand, or generate the final app with a build tool like gulp or grunt?
I'm quite confused, as I have plenty of npm modules installed in my webapp and the folder size is about 50 MB, when it could be only 2 MB.
npm install --production
Just doing an npm install brings in both development and runtime dependencies. You could also set the environment to production globally for the server: npm config set production.
See this github issue. Note that this won't get you only the final minified build of everything, but will greatly reduce the bloat. For instance, a library might rely on babel-cli, babel-preset-es2015, and uglifyjs to be built (devDependency), but you don't need any of that if it also includes the transpiled minified file.
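For example, a typical sequence on a server, or to slim down an existing checkout, might look like this sketch:
# Install only the packages listed under "dependencies", skipping devDependencies
npm install --production

# Or, on an already-installed tree, remove everything that is dev-only
npm prune --production

# Setting NODE_ENV has the same effect on a plain install
NODE_ENV=production npm install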
Managing Packages
For front end non-development packages I prefer Bower. It maintains the minified and non-minified version of your packages.
Build Tool
Use either Gulp or Grunt. Gulp would be my tool of choice.
Gulp tasks that will greatly improve your code are:
minification of both css and js
optimization/compression of images
concatenation and caching to reduce the number of calls to the server
package versioning
automatic injection of project dependencies
automatic injection of external dependencies
static analysis of js and css
automatic builds on code changes
deployment
testing
Node
If you can, leave all your development tools to Node and all your release packages to Bower. Most Node packages that are used in released apps have a Bower installation counterpart.
Edit
Don't delete anything from node_modules manually, as you don't know which packages other packages depend on. If you are afraid that you may have junk in there, use the rimraf npm package to delete the node_modules folder, and then run npm install. Most importantly, check your package.json for unnecessary saved packages.
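A sketch of that reset, using rimraf via npx (npx being available with npm 5.2+):
# Delete the whole node_modules tree, then reinstall from package.json
npx rimraf node_modules
npm install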
