I have a simple CLI application written in JavaScript for Node. It is for internal use by a small team and runs in the Linux terminal. The app consists of a single ".js" file and requires a few Node packages. The problem I face now is how to deploy it to our internal team using a simple method that fits our routine process of keeping end-user computers updated.
Our app needs to be installed once per workstation / laptop and to be available to all users on that computer. Any user should be able to open a terminal and enter the command to run the app.
A lot of people have discussed using JavaScript for shell programming, but deploying the completed app is not widely discussed, and I have not found anything on the topic. So far I have only been recommended solutions appropriate for development environments or web servers.
This app is not a web app and it is not deployed on a server. It needs to run offline. I am also not asking about developing or maintaining the app on a development workstation.
The installation process should ideally be about as simple as installing a shell script in /usr/local/bin and setting permissions so that all permitted users on a computer can run it. We are looking for an installation method like this:
copy the Javascript file only once to each computer (to a location on the $PATH) and make sure the Node packages are available globally on that computer.
I specifically want to avoid having to do an npm install for each user account on each computer.
I also want to avoid having to update Node packages for each user account on each computer.
A developer will keep the app updated so it is always compatible with the latest version of the Node packages, and all computers it is deployed on will always have the latest versions of those packages installed.
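Something along those lines can be sketched as a small install function (the command name, prefix, and package names below are placeholders, not from the question):

```shell
# A sketch of a once-per-machine install; run as root so all users get it.
install_cli() {
  prefix=${PREFIX:-/usr/local}
  mkdir -p "$prefix/bin"
  # The first line of the .js file should be a shebang so the kernel runs node:
  #   #!/usr/bin/env node
  install -m 755 "$1" "$prefix/bin/myapp"
  # Then install the required packages globally, once per machine, e.g.:
  # npm install -g commander chalk
}
# Usage (as root): install_cli myapp.js
```

With the shebang in place, the file itself is the command; no per-user setup is needed.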
One specific problem I encountered is discussed here, but the answers assume a different set of requirements (such as the need for "multiple applications running on different package versions").
For those requirements: if the actual problem is solving the EACCES error (you should edit the question to include that information), then look at the permissions of all the directories involved, and make sure the user account that manages Node packages on each computer has the correct permissions.
One way to do that is to give /usr/local a dedicated group, set the setgid bit on its directories with chmod so new files inherit that group (see man chmod), and use chgrp -R on the existing tree.
Then make the installing account a member of that group, and don't use sudo for npm install -g.
(Never using sudo for installations into /usr/local has the additional advantage that you cannot accidentally install into the wrong place, for example because a prefix or path was not configured correctly.)
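One possible shape of that setup (the group name npmlocal is an example; groupadd and usermod must be run once as root):

```shell
setup_npm_group() {
  target=$1
  group=$2
  # groupadd "$group" && usermod -aG "$group" "$USER"   # once, as root
  chgrp -R "$group" "$target"
  chmod -R g+w "$target"
  # The setgid bit on directories makes new files inherit the group:
  find "$target" -type d -exec chmod g+s {} +
}
# Usage (as root): setup_npm_group /usr/local npmlocal
```

After logging in again (so the group membership is picked up), npm install -g works without sudo.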
We are using these two approaches for similar deployments:
the programs live on a specific network mount, and all users run the same package from there. The developer only updates this package; nothing is copied to local machines.
we use a simple deployment script that runs on every machine at logon and copies the latest version to the local machine.
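The second approach can be sketched like this (all paths are examples): copy the app from the network mount only when the mounted copy differs from the local one.

```shell
sync_app() {
  src=$1    # e.g. /mnt/tools/myapp/myapp.js
  dest=$2   # e.g. /usr/local/bin/myapp
  mkdir -p "$(dirname "$dest")"
  # Copy only if missing or changed, so repeated logons are cheap.
  if ! cmp -s "$src" "$dest"; then
    install -m 755 "$src" "$dest"
  fi
}
# Usage (e.g. from a script in /etc/profile.d/):
# sync_app /mnt/tools/myapp/myapp.js /usr/local/bin/myapp
```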
Related
I have an app for which I want a completely custom installer. I've looked at NSIS, Squirrel, WiX, etc., but I want something fully customizable, comparable to the Microsoft Teams and Discord clients.
My first idea was to separate the installer and the application: by downloading and running the installer, a "one-click installer" would silently install the real client, look for updates with the autoUpdater, and then remove the installer's own files. I see a couple of problems here, though: I want the user to be able to choose the install directory, and I cannot figure out whether that is possible through Electron and the autoUpdater alone.
If NSIS or another tool can be made completely custom, please let me know, because I cannot find anything about it that is of interest.
I want to deploy a Meteor application on a WAGO industrial PLC 750-8202.
WAGO provides Board Support Packages with PTXdist tooling support (Communicate with CoDeSys program on a Linux-based WAGO PFC200 PLC).
I have no idea how I can use Meteor on such a platform.
Do you have any ideas about the steps needed to add Meteor support for WAGO PLCs?
It has Linux on it, so just SSH to your PLC. Make sure it is connected to your local network.
Now you can install Node and everything else; it is like having your own VPS. Configure everything. You can even install an FTP server and upload your files over FTP, or create a small script triggered by GitHub hooks that updates your PLC as soon as you push changes to the master branch.
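A hypothetical update routine for the PLC, run from a webhook handler or a cron job (the directory, remote, and branch names are assumptions), might look like:

```shell
update_plc_app() {
  cd "$1"
  # Fast-forward to whatever was pushed to the tracked branch:
  git pull --ff-only origin master
  # npm install --production        # refresh dependencies if needed
  # /etc/init.d/myapp restart       # restart however the device does it
}
# Usage: update_plc_app /home/admin/app
```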
My question is more on the lines of strategy than actual implementation.
Basically, I'm wondering why we build our MEAN apps on the server. And by build I mean getting components (npm install && bower install) and doing all the concat and minify work.
I'm trying to create my build system, and up until now I've been using a version of John Papa's build system, but my build is taking longer and longer on the server. So doesn't it make sense to just build everything locally and deploy it to the server? Or am I missing something?
Thanks
The build shouldn't happen at runtime. You have it partially right: build up front, then deploy the created artifact.
But the crucial idea is to have Continuous Integration in place: a build server that is not your local machine, which takes the code from SCM, builds it, runs the tests, creates a deployable artifact, and stores it in some artifact repository (e.g. an npm registry).
If you take it further and also automatically deploy the artifact into non-PROD environments, you are starting to dig into Continuous Delivery territory.
If this build-and-deploy pipeline installs the artifact into PROD on every commit, you have Continuous Deployment.
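A minimal sketch of the "build once, deploy an artifact" idea (all names are examples): the CI server runs something like this, and only the tarball travels to each environment.

```shell
build_artifact() {
  out=$2
  (
    cd "$1"
    # npm ci && npm run build       # e.g. the concat/minify steps
    tar -czf "$out" dist/           # one immutable, versioned artifact
  )
  # scp "$out" ci@artifacts:/srv/artifacts/   # store it in a repository
}
# Usage (pass an absolute output path): build_artifact ./myapp /tmp/myapp-1.2.3.tar.gz
```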
EDIT - reaction on comment:
The main idea is to make it continuous: the full build is kicked off on a regular basis, optimally on every commit/git push.
If this is configured on your local machine and you are a one-man shop, that is probably fine. But as I played with various projects in my free time, I found that building on every commit can be resource-intensive for a local machine, and it was convenient to leave this responsibility to a third-party service (especially when it's free).
There are plenty of online solutions for CI servers.
I have used http://codeship.com and http://drone.io with success, and http://cloudbees.com gives you hosted Jenkins. For open-source projects they are free.
If your projects are not open source you will need to spend some money, but it should be cheap for one-man projects.
Node.js's npm is very handy, so I decided to use it for my company's project.
But here is the problem: my company forces us to develop on a closed network, so I can only access internal web sites and similar resources.
So I wonder how to solve this problem when you want to use npm but access to the outside network is disallowed.
Any help will be appreciated. Thanks, all.
The only possibility you have is to use npm on a machine that has internet access and then copy the npm modules to the development machine.
Generally there are specific rules in the network that allow the team to do such things while still blocking other sites.
How are you handling the download of third-party tools, such as Google Chrome or an IDE? Use the same system.
I guess the system admin provides you with those tools and/or enables specific rules so you can download them yourself.
Otherwise you can download the npm package with your phone or another device with full network access and copy it to your computer. After that, you should push it to an internal network repository; otherwise your co-workers will have to do the same thing to get the npm packages.
Or, once you've downloaded a package for the first time, point npm at a local repository.
https://docs.npmjs.com/getting-started/what-is-npm
You will need internet access to download the packages initially and to keep them and their dependencies updated. Good luck.
You can use URLs, local paths, and even Git URLs as dependencies in your package.json. I don't see why you couldn't create a pseudo npm registry in your own network with those methods and then just let other developers npm install the dependencies from there.
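For example, a shared directory can act as a tiny in-house package source (the paths and package names here are made up). package.json can point at it directly:

```shell
# In package.json, a local tarball or directory can be a dependency:
#
#   "dependencies": {
#     "left-pad": "file:/srv/npm-mirror/left-pad-1.3.0.tgz"
#   }
#
# or you can install a vendored tarball or directory by path:
install_local_pkg() {
  npm install --no-audit --no-fund "$1"
}
# Usage: install_local_pkg /srv/npm-mirror/left-pad-1.3.0.tgz
```

Local path installs do not touch the network, so they work fine inside a closed network once the files are mirrored in.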
Recently I was on holiday with limited internet connectivity. I was developing an application in node.js when I suddenly needed some NPM packages. This put a severe halt in the development and I was forced to wait until I could go online to download said packages and continue development.
Is it possible to mirror the whole npm registry locally on my computer? How to do that?
It should be possible, seeing as online mirrors of the main registry exist. Where do they gather all the packages from?
This is what npm-offline could do for you: it can cache modules, and you would just need a script that ensures the modules you want are cached.