I created an app using JavaScript (with D3.js), jQuery, and CSS, but no Node.js. It's your typical 'index.html' browser-run interface. I've been going through Docker tutorials and trying to figure out how to set my app up on a server, but I've had no luck accomplishing it and have only been able to find tutorials for apps using Node. I cannot, for the life of me, figure out what I'm doing wrong, but I'm wondering if the problem (or one of them) lies in my Dockerfile. Also, do I need to have used Node.js for all this to work? My app consists of the following:
A directory called Arena-Task. Within this directory, I have my index.html, my main JavaScript file called arena.js, and my CSS files. My other necessary files (like images, etc.) are located within two other folders in the same directory, called data and scripts.
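For clarity, the layout looks roughly like this (exact file names aside):

Arena-Task/
    index.html
    arena.js
    (my CSS files)
    data/       (images and other necessary files)
    scripts/    (more necessary files)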
So now, how would I write my Dockerfile so that I can build it using Docker and publish it to a server? I've attempted following Docker's example Dockerfile:
FROM node:current-slim
WORKDIR /usr/src/app
COPY package.json .
RUN npm install
EXPOSE 8080
CMD [ "npm", "start" ]
COPY . .
But to be honest I'm not quite sure how to make the changes to accommodate my program. I can't figure out if a package.json is required because if it is, then don't I need to be using Node? I don't use any node modules or project dependencies like that in my app. Do I need to be? Is my problem more than just an incorrect Dockerfile?
Sorry that this question is all over the place, but I'm really new to the realm of the server-side. I appreciate any help! Let me know if I can provide any clarification.
Let's clarify a few things:
Node and npm are only needed when you actually use them, e.g. when your app depends on npm packages.
package.json is used by npm; it stores the list of packages (dependencies) your app needs.
In your case I don't see any need for Node, so you can create a simple image. You are going to need a simple web server, something that can serve your HTML/CSS/JS files over HTTP. The simplest one I know is nginx.
Also, in the Dockerfile you need to copy all your resources into the image you are building.
That is what COPY package.json . was doing, but in your case you have to copy the whole app folder into some folder inside the Docker image (below I assume app is the folder that holds all your files).
So here are the steps we are going to take. The Dockerfile should look something like this:
FROM ubuntu
# Update the package index before installing, otherwise the install fails on a fresh image.
RUN apt-get update && apt-get install -y nginx
# Copy the app files, the startup script, and the nginx site config into the image.
COPY app /app
COPY startup.sh /startup.sh
COPY ./nginx-default /etc/nginx/sites-available/default
There is no need for a default CMD here, because you are going to start everything yourself during docker run.
nginx-default is the configuration that makes nginx work as a web server;
it should look something like this:
server {
    listen 8080;
    server_name localhost;
    root /app;
    index index.html;
}
nginx is very flexible; if you need something more from it, just google it.
A Docker container has to be doing something all the time (some blocking process), otherwise it stops.
The easiest way I know is to create a startup.sh file that starts nginx as the first step and then runs an infinite loop:
#!/bin/bash

exit_script() {
    trap - SIGINT SIGTERM   # clear the trap
    service nginx stop      # the container runs as root, so sudo is not needed
    exit 1
}

# Stop nginx cleanly when the container receives SIGINT or SIGTERM.
trap exit_script SIGINT SIGTERM

service nginx start
while sleep 20; do
    CURRENT_TIME=$(date +"%T")
    echo "app is running: $CURRENT_TIME"
done
exit_script is a trap handler that lets the container stop quickly and cleanly rather than being killed, but you can omit it for testing purposes.
Finally, build the image (docker build -t {your-image-name} .) and start it with something like this:
docker run -p 8080:8080 {your-image-name} bash /startup.sh
That should work :), though most probably you are going to face a few errors because I wrote this from memory (for example, nginx may need something extra, or a package such as sudo is not installed by default in the latest ubuntu image).
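As a side note, the official nginx image can make this even simpler for a purely static app, since it already runs nginx in the foreground and needs no startup.sh. This is just a sketch, assuming the folder from the question (Arena-Task) holds everything the page needs:

# Sketch: serve the static Arena-Task app with the official nginx image.
FROM nginx:alpine
# The default nginx site serves /usr/share/nginx/html on port 80.
COPY Arena-Task/ /usr/share/nginx/html/
EXPOSE 80

Build it with docker build -t arena-task . and run it with docker run -p 8080:80 arena-task, then open http://localhost:8080 in a browser.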
I am trying to containerize my React application with docker and am now a little bit stuck with the server.js file as I do not know how to "implement" the application inside. I was able to simply show a "Hello World" but can't seem to run the application inside.
My initial idea was that I could just use the index.js file as an entry point and pass it to res.end(x), since that was the place where I could simply put the "Hello World".
Am I missing just something small or is my general approach wrong?
My server.js file looks as follows:
const http= require('http');
const fs = require('fs');
const index = fs.readFileSync('index.js');
let app = http.createServer((req, res) => {
res.writeHead(200,{'Content-Type': 'text/plain'});
res.end(index);
});
app.listen(9030, '0.0.0.0');
console.log('Node server running on port 9030')
Let me share with you the best documentation on this:
https://nodejs.org/en/docs/guides/nodejs-docker-webapp/
React is generally coded in .jsx files, which need to be compiled/transpiled into regular JavaScript. Once this is done, you can serve these files to web browsers any way you like.
Understanding Build tools
React compilation/transpilation is done using Webpack or Parcel. If you're using Create-React-App, then you're using Webpack under the hood. CRA's npm start will run a Webpack dev server, and npm run build will put all of the files you need in your build directory. You should do the latter if you want to deploy to production.
Avoid using Webpack dev server
Most tutorials, including the one linked in the comments, suggest using the Webpack development server (aka npm start) to run your server in Docker, which is not recommended. Not only does it consume more resources to run, but the development build will run slower in browsers than the production build. The development server is also not meant to handle heavy loads.
Use build for production, with a real server
Use npm run build to generate the build files in the build directory, then use a real web server to serve them to the world.
One example in Docker, using Nginx to serve the files, can be found here.
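Since that example sits behind a link, here is a rough sketch of what such a setup might look like (not necessarily the linked one): a multi-stage build that compiles the app with Node and serves the result with nginx, assuming a standard CRA project with a build script:

# Build stage: compile the React app into static files.
FROM node:16-alpine AS build
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build

# Serve stage: plain nginx serving the build output on port 80.
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80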
Alternatively the CRA build command itself offers a helpful hint:
The build folder is ready to be deployed.
You may serve it with a static server:
npm install -g serve
serve -s build
This will work in a Docker Node container. An example of a very non-optimized Dockerfile might look like this:
FROM node:16-alpine
WORKDIR /app
COPY . .
RUN npm install
# Note: "npm build" on its own does nothing; the CRA build script must be run with "npm run build".
RUN npm run build
RUN npm install -D serve
CMD ["./node_modules/.bin/serve", "-s", "build"]
The build process can be improved here to leverage the cache (by first copying package.json and package-lock.json and installing before copying the code over, see the example at the end of the answer here), and all of the RUN commands can be combined.
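A sketch of that cache-friendly variant might look like this (still assuming a standard CRA layout with a build script; serve is just one of many ways to run the final static server):

FROM node:16-alpine
WORKDIR /app
# Copy only the manifests first so the dependency layer stays cached until they change.
COPY package.json package-lock.json ./
RUN npm install && npm install -D serve
# Now copy the rest of the source and build.
COPY . .
RUN npm run build
CMD ["./node_modules/.bin/serve", "-s", "build"]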
I have an Amazon Elastic Beanstalk application currently running my NODE.JS app.
I have created some Queues with kue.js and Crons with node-schedule.
Since I have many commands to run the queues and crons, I find it impossible to put them in my current Node.js app.
I am willing to open a new application; the only problem is that I can only run one command.
I really don't want to open a separate EC2 instance (not connected to my Elastic Beanstalk service) to run all of those.
What can I do to fix it?
Thank you very much!
As you want to use EB (Elastic Beanstalk), you can write a Dockerfile for the application. EB will detect it, ask you if this is a Docker-based project, and take care of the rest; you just need to run all the scripts you need before your entry point command (CMD ["npm", "start"]), like below.
Dockerfile
FROM node:10.13-alpine
# Sets the working directory,and creates the directory as well.
WORKDIR /app
# Install dependencies.
ADD package.json .
RUN npm install
# Copy your source code.
COPY . /app
# Run all your scripts here, or simply put them into a scripts.js and run that
RUN node scripts.js
# start app
CMD ["npm", "start"]
I'm in big trouble. I work part time at a company, and they're looking for a new web technology to build "web components" into their website.
They have started to use AngularJS (the first version), and I told them that, with the recent evolution of that framework, now is not the right time to invest in it.
That's why I became interested in ReactJS. However, they don't have a Node.js server infrastructure (which is why AngularJS suits them; a browser alone is sufficient), so it's impossible to run it with something like "npm start".
So! My question is (as my post's title says):
Is it possible to run ReactJS without a server side?
I've tried with the following lines in my header:
<script src="https://unpkg.com/react#15/dist/react.js"></script>
<script src="https://unpkg.com/react-dom#15/dist/react-dom.js"></script>
But the page remains blank.
Maybe there is something I don't understand in the React structure, and that's why I'm looking for some help/explanations from you.
I hope I have been clear enough! Thank you in advance for your answers.
It is absolutely possible to run a React app without a production node server. Facebook provides an easy-to-use project bootstrapper that you can read about here.
That being said, developers may need to use a node dev server locally via npm start, as well as use node to perform production builds via npm run build. But one can take the build output from npm run build, serve it from any static server, and have a working React application.
For those who are getting 404s after deploying to a subdirectory: make sure to add the path in package.json as homepage.
"homepage": "https://example.com/SUB-DIRECTORY",
You should insert "homepage": "./" into your package.json file, then use building react-script command like npm run build.
I did it by using serve, as part of the build step in Jenkins. To install it, execute the command:
npm install -g serve
Then, to serve it:
serve -s build
Please refer to the project page for more information: https://github.com/zeit/serve
Now that I have completed my MEAN app, below are what I think are the stages to get it ready for production and up and running on Heroku. Could you please advise if I've got the wrong idea, as this is my first app of this kind.
1) Use Grunt to lint all JavaScript files (front end)
2) Concatenate all the JS files into one file
3) Uglify the concatenated file from step 2
4) Push (dist?) to Heroku (via Git) ... but what do I push?
Will there be files in a "dist" folder at this point?
Is it this directory (and only this directory) that should be pushed to Heroku?
Note: I'm confident with Git and Heroku - I'm not sure what I need to push or indeed what a typical workflow is.
Not sure what you mean by dist, but I can explain how to push to Heroku.
Make sure you have a package.json with an app name (Heroku will identify it as a node app); see the sketch after these steps
Go to the command line and log in using heroku login, then input your details
Create a file called Procfile; inside the file put: web: node app.js
heroku create appName
cd into/your/root/project/folder
git init
git add .
git commit -m "commit message"
git push heroku master (Make sure you have heroku as a remote)
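For reference, a minimal package.json that Heroku can pick up might look something like this (the app name, entry file, and Node version here are placeholders, not something specific to your project):

{
  "name": "my-app",
  "version": "1.0.0",
  "scripts": {
    "start": "node app.js"
  },
  "engines": {
    "node": "16.x"
  }
}

The Procfile from the steps above (web: node app.js) should point at the same entry file.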
There is a far more in-depth explanation of this here: https://devcenter.heroku.com/articles/getting-started-with-nodejs#deploy-the-app, where they show it with an example app. Don't forget about the Procfile. And about the dist folder: if it's not needed for the app, you don't need to add it to the commit.
A regular check for the production environment is to run speed tests, SEO checks, and the like to get the most out of it. You may want to look into canonical links and minifying CSS, JavaScript, HTML, etc. (server-side minification as well). You can also add domains with Heroku, which is explained here.
I'd like to automatically restart the server after particular files are edited.
Is there anything I can install to help me do that, or will I need to watch the folder and run a script accordingly?
Any pointers appreciated
Use supervisor. Install it with npm install supervisor -g and launch your code with supervisor server.js, and you should be good to go. Note that by default it keeps an eye on the files that are in the same directory as server.js and its subdirectories, but it should be possible to add additional paths.
You could use Nodemon for that, there is even a video tutorial on it.
https://github.com/mdlawson/piping is good too.
There are already node "wrappers" that handle watching for file changes and restarting your application (such as node-supervisor), as well as reloading on crash, but I wasn't fond of having that. Piping adds "hot reloading" functionality to node, watching all your project files and reloading when anything changes, without requiring a "wrapper" binary.
Nodemon is good for it: https://github.com/remy/nodemon
Also, if you want nodemon to restart your app only when particular files are changed, it is important to have a .nodemonignore file, in which you can tell nodemon which files' changes should be ignored.
Example .nodemonignore file :
/public/* # ignore all public resources
/.* # any hidden (dot) files
*.md # Markdown files
*.css # CSS files
.build/* # Build folder
/log/*