Ember - Force https with heroku - javascript

I've made an Ember app deployed on Heroku. Heroku provides me an SSL certificate, so HTTPS is working on my website.
I want to force visitors to use HTTPS. I've found some answers suggesting doing it on the client side, but since the client can modify the JS, they would be able to bypass the forced HTTPS.
I'm thinking about doing it in a beforeModel of the Ember app.
What's the best approach?
Many thanks

So guys,
I was able to force HTTPS by adding a static.json file to the root folder of the app.
And in this static.json, just add:
{
  "https_only": true
}
Commit, push to Heroku, and that's it!
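For anyone landing here later: static.json is read by the heroku-buildpack-static buildpack, so this only works if the app is deployed with that buildpack. Here is a slightly fuller sketch; the dist/ root and the catch-all route are assumptions based on a typical Ember build output, not something from the question:

{
  "root": "dist/",
  "https_only": true,
  "routes": {
    "/**": "index.html"
  }
}

With https_only set, the buildpack redirects plain HTTP requests to HTTPS before your app code runs, so there is nothing the client-side JS can bypass.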

Related

Export my localhost server (Node.js) online with a web server like Apache

I just made a simple chat service with Node.js and I want to publish it "online". So far I have used Ngrok and Localtunnel, but they are very limited, so I looked at the Apache web server, but I have not found a tutorial on how to use it for this.
Thanks and hope you can help me.
Ngrok and Localtunnel are services which let you open a connection from inside your network to an external server which then forwards traffic back down the tunnel so clients on the Internet can make requests to your service running inside your LAN.
Apache is HTTP server software. It is nothing like Ngrok and Localtunnel.
While you can set up a reverse proxy using it, for that to be useful in this use case you would have to install it on your router … and most routers don't let you install software on them.
You could possibly run it on a computer inside your LAN and then configure port forwarding on the router … but if you are going to do that then you might as well forget about Apache HTTPD and just forward traffic directly to the service you've written using Node.js.
There are security risks and bandwidth considerations to take into account when running services from your LAN. It's almost always a better idea to just invest in a proper hosting service like Amazon AWS, DigitalOcean Droplets, or Heroku.
By "online" I suppose you mean to host it globally. For that my friend you will be in need of a server (preferably a cloude server) and a static IP address. Both of these are provided by a lot of providers like aws, digitalocean etc as a platform as a service, which we can leverage. So pls do the following:
Register for a cloud service (aws, digitalocean, gcp etc.).
Create a server instance of an operating system of your choice (my pref would be a linux instance).
Attach a public static ip to the server.
Log into the server. (SSH is the most secure way and most providers provide this to log into your server).
Install dependencies (in your case NodeJS etc).
Make sure that the port in which the app is hosted is open publicly. Most providers provide a dashboard in which you can configure port settings.
Use Apache or Nginx for configuring a reverse proxy (this is just for keeping your environment secure)
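A hedged sketch of what that last step can look like with nginx, assuming the Node.js chat service listens on localhost port 3000 and the domain is example.com (both are assumptions, adjust to your setup):

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
        # If the chat uses WebSockets, forward the upgrade headers too.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

This keeps the Node process off the public port; only nginx is exposed, and it forwards traffic to the app.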

Deploying react js and node js full stack on AWS production?

I have currently deployed the React and Node.js apps behind nginx, which sits on AWS. I have no issues in deployment and no errors.
The current environment is: PRODUCTION.
But I have a doubt whether the method I follow is right or wrong. This is the method I followed: https://jasonwatmore.com/post/2019/11/18/react-nodejs-on-aws-how-to-deploy-a-mern-stack-app-to-amazon-ec2
The following is my nginx configuration:
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name _;

    # Load configuration files for the default server block.
    include /etc/nginx/default.d/*.conf;

    # React build output served as static files.
    location / {
        root /var/apps/front_end/build;
        try_files $uri /index.html;
    }

    # Node.js API running on port 3005.
    location /api/ {
        proxy_pass http://0.0.0.0:3005/;
    }
}
As shown above, I copied the build folder (from npm run build) to the AWS instance and pointed nginx at that location. The backend is copied as-is to the AWS instance and started with npm start; it runs on port 3005, which is what the /api location proxies to.
I see a couple of others using a server.js for the front end, putting the build folder files there, and pointing nginx at that server.js.
So should I do it that way? Or am I good with the current method as seen in the link above?
Just like everything else, there are multiple ways to go about this. The way you have ended the question suggests you are open to exploring them.
Here are my preferences, in increasing order of responsibilities on my side vs. what AWS handles for me:
AWS Amplify:
Given that you are already using React and Node, this will be a relatively easy switch. Amplify is not only a very useful frontend framework, making it easy to add functionality like authentication, social logins, rotating API keys (via Cognito and API Gateway), etc., but also backend logic that can eventually be deployed on AWS API Gateway and AWS Lambda. Amplify also provides a CI/CD pipeline and connects with GitHub.
In minutes, you can have a scalable service, with the option to host the frontend via AWS CloudFront (a global CDN service) or via S3 hosting, deploy the API via API Gateway and Lambda, have a CI/CD pipeline set up via AWS CodeDeploy and CodeBuild, and also have user management via AWS Cognito. You can have multiple environments (dev, test, beta, etc.) and set it up so that any push to the master branch is automatically deployed on the infra, with other branches mapped to specific environments, and so on. To top it all off, the same stack can be used to test and develop locally.
If you are rather tied down to using a specific service or function in a specific way, you can build up any combination of the above services: API Gateway for managing the API, Cognito for user management, Lambda for compute capacity, etc.
Remember, these are managed services, so you offload a lot of engineering hours to AWS, and being serverless means you pay only for what you use.
Coming to the example you have shared, you don't want your Node process to be responsible for serving static assets: it's a waste of compute power, as there is no intelligence attached to serving JS, CSS, or images, and it also introduces an extra process in the loop. Instead, have nginx serve static assets itself. Refer to this official guide or this Stack Overflow answer.
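For reference, a sketch of what "let nginx serve the static assets" can look like, reusing the build path from the configuration in the question; the file-extension list and the 30-day cache lifetime are assumptions, tune them to how your assets are fingerprinted:

# Serve the React build directly from nginx.
location / {
    root /var/apps/front_end/build;
    try_files $uri /index.html;
}

# Long-lived cache headers for static assets.
location ~* \.(js|css|png|jpg|svg|woff2?)$ {
    root /var/apps/front_end/build;
    expires 30d;
    add_header Cache-Control "public";
}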

Using https with decoupled front-end and backend MEAN applications that run on same host

I have a MEAN project in which the Angular front-end application is decoupled from the backend Node + Express + MongoDB application. Each application is committed to its own git repository and can be staged or deployed independently.
The problem I have is that the applications are deployed on the same host, and both applications need to use the HTTPS protocol. Are there best-practice approaches to allowing the two apps to share the same protocol on the default port 443?
One option that has been suggested is to use nginx to proxy to selected port numbers (e.g., 3000 for the front-end and 3001 for the backend). Is this best practice, or are there better options available?
Thanks,
I came across this post https://gist.github.com/soheilhy/8b94347ff8336d971ad0. I will try the approach suggested to see if it works.
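For context, the nginx option mentioned in the question usually comes down to terminating TLS in one place and routing by path. A minimal sketch of that idea, assuming the Angular app runs on port 3000 and the API on port 3001 as suggested above; the domain and certificate paths are placeholders:

server {
    listen 443 ssl;
    server_name example.com;

    # Placeholder certificate paths; replace with your own.
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    # Angular front-end
    location / {
        proxy_pass http://127.0.0.1:3000;
    }

    # Node + Express backend
    location /api/ {
        proxy_pass http://127.0.0.1:3001/;
    }
}

Both apps then listen only on localhost, and a single certificate on port 443 covers the whole site.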

running apache and rails on an ubuntu droplet

At first I installed Apache, and I built up some sites.
Then later I tried to install Rails according to this tutorial, since it's made by my hosting company.
I originally wanted to install Rails in a sub-directory, so that most of the pages would be served up by Apache and I could just build some special apps with Rails.
At this point I've corrupted the Apache pages, which heretofore were working just fine.
Should I just uninstall everything and start over?
But what is the issue? Is this how Rails is supposed to work? Is it hard to make it compatible with Apache? Could a server run only Rails, and would that be easier to manage?
I also had some JavaScript templates running off that Apache server in the beginning.
Are there some particular log files I can investigate to discern what's at fault? Where are they located?
I can't help by providing a decent tutorial on Rails + Apache, but I can help on nginx + Rails + zero downtime.
Typically the web server (Apache or nginx) serves as a proxy: it passes requests from the user, usually through a Unix domain socket, to the Rails application running as a separate process.
I've written a detailed tutorial on setting up a deployment server from scratch:
Set up a VPS
Secure the VPS + nginx
Deploy the Rails application
Hope it helps!
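To make the proxy-over-a-socket idea concrete, here is a sketch of the nginx side; it assumes the Rails app is served by Puma listening on a Unix domain socket, and the socket and public paths shown are assumptions, not taken from the tutorial:

upstream rails_app {
    # Puma (or another Rack server) listening on a Unix domain socket.
    server unix:/var/www/myapp/shared/tmp/sockets/puma.sock fail_timeout=0;
}

server {
    listen 80;
    server_name example.com;
    root /var/www/myapp/current/public;

    # Serve precompiled assets directly; hand everything else to Rails.
    try_files $uri/index.html $uri @rails_app;

    location @rails_app {
        proxy_pass http://rails_app;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

This separation is also why a broken Rails install shouldn't normally corrupt the static Apache/nginx sites: the web server and the Rails process are independent, and the proxy section of the config is the only thing that ties them together.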

Optimize express.js app - keep-alive is not enabled

I have an Express.js app hosted on an Ubuntu server via node-http-proxy (https://github.com/nodejitsu/node-http-proxy).
I've run the app through http://www.webpagetest.org/ and received a warning that I should enable "keep-alive" in my app.
I need advice from someone experienced in this topic: how can I enable it?
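Not a definitive answer, but one place keep-alive can be enabled in this setup is on the proxy's upstream connections: node-http-proxy accepts an agent option, so passing a keep-alive http.Agent lets it reuse TCP connections to the Express app instead of opening a new one per request. A sketch, assuming the Express app listens on port 3000 (an assumption, adjust to your setup):

const http = require('http');
const httpProxy = require('http-proxy');

// Keep-alive agent so the proxy reuses connections to the Express upstream.
const keepAliveAgent = new http.Agent({ keepAlive: true });

const proxy = httpProxy.createProxyServer({
  target: 'http://127.0.0.1:3000', // assumed Express port
  agent: keepAliveAgent,
});

// Node's HTTP server already keeps HTTP/1.1 client connections alive by default.
http.createServer((req, res) => {
  proxy.web(req, res);
}).listen(80);

WebPageTest's check looks at the Connection header on responses, so it is also worth confirming that nothing in the chain (the proxy, the app, or any middleware) is explicitly sending Connection: close.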
