I want to run Node.js as a back-end server to handle socket.io requests from the front end.
The samples I found seem to only serve pages from the Node.js server itself.
<script src="/socket.io/socket.io.js"></script>
The JS include above is served by Node.js. How do I include it if the front end is a different application (e.g. PHP or static pages)? Do I need to include all of socket.io's dependent JS libraries to make it work?
I'm currently running Apache/PHP alongside Node.js/socket.io.
If this is what you're trying to do, you can do it by serving socket.io on a different port than the one Apache is serving on (assumed to be 80).
In Node.js, initialize the socket.io listener on another port, for example 8080:
var io = require('socket.io').listen(8080);
I believe that, by default, socket.io will also serve its static client-side files for you, so that in your HTML you can do:
<script src="http://yourhost:8080/socket.io/socket.io.js"></script>
Hope that helps.
It all depends on your server configuration. You may choose a path for the Node.js backend that is not used by any other resource, and configure your web server to serve everything else.
For example, for Apache I used the ProxyPass directive to forward connections to a local Node.js instance running on a different port (to bypass local port restrictions), though this may not be an option for your purposes:
ProxyPass /help/server/ http://localhost:8002/ connectiontimeout=1500ms
I have a project with a client folder containing a React app bootstrapped with create-react-app. I also have a server folder with an express API server running on localhost:80.
Originally, I ran my frontend and backend as separate servers, calling the API server with fetch("url") and using "proxy": "localhost:80" in my package.json.
I have recently altered the project so that my server serves the static frontend files like so: app.use(express.static(path.resolve(__dirname, "../client/build")))
I then tested running the server on two different ports, 80 and 3000, and the frontend and backend both worked perfectly, but I had believed it would only work when the server was run on port 80.
How does my frontend now know where to call the API server when it is running on different ports?
The proxy is only used in the development environment. When you make a production build, the app is converted into static HTML, CSS and JS files, and those can be served on any port using Node; if you try a different port it will work just the same.
The proxy is not meant for the production environment.
https://github.com/facebook/create-react-app/issues/1087
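In other words, once the Express server serves the production build, the browser only makes relative requests against whatever origin served the page, so the port no longer matters. A minimal sketch of that setup (the /api/example route and its response are placeholders, not your actual code):
var path = require("path");
var express = require("express");
var app = express();

// Serve the compiled React app (the production build, not the dev server)
app.use(express.static(path.resolve(__dirname, "../client/build")));

// Example API route; the front end calls it with fetch("/api/example"),
// a relative URL, so it always hits the same origin that served the page
app.get("/api/example", function (req, res) {
  res.json({ ok: true });
});

// Whichever port you pick here serves both the static files and the API
app.listen(process.env.PORT || 80);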
I just made a simple chat service with Node.js and I want to publish it "online". So far I have used Ngrok and Localtunnel, but they are very limited, so I looked at the Apache web server, but I have not found a tutorial on how to use it for this.
Thanks and hope you can help me.
Ngrok and Localtunnel are services which let you open a connection from inside your network to an external server which then forwards traffic back down the tunnel so clients on the Internet can make requests to your service running inside your LAN.
Apache is HTTP server software. It is nothing like Ngrok and Localtunnel.
While you can set up a reverse proxy using it, for that to be useful in this use case you would have to install it on your router … and most routers don't let you install software on them.
You could possibly run it on a computer inside your LAN and then configure port forwarding on the router … but if you are going to do that then you might as well forget about Apache HTTPD and just forward traffic directly to the service you've written using Node.js.
There are security risks and bandwidth considerations to take into account when running services from your LAN. It's almost always a better idea to just invest in a proper hosting service like Amazon AWS, DigitalOcean Droplets, or Heroku.
By "online" I suppose you mean to host it globally. For that my friend you will be in need of a server (preferably a cloude server) and a static IP address. Both of these are provided by a lot of providers like aws, digitalocean etc as a platform as a service, which we can leverage. So pls do the following:
Register for a cloud service (aws, digitalocean, gcp etc.).
Create a server instance of an operating system of your choice (my pref would be a linux instance).
Attach a public static ip to the server.
Log into the server. (SSH is the most secure way and most providers provide this to log into your server).
Install dependencies (in your case NodeJS etc).
Make sure that the port in which the app is hosted is open publicly. Most providers provide a dashboard in which you can configure port settings.
Use Apache or Nginx for configuring a reverse proxy (this is just for keeping your environment secure)
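For reference, here is a minimal sketch of the kind of chat server you would run on that instance, assuming Express and socket.io installed via npm (the port 3000 and the "chat message" event name are placeholders, not taken from your project):
var express = require('express');
var app = express();
var http = require('http').createServer(app);
var io = require('socket.io')(http);

// serve the chat front-end from a public directory
app.use(express.static('public'));

io.on('connection', function (socket) {
  socket.on('chat message', function (msg) {
    // broadcast the message to every connected client
    io.emit('chat message', msg);
  });
});

// listen on all interfaces so the publicly opened port reaches the app
http.listen(3000, '0.0.0.0', function () {
  console.log('chat server listening on port 3000');
});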
I am trying to learn how to use Node.js and web sockets to create simple multi-user interactive JavaScript programs. I used this tutorial series by Daniel Shiffman to create this example project. My next step would be to upload it, using WinSCP, to my Raspberry Pi Apache2 web server, but I haven't found a way to edit the code so that this will work, and furthermore, I do not know which piece of the program to execute to make it function properly.
Any assistance would be great. The extent of my Node / Socket.io knowledge comes entirely from the video series mentioned above, so you can assume I know almost nothing else.
Apache is a web server: it serves your files and sends them to the client for you. So when you have client-side assets like an HTML site with some CSS, JavaScript and images, you can use Apache to deliver them to the client.
In Node.js you can create this kind of web server yourself with the Express library, simply with the following code:
// Load the Express library (npm install express)
var express = require('express');

// Create the app
var app = express();

// Set up the server
var server = app.listen(3000, () => {
  console.log('http server is ready');
});
as you did in your code too. With this web server you can host your files and do many more things, like setting up a socket.io server (sketched below), because you write the web server yourself. With the following code you serve the static files in the public directory (HTML, CSS, JavaScript, images and so on):
app.use(express.static('public'));
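And since your project from the tutorial uses socket.io, you would attach it to the same server object, roughly like this (the 'mouse' event name is just an example; your project's event names may differ):
// attach socket.io to the Express server created above
var socketio = require('socket.io');
var io = socketio(server);

io.sockets.on('connection', function (socket) {
  console.log('new connection: ' + socket.id);
  // relay an example event to every other connected client
  socket.on('mouse', function (data) {
    socket.broadcast.emit('mouse', data);
  });
});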
After you finish this process you can run it simply with:
npm install
node server.js
If you want, you can also run your code inside Docker by creating a Dockerfile, and so on.
About your question: you must move all your project files onto the Raspberry Pi, so that in the end you have the following directory tree somewhere on it:
|- server.js
|- package.json
\- public
In this directory, run the commands above and your server will be up and running; you can access it at http://raspberry_ip:3000.
I have a MEAN project in which the Angular front-end application is decoupled from the backend Node + Express + MongoDB application. Each application is committed to its own git repository and can be staged or deployed independently.
The problem I have is that the applications are deployed on the same host and both need to use the https protocol. Are there best-practice approaches to allowing the two apps to share the protocol's default port, 443?
One option that has been suggested is to use nginx to proxy to selected port numbers (e.g., 3000 for the front end and 3001 for the backend). Is this best practice, or are there better options available?
Thanks,
I came across this post https://gist.github.com/soheilhy/8b94347ff8336d971ad0. I will try the approach suggested to see if it works.
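For what it's worth, the idea behind that kind of setup is path-based routing behind a single HTTPS listener: one process terminates TLS on 443, forwards /api requests to the backend port, and sends everything else to the front-end port. Below is a minimal sketch of that routing idea in Node itself, using the http-proxy npm package rather than nginx (the certificate paths, the ports 3000/3001, and the /api prefix are all assumptions):
var fs = require('fs');
var https = require('https');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({});

// placeholder certificate paths; use your real key/cert here
var options = {
  key: fs.readFileSync('/etc/ssl/private/example.key'),
  cert: fs.readFileSync('/etc/ssl/certs/example.crt')
};

https.createServer(options, function (req, res) {
  if (req.url.indexOf('/api/') === 0) {
    // API traffic goes to the Node/Express backend
    proxy.web(req, res, { target: 'http://localhost:3001' });
  } else {
    // everything else goes to the Angular front end
    proxy.web(req, res, { target: 'http://localhost:3000' });
  }
}).listen(443);
nginx does the same job (plus caching, gzip, and so on) with a few location blocks, which is why it is usually the recommended choice.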
This must be an extremely common problem. I've seen various answers for this but none seem to work for me.
I have Node installed on an Apache server on Windows Azure. My app is built and ready to go (snippet below):
var express = require("express");
var app = express();
//example api call
app.get("/api/example", function(req, res){
//do some process
res.send(data);
});
app.listen(8080);
console.log("App listening on port 8080");
When testing on my own computer I could go to localhost:8080, which works great. But now that I've put it on the Azure server, I can't get an external domain to point to it properly. For example, I have the domain:
framework.example.com
I've added this to my hosts file in Azure:
XXX.0.0.01 framework.example.com
Initially I also tried editing http-vhosts.conf to point the domain to the correct directory. This worked for loading the frontend, but the app couldn't talk to the backend: API calls returned 400 not found errors.
I've also tried an Express vhost method, but I think I'm doing it wrong and don't fully understand it. What is the correct method?
My app structure is like this:
- package.json
- server.js
- server
  - files used by server.js
- public
  - all frontend files
So to boot the server I run server.js, which runs the code at the top. server.js uses the Express config below to point to the public folder:
app.use(express.static(__dirname + "/public"));
Adding it to the hosts file in Azure won't help. You'll need to configure your domain's DNS to point to Azure. I'd recommend using the DNS name of your Cloud Service instance. Your underlying VM's IP address could change if you need to stop it for some reason, but your Cloud Service DNS name is configured to always route to your underlying VMs. That means you'll need to set up a CNAME with your DNS provider.
Read more about that here: Cloud Services Custom Domain Name
Next, you'll either need to host the Node app on port 80, or put a proxy in front of it to handle that for you. Otherwise you'll be stuck typing framework.example.com:8080, which is not ideal. On Linux, you'll likely need to be a privileged user to host on port 80, but you never want your Node app to have root privileges. You can use authbind to work around this problem.
See an example of how to use it with node here: Using authbind with Node.js
All that being said, it seems like you're somewhat new to Linux server management. If that's the case, I'd strongly recommend trying something like Azure Websites instead of a VM. You no longer have to manage the virtual machine OS; you simply tell it to host your application and it takes care of the rest. If you're using GitHub, this is incredibly easy to test and iterate with. It does host on Windows under the hood, and that might cause problems for some applications, but I host all my Node sites there (developed on a Mac) without any issues.
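One small addition to the snippet in the question (not something required by the answer above, just a common pattern): rather than hard-coding 8080, read the port from an environment variable, so the same code can listen on 8080 locally, on 80 behind authbind, or on whatever port Azure Websites assigns via process.env.PORT:
// read the port from the environment, falling back to 8080 for local testing
var port = process.env.PORT || 8080;
app.listen(port, function () {
  console.log("App listening on port " + port);
});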