I was thinking of building a web app that would be a single page, using only JavaScript to update the page (no reloads), so I'm confused about what kind of setup to use.
I'm kinda new to both technologies, so I was wondering: can you set up nginx to serve HTML (plus JS, CSS and other static resources) like a normal web server, and then have those pages connect to a node.js WebSocket server (same host/IP) using something like socket.io?
Is this setup good or bad? What would be a better approach? What advantage would I get if I served HTML pages from node.js and static resources (CSS, JS, images, ...) from nginx?
I don't think serving a few images and static HTML files from node.js itself will ever be a bottleneck. A front-end proxy like nginx is really only required if you need to load-balance between multiple servers, or to expose your internal HTTP services as HTTPS traffic. If you don't have those requirements, it would be overkill, imho.
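If you do get to that point, a minimal sketch of what such an nginx front end might look like (the upstream ports, domain, and certificate paths here are placeholders, not from your setup):

# Hypothetical nginx front end: TLS termination plus load balancing
# across two Node.js backends; all names and ports are made up.
upstream node_backend {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
}

server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    location / {
        proxy_pass http://node_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}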
From various comments in the currently accepted answer, I wanted to note the following.
Node.js itself does a pretty decent job of delivering static content, as good as nginx in many cases.
Trying to proxy a WebSocket connection is problematic at best right now, as most proxy code simply doesn't support it; for the moment, it is best to use Node directly.
If/when you need to deliver static content separately, it's best to use another domain and a CDN at that point.
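To make the all-Node option concrete, here's a minimal sketch using Express and the socket.io API of that era; the port and the public/ directory are assumptions:

// Node serves the static content itself, and socket.io shares the same
// HTTP server, so no proxy sits in front of the WebSocket connection.
var express = require('express');
var http = require('http');

var app = express();
app.use(express.static(__dirname + '/public')); // html, css, js, images

var server = http.createServer(app);
var io = require('socket.io').listen(server); // rides the same port

io.sockets.on('connection', function (socket) {
  socket.emit('hello', { from: 'node' }); // example event, not required
});

server.listen(3000); // assumed port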
I have a server set up using nginx to serve static files (used by two apps, one Django and the other AngularJS). My concern is the following:
If someone navigates to the URL of one of my static files (or any other static file), they can see its contents. For some reason it feels like bad practice to me that users can find all my stuff (if they know the URL or care enough to figure it out). Is this a legitimate concern, or am I just being paranoid?
If this is a legitimate concern, is there a way to make it so that my server can serve my Django and AngularJS apps without the static files being visible through a browser?
Thanks
Alright, so I had my socket.io server listening on a different port, but in order to get it to work with https, I needed to have it listen without passing in a port (the default). (It works fine on a different port over http, but I need it to work over https.)
My project was working fine: the client could connect and send data without problems. However, I moved the site over to my main domain, which has an SSL certificate. The site loads everything via https, so it couldn't load the http version of socket.io.js.
However, now that I've switched it to just var client = require("socket.io").listen().sockets; instead of listening on a different/specific port, it's still not working. Instead of giving me a connection error, it's not including the file at all.
My fear is that I'd end up needing to remake my whole site to host my files via node.js and I'd rather not have to do that.
I'm not using any other module than mysql-node and socket.io, and I'd prefer to keep it that way if possible. I am new to node.js, so I'm sorry if there's an obvious answer that I'm unaware of.
I looked around, however, and can't seem to find the answer anywhere. Or, at least a clear answer.
Would I be better off using websockets instead of socket.io?
If so, how would I go about doing this? I'd be more willing to remake my node application instead of remaking my site, honestly.
I am including the socket.io.js file in the client-side like so:
<script src="https://mysite/socket-io/socket.io.js"></script>
but of course it 404s, since it's not an actual file on my Apache server. There's no folder/directory named socket-io in my public_html directory, so that makes sense to me.
But, how can I get this to work? Would I need to host my files via node.js or would I be better off using HTML5 websockets? A fairly large demographic of my site's users use mobile devices, so I'd have to be sure it works on mobile as well.
If you're going to use apache to host the socket.io.js file, then you need to put that file on your Apache server at a path that it can be served from by Apache, just like any other web file that you want the Apache server to serve. Or, you can just serve socket.io.js from a public CDN also and use the public CDN URL. It's just a JS file. You can put it anywhere or use any URL that reaches a place where the file will be served from. There are some advantages to letting node.js and socket.io serve it for you because it guarantees that client and server socket.io versions are always in sync, but you don't have to do it that way.
If you are using node.js (which it sounds like you are at least in some capacity), then the socket.io built into node.js will serve the file automatically if you are using node.js to serve your web page too and you've configured socket.io to listen on the same port as your node.js web server. In that case, your webpage and socket.io will use the same port and both will run through the node.js server.
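For example, a rough sketch of that single-port arrangement (the port and index.html path are assumptions, not from your post):

// When socket.io is attached to the same server that serves your page,
// it intercepts requests for /socket.io/socket.io.js and serves the
// client file itself -- nothing needs to exist on disk at that path.
var http = require('http');
var fs = require('fs');

var server = http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(fs.readFileSync(__dirname + '/index.html')); // assumed page
});

var io = require('socket.io').listen(server);

io.sockets.on('connection', function (socket) {
  socket.on('data', function (msg) { /* handle client data */ });
});

server.listen(8080); // assumed port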
You haven't really explained why you're using both node.js and Apache, how that architecture works, or why you're serving some of your site with Apache rather than just using node.js for the whole site, as that is certainly the cleaner option with socket.io.
You can use plain WebSockets if you want instead of socket.io, but then you will likely have to build some of the socket.io functionality on top of the WebSockets (auto-reconnect, message passing, etc...), and using plain WebSockets won't really simplify any of the Apache/node.js questions you have. It's trivial to serve up the socket.io.js file to the client using either Apache or node.js, and once the client has the file, it is actually more work to use plain WebSockets than to use socket.io because of the extra features that socket.io has already built.
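To give a feel for what you'd be re-implementing, here's a rough sketch of just the auto-reconnect piece with plain WebSockets (the URL and retry delay are placeholders):

// Hand-rolled reconnection -- one of several things socket.io
// already gives you for free.
function connect() {
  var ws = new WebSocket('wss://example.com/socket'); // placeholder URL

  ws.onmessage = function (event) {
    console.log('message:', event.data);
  };

  ws.onclose = function () {
    // socket.io also handles backoff, transport fallback, framing, etc.
    setTimeout(connect, 1000); // naive fixed one-second retry
  };
}
connect();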
I have a Django installation that I would like to run multiple variations of the same site: same data, different static content, with an ultimate goal of demonstrating XYZ as implemented with various JavaScript frameworks. I would like to have different home pages load, and those pull their own distinct static content. (All intended projects are SPAs.)
I tried the solution at How can I get the domain name of my site within a Django template?, but on my system the incumbent site doesn't give a hostname of 'pragmatometer.com'; it gives a hostname of 'localhost:8000', because Django / Gunicorn is serving up pages as localhost. I tried specifying in /etc/hosts that pragmatometer.com is 127.0.0.1 and having Apache proxy to pragmatometer.com, but that resulted in an error. That leaves open the prospect of running separate hosts on different ports, which should be proxied as distinct, or making the homepage redirect to a URL-specific landing page, a solution which would sacrifice the clean URL of xyz.pragmatometer.com to demonstrate the XYZ framework implementation. I'm seeing multiple ways of duct taping it with JavaScript, only one or two of which I would want a future boss to see...
I would ideally like to have multiple (sub)domains' root URLs pulling a subdomain-specific homepage and the /load/*, /save/*, etc. consistent across them. I would also like to have the root URLs pulling their own CSS and JavaScript, but that's easy enough if I can get the root URL working appropriately.
The best solution I am seeing so far is having separate server processes listening on the same IP, but having isomorphic servers running on different ports and proxied by different Apache VirtualHosts. Either that or having JavaScript detect the URL and overwrite the page with the "real" index for the domain, which has a bit of a smell.
Comments about a better solution or how to execute the above intent well?
--EDIT--
Or another approach which might be a little cleaner:
Have a home page that loads the contents of /framework/ for each framework, and then document.write()s it after the page is loaded enough for a document.write() to clobber existing page contents.
If I used jQuery to clobber and load a page in this fashion, would it leave behind any pollution that would interfere with frameworks working appropriately?
Your stack looks kinda crazy.
You want one webserver with Django which can be accessed by multiple domains, and each domain causes the Django application to serve different content. Did I understand you correctly?
If so, you might have more success replacing Apache with nginx. It can resolve the requested hostname and decide how to route the request:
What's the difference of $host and $http_host in Nginx
Multiple Domain Hosting With One Django Project
Update
Relevant nginx documentation for distinguishing between different hostnames:
http://nginx.org/en/docs/http/request_processing.html
http://nginx.org/en/docs/http/ngx_http_core_module.html#server_name
Relevant nginx documentation for adding request headers:
http://nginx.org/en/docs/http/ngx_http_headers_module.html#add_header
Also see this answer:
Adding and using header (HTTP) in nginx
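Putting that together, a minimal sketch of the idea (assuming Gunicorn listens on 127.0.0.1:8000; xyz.pragmatometer.com is from your question, the second server_name is hypothetical):

# One server block per (sub)domain, all proxying to the same Django /
# Gunicorn instance, with the Host header passed through so Django can
# tell the requests apart.
server {
    listen 80;
    server_name xyz.pragmatometer.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

server {
    listen 80;
    server_name abc.pragmatometer.com;  # hypothetical second framework demo

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}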
I'm building a rich internet application (HTML/JS/CSS) which has to communicate with a backend application server (RoR or node.js) through XHR/WebSocket.
I wonder what the best way is to serve the RIA files to the web browsers: CDN or RoR/node.js as static file servers?
Doesn't the latter make it impossible for the browser to communicate with the backend server due to the same-origin policy?
Thanks
The same-origin policy applies to the requests your scripts make, not to where your static files come from.
Say your page is served from www.test.com:

$.get('https://api.someotherorigin.com/things.json', function (res) {
  // I'll get a same-origin policy error
});

This is why people use getJSON/JSONP in these cases. The policy even applies to subdomains, depending on how things are set up.
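For instance, jQuery switches to JSONP when the URL contains callback=? (the endpoint below is hypothetical and would have to support JSONP):

$.getJSON('https://api.someotherorigin.com/things.json?callback=?', function (res) {
  // works across origins because JSONP injects a <script> tag
  // instead of making an XHR
  console.log(res);
});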
A CDN has the benefit of serving your static files from a cookieless, often geolocation-optimized source. You almost certainly don't need this during development.
The benefit later on is that you are likely going to have only a few servers (or just one) located in a spot that may favor people in one location and give a crappy RTT to folks who aren't close. Additionally, your domain is likely going to have cookies for authentication, session IDs, etc. -- if you use a CDN, you avoid sending those cookies along with every single subsequent request for static files, reducing the overall request/response sizes.
Just host the files yourself. You can serve static files quite easily using connect
connect.static
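A minimal sketch (assuming connect 2.x, where the static middleware is bundled; the directory and port are placeholders):

// Serve everything under ./public as static files on port 3000.
var connect = require('connect');

connect()
  .use(connect.static(__dirname + '/public'))
  .listen(3000);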
You may request popular JavaScript files from a CDN if you want to take advantage of caching; jscdn and Google's CDN are popular.
But your own personal HTML/CSS files should be on a static file server. (You can use something else like nginx to serve those through a subdomain if you want.)
I'm creating an application in SproutCore, which I want to integrate with Facebook. To do this, I need to create a file called xd_receiver.htm at / (see http://wiki.developers.facebook.com/index.php/Cross_Domain_Communication_Channel ). But how can I do this (so it's available at /xd_receiver.htm)? I'm using sc-server.
Thanks.
I don't think it'd be possible to easily serve up the xd_receiver.htm file directly with sc-server, but you could put a proxy in your Buildfile (see "Setting Up Your Proxy" at https://sproutcore.pbworks.com/Todos+06-Building+with+Sinatra+and+DataMapper), and point it at some other web server you have (e.g. Apache) serving up xd_receiver.htm.
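For example, the Buildfile entry might look something like this (the port is an assumption; check the proxy syntax against the SproutCore docs linked above):

# Hypothetical: forward requests for /xd_receiver.htm to another local
# web server (e.g. Apache on port 8080) that actually hosts the file.
proxy '/xd_receiver.htm', :to => 'localhost:8080'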
For production, it's usually best for performance reasons to build your SproutCore app (using sc-build) and then host the static files with something that's good at that, like Apache. Then you could just configure your web server to provide the SproutCore app and the xd_receiver.htm at the appropriate URLs.