I currently own a website that will be used for global access and holds a database. I am also building devices that run a local node.js server and form a connection to this website.
So I guess this would be a reverse websocket? I don't own the webserver; it's hosted, so I'd assume I would use PHP.
I need a two-way connection that can push or request updates from both ends... Maybe websockets aren't the answer in this case?
I am working on an automation project.
I have a local server inside Adobe CEP. It's a Node.js/Express server.
I want to be able to send an API request to that server from a cloud server.
How can I connect my local server to the web so I can run an HTTPS request that will arrive at my local server?
Thank you very much for helping with this.
I didn't really know where to start with this; I searched online but didn't get any results yet.
This is a two-step configuration; you want to call a local server from the cloud, so:
first of all, you need to know your public IP (if it is dynamic it may change),
or you can use a service like DynDNS so you can associate an IP (192.1.2.3) or a URL (http://myLocalserver) that is callable from the web.
Additionally, you need to set up port forwarding in your router configuration so your local server (localhost:4200, for instance) is reachable at http://myLocalserver:4200.
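On the local side, a minimal sketch might look like this, assuming an Express server and a router rule that forwards external port 4200 to the machine running it (the hostname myLocalserver, the port, and the /status route are placeholders):
// minimal sketch: local Express server that the cloud can reach once port forwarding is set up
const express = require('express');
const app = express();

// hypothetical endpoint for the cloud server to call
app.get('/status', (req, res) => {
  res.json({ ok: true });
});

// listen on 0.0.0.0 so traffic forwarded by the router reaches it, not just localhost
app.listen(4200, '0.0.0.0', () => {
  console.log('Reachable (via port forwarding) at http://myLocalserver:4200');
});
The cloud server can then call http://myLocalserver:4200/status; for an HTTPS request you would additionally need a TLS certificate on the Express server, or a reverse proxy or tunnel in front of it.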
I have a Neo4j Desktop (1.4.3) database on my Windows PC. In an HTML page, I am connecting to the DB using
const driver = neo4j.driver("bolt://IP_ADDRESS:7687", neo4j.auth.basic("neo4j", "PASSWORD"));
After that I query the DB and display the results on the web page (I use Leaflet.js maps, but this is not the issue):
var session = driver.session();
session
  .run(`MATCH....etc.... return ....`)
  .subscribe({
    // ...... etc
  });
Everything is fine so far: I can run the page on my PC or from another PC in my home network without issues. Neo4j is configured with dbms.default_listen_address=0.0.0.0, so no problems there.
The question is how do I expose this page to the colleagues outside my network?
Using noip.com, I got a temporary domain mapped to my external IP.
I also configured the router to forward port 80.
But when the page's JavaScript gets loaded on an external client, it tries to connect to Neo4j from that client. When I put the external IP address in the "const driver ..." line, the connection does not work.
How do I make the connection to the DB from my server, while the queries to the DB come from the client that loaded the JavaScript?
Edit: Forgot to mention that I am also using the Apache web server (XAMPP) to serve the page to remote users.
A simple architecture that does what you want, and also mitigates the risk of opening up your database to everyone, uses an HTTP server + API that are accessible via your noip provider.
Your public-facing frontend (HTML + JavaScript for making API calls, etc.) makes the HTTP(S) calls to your publicly accessible API (for example a Node.js server), which in turn makes the database calls. Cypher and a direct database connection to Neo4j have no place in your users' browsers.
You can also use a starter like the GRANDstack.
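A minimal sketch of that split, assuming a Node.js/Express API running on the machine that can reach Neo4j (the /api/places route, the Place label, and the property names are placeholders):
// server.js - keeps the Bolt connection and the credentials on the server side
const express = require('express');
const neo4j = require('neo4j-driver');

const app = express();
const driver = neo4j.driver('bolt://localhost:7687', neo4j.auth.basic('neo4j', 'PASSWORD'));

app.get('/api/places', async (req, res) => {
  const session = driver.session();
  try {
    // placeholder query; return only what the map needs
    const result = await session.run('MATCH (p:Place) RETURN p.name AS name, p.lat AS lat, p.lon AS lon');
    res.json(result.records.map(r => ({ name: r.get('name'), lat: r.get('lat'), lon: r.get('lon') })));
  } finally {
    await session.close();
  }
});

app.listen(3000);
The page served by Apache then calls fetch('/api/places') and adds the results to the Leaflet map, instead of opening a Bolt connection from the visitor's browser.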
I'm new to WebSockets in general, but I get the main concept.
I am trying to build a simple multiplayer game and would like to have a server selection where I can run sockets on multiple IPs and connect clients through them, spreading the connections out in order to improve performance. This is hypothetical, for the case of there being thousands of players at once, but I would like some insight into how this would work and whether there are any resources I can use to integrate it beforehand, in order to prevent extra work at a later date. Is this at all possible? As I understand it, Node.js runs on a server and uses the Socket.io dependency to create sockets within that, so I can't think of a way to route it through another server unless I had multiple sites running it separately.
The first question I have is this:
Are you hosting on AWS or in a local datacenter?
The reason I ask is that Socket.io requires sticky sessions to work properly across multiple servers. Because Socket.io will attempt to upgrade each connection, and that upgrade request must reach the original server that authorized the session, you'll need to route WebSocket (TCP) connections back to that original server via sticky sessions. Unfortunately AWS makes this extremely tricky and will require you to learn how to:
A) Modify Elastic Load Balancer policies to forward protocol information
B) Split apart TCP connections from standard web requests using something like HAProxy or NGINX. This is necessary in order to handle WebSocket UPGRADE requests properly, as you will be setting TCP to sticky and web requests to round-robin.
C) Attach your Socket.io configuration to a common storage source, like Redis (ElastiCache).
Once you've figured out what's needed for AWS (or if you've got full control over request routing at your local datacenter), you'll want to architect your Socket.io application to use multicast rooms rather than direct socket messaging.
Example:
To send a message to users in game #4444, emit a message to room 'games:4444', rather than direct to the user's socket.
If your Socket.io instance is configured using Redis, Redis will automatically take care of maintaining the list of people who are connected to your 'games:4444' room. Otherwise you'll need to maintain the list yourself using a database or another shared mechanism.
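A minimal sketch of that room pattern, assuming the socket.io-redis adapter and a local Redis instance (the event names and payload shape are invented for illustration, and the adapter API differs between Socket.io versions):
// rooms + Redis adapter sketch: emits to a room fan out across all server instances
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');

io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

io.on('connection', (socket) => {
  // hypothetical event: the client announces which game it joined
  socket.on('join-game', (gameId) => {
    socket.join('games:' + gameId);                  // e.g. 'games:4444'
  });

  // broadcast to the whole room rather than to an individual socket id
  socket.on('move', ({ gameId, move }) => {
    io.to('games:' + gameId).emit('move', move);
  });
});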
Other than that, there are plenty of resources online that can help you figure out each step along the way. I'd start with understanding something like HAProxy and how it can help split your socket traffic apart from your regular web requests.
I am making a web-app using JavaScript. I plan to use Node.js to connect the app to an existing MySQL database.
First of all, will the Node code be written in the same .js file as my application? Or is it a separate file?
I need the data to be current at all times (even if you were to close the browser and re-open it, AND even in the event of the user not having a wifi connection), so my thought was to constantly update the local device's db and then to intermittently update the MySQL db. Is this the best strategy? If so, how exactly can Node talk to the offline db and MySQL?
First of all, will the Node code be written in the same .js file as my application? Or is it a separate file?
It is possible to keep your client side JavaScript in the same file as your server side JavaScript, but it doesn't make any sense to do so. They are separate programs. (Library files, on the other hand, are a different story).
so my thought was to constantly update the local device's db and then to intermittently update the MySQL db.
Working with a local database and syncing to a shared one is a common strategy. You do need to handle conflicting updates in a way that is sensible for your purposes though.
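A rough sketch of that idea on the client side, assuming updates are queued in localStorage and flushed to a hypothetical /api/sync endpoint whenever the browser is online (the timestamp is only a hint at last-write-wins conflict handling):
// client-side sketch: queue writes locally, flush them to the server when online
function queueUpdate(record) {
  const queue = JSON.parse(localStorage.getItem('pendingUpdates') || '[]');
  queue.push({ ...record, updatedAt: Date.now() });   // timestamp for conflict resolution
  localStorage.setItem('pendingUpdates', JSON.stringify(queue));
}

async function flushQueue() {
  if (!navigator.onLine) return;                      // still offline, try again later
  const queue = JSON.parse(localStorage.getItem('pendingUpdates') || '[]');
  if (queue.length === 0) return;
  const res = await fetch('/api/sync', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(queue),
  });
  if (res.ok) localStorage.setItem('pendingUpdates', '[]'); // server accepted the batch
}

window.addEventListener('online', flushQueue);
setInterval(flushQueue, 30000);                       // intermittent sync, as described above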
If so, how exactly can Node talk to the offline db and MySQL?
Node.js can't talk to the offline database, at least not directly.
You will have a web application running in the browser. It will use client side JavaScript with a client side database and some means of communicating with the server (often this is done by sending JSON over HTTP to and from a web service).
Then you will have a server side application running in Node.js. It will use server side JavaScript with a server side MySQL database and some means of communicating with the client (i.e. an HTTP server hosting a web service).
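The server side of that picture might look roughly like this, assuming Express and the mysql2 package and the same hypothetical /api/sync endpoint as above (table and column names are placeholders):
// server-side sketch: Node.js web service in front of the MySQL database
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
app.use(express.json());

const pool = mysql.createPool({
  host: 'localhost',
  user: 'app',
  password: 'secret',
  database: 'appdb',
});

// receives the batch of updates the client queued while offline
app.post('/api/sync', async (req, res) => {
  for (const record of req.body) {
    // naive last-write-wins: only apply the update if it is newer than what is stored
    await pool.query(
      'UPDATE items SET value = ?, updated_at = ? WHERE id = ? AND updated_at < ?',
      [record.value, record.updatedAt, record.id, record.updatedAt]
    );
  }
  res.sendStatus(204);
});

app.listen(3000);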
The case is quite straightforward (in my mind). The client has a native application running on their machine that produces pairs of values. What I am trying to accomplish is this: when this service is running and the user visits my web service, I want to be able to retrieve these pairs of values with JavaScript code in the client's browser. I haven't decided on an approach because I am not sure what kind of server should create the pairs or how to grab them with JS. I have tried using PubNub to set up a channel of communication, but the round trips are kind of slow.
Any suggestions?
You could run a local HTTP server (LAMP/WAMP) and make AJAX calls to it with JavaScript via the 127.0.0.1 or localhost address.
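For example, assuming the native application (or a small server running next to it) exposes the pairs at a hypothetical http://127.0.0.1:8080/pairs endpoint and allows CORS requests from your site:
// browser-side sketch: poll the locally running service for the latest value pairs
async function fetchPairs() {
  const res = await fetch('http://127.0.0.1:8080/pairs');   // placeholder port and path
  if (!res.ok) throw new Error('local service not reachable');
  return res.json();                                        // e.g. [{ key: 'x', value: 1.5 }, ...]
}

setInterval(async () => {
  try {
    const pairs = await fetchPairs();
    console.log('latest pairs from the local app:', pairs);
  } catch (err) {
    // the native application / local server is probably not running
  }
}, 1000);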