I am relatively new to programming and I have a general question about the relationship between the server and client sides of a React (or any other JS) app.
I have a MySQL DB with a table that I expose as an API (refreshed every n seconds) with Node.js/Express running on an AWS instance. That API is polled as JSON and displayed every n seconds by the React app.
In my head, the connection between SQL and Node.js is separate from the connection between Node.js and React. I think of it this way: the SQL database is only connected to one thing (the Node/Express server) and is therefore never under heavy load. The Node/Express server then exposes the SQL table through a few queries as 3-4 JSON payloads. Finally, let's say 100 people open my React app and pull those JSONs. So the only loaded part of the system is the Node/Express server.
Am I correct, or do I completely misunderstand how this works?
Thank you in advance!
Or do I completely misunderstand how this works?
It works the way you are going to make it work, and it seems you are on the right track.
The technique you are describing is called "caching", or at least a form of it, and it is a good way to take load off your database. Instead of piping every request that hits the Express server through to the database, you store the result of the first request in memory (e.g. in an object) on your Express server. The next request is served directly from memory, without asking the database.
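A minimal sketch of that idea, assuming an Express server and a hypothetical getRowsFromMysql() helper that runs your SQL queries (the endpoint name and refresh interval are illustrative):

const express = require('express');
const { getRowsFromMysql } = require('./db'); // hypothetical helper that queries MySQL

const app = express();

let cached = null; // the latest query result, held in memory

// Refresh the cache every n seconds instead of querying MySQL per request.
const REFRESH_MS = 5000; // assumption: n = 5
setInterval(async () => {
  cached = await getRowsFromMysql();
}, REFRESH_MS);

// 100 React clients polling this endpoint never touch MySQL directly.
app.get('/api/data', (req, res) => {
  res.json(cached);
});

app.listen(3000);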
Apart from polling, you could use other communication channels too; the same caching technique applies to avoid hammering the database. A Server-Sent Events sketch follows the list:
Server-Sent Events
WebSockets
Streaming (the HTTP request does not close immediately; the server continues to send data every n seconds)
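For example, a minimal Server-Sent Events sketch, reusing the Express app and cached variable from the caching sketch above (payload and interval are assumptions):

// SSE endpoint: the response stays open and data is pushed to the client.
app.get('/api/stream', (req, res) => {
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });

  // Push the cached rows every 5 seconds over the open connection.
  const timer = setInterval(() => {
    res.write('data: ' + JSON.stringify(cached) + '\n\n');
  }, 5000);

  req.on('close', () => clearInterval(timer));
});

// Browser side:
// new EventSource('/api/stream').onmessage = (e) => render(JSON.parse(e.data));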
Related
So currently I am studying web development, but the course was a little bit confusing. The teacher started by explaining the Promise object and fetch, then axios; after that he started talking about the "express" package for building the server side. I am asking myself: what is the difference between using those API objects and building a server side with Express?
Both of those things work together to create a website/app. axios etc. run in the browser (client-side code) and are used to send requests to the server (Express or other server-side code), which lets the server fetch data or modify the database and return a response to the browser-side code, which continues operating from there.
Remember, client-side code does not have access to your server's database: the database lives on the server, while the browser runs on the end user's computer. The two sets of code send messages back and forth to create an app with a centrally stored data store that is "shared" between users.
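A minimal sketch of that round trip (the route name, table, and db query helper are illustrative assumptions):

// Server side (Express): this code can reach the database.
app.get('/api/users', async (req, res) => {
  const users = await db.query('SELECT id, name FROM users'); // db: your server-side DB client
  res.json(users);
});

// Client side (browser, axios): this code can only send HTTP requests.
axios.get('/api/users').then((response) => {
  console.log(response.data); // continue operating with what the server returned
});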
So I have a simple vanilla frontend, no frameworks, because the site is so small. The site is a small web interface that lets me send dates to one database and load data into another.
My coworker on the project has installed a bash script on another server, which I have to run to start loading data into the new database. The script then writes a date to a file roughly every six seconds, and I need to display that date on the frontend.
The backend is in Java, and the frontend is pure HTML, CSS, and vanilla JS.
I have stumbled upon WatchService in Java, which sounds like the thing I need. The problem is: how do I send the data to the frontend when it changes?
I could hack around it with a setInterval in JS, but isn't there a more natural/dynamic way?
This is a fundamental problem that has many different solutions with different architectures. The simplest is polling, where the client keeps sending requests to the server at pre-set time intervals. Another is "long polling": the client sends a request, but the server doesn't reply until some event happens that the client needs to be notified of; the server holds that request until it needs it to notify the client, then the client immediately sends a new request, and so forth. Other solutions include push notifications and SSE (Server-Sent Events). So search the web for the terms mentioned here: polling, long polling, SSE, push notifications. This is not a full list.
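The backend in the question is Java, but here is a minimal long-polling sketch in Node/Express just to illustrate the idea (the route name and the notify hook are assumptions):

const waiters = []; // responses being held open until the next event

// The client sends a request; the server holds it until something happens.
app.get('/api/wait-for-change', (req, res) => {
  waiters.push(res);
});

// Call this from your file watcher whenever the date in the file changes.
function notifyClients(newDate) {
  while (waiters.length) {
    waiters.pop().json({ date: newDate });
  }
}

// Browser side: render the reply, then immediately reconnect.
// async function poll() {
//   const r = await fetch('/api/wait-for-change');
//   render(await r.json());
//   poll();
// }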
Use WebSockets:
socket.on('change', callback);
Hope it helps.
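For example, a minimal socket.io sketch (the 'change' event name is illustrative, and httpServer is your existing Node HTTP server):

// Server (Node + socket.io)
const io = require('socket.io')(httpServer);

function broadcastChange(data) {
  io.emit('change', data); // push the new data to every connected client
}

// Client (browser, with the socket.io client script loaded)
const socket = io();
socket.on('change', (data) => {
  render(data); // update the page as soon as the server pushes a change
});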
I have an app with a Node backend. Suppose there is an API endpoint (/get_menu) that returns all the menus.
When the app calls /get_menu, I call the external APIs of different restaurants, and once all of them have returned their menus I send the result to the app. I am using Promise.all for that.
Since some restaurants take a long time to respond, I want to return data to the app as it arrives from the restaurant APIs. For example, if two of them return their menus instantly, I return those two right away, and then keep sending data to the app as the other APIs respond.
What are some good ways to do that with a single API endpoint, i.e. /get_menu?
There are a couple of solutions for this.

To give you a literal answer to what you asked: you can use some implementation of sockets, like socket.io, to send data to your client any time you want.

The second option is to persist the restaurants' menus in a database and serve the user from there, updating the database periodically with a cron job in the background. It's not really advisable to keep your user waiting until all of your menu APIs have resolved.
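If you want to keep a single /get_menu endpoint, you could also stream partial results: write each menu to the response as its promise resolves instead of waiting for Promise.all. A minimal sketch using newline-delimited JSON (restaurants and fetchMenu are hypothetical helpers):

app.get('/get_menu', (req, res) => {
  res.set('Content-Type', 'application/x-ndjson');

  // restaurants: assumed array like [{ name, url }];
  // fetchMenu: assumed helper returning a promise for one restaurant's menu.
  const pending = restaurants.map((r) =>
    fetchMenu(r)
      .then((menu) => res.write(JSON.stringify({ restaurant: r.name, menu }) + '\n'))
      .catch(() => res.write(JSON.stringify({ restaurant: r.name, error: true }) + '\n'))
  );

  // End the response once every restaurant has answered (or failed).
  Promise.all(pending).then(() => res.end());
});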
I would suggest using a Redis-based queue for this. You can push each API call as a job onto the queue.
There is a very good Redis-based queue npm package, Bull. It emits events when a job finishes processing, so you can send the data back to the app as each job finishes.
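A rough sketch with Bull (the queue name, fetchMenu, and the sendToApp push step are assumptions):

const Queue = require('bull');
const menuQueue = new Queue('menu-fetch', 'redis://127.0.0.1:6379');

// Worker: one job per restaurant API call.
menuQueue.process(async (job) => {
  return fetchMenu(job.data.restaurant); // hypothetical menu-fetching helper
});

// Fires as each job finishes; push that menu to the app right away
// (e.g. over a socket) instead of waiting for all of them.
menuQueue.on('completed', (job, menu) => {
  sendToApp(job.data.restaurant, menu); // hypothetical push to the client
});

// When /get_menu is called, enqueue one job per restaurant.
restaurants.forEach((restaurant) => menuQueue.add({ restaurant }));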
Implement a web worker in the client-side JavaScript that uses a UUID (a unique ID to ask the server to fetch the list of menus).
The server then creates a database entry, assigning each of the remote API requests a unique ID as well. Node.js sends the list of request IDs back to the client immediately, and the page renders. A background task fetches the menus separately; each time a result comes in, it is stored in the database. The client web worker periodically requests the list of UUIDs it still needs, and the server responds with the ones that have come back. The data is deleted once it has been sent to the client, and the client renders the results and removes that UUID from its list of requests to retrieve data for.
It's essentially a crude, unreliable result queue that runs safely as long as your server can keep up with all the requests.
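A rough sketch of that client-side polling loop (the URLs and response shape are assumptions):

// Inside the client-side web worker (or any async context).
async function pollForMenus() {
  // Ask the server to start fetching; it replies with the request UUIDs.
  const { ids } = await (await fetch('/menus/start', { method: 'POST' })).json();
  const pending = new Set(ids);

  const timer = setInterval(async () => {
    // Ask only for the UUIDs that have not come back yet.
    const res = await fetch('/menus/results?ids=' + [...pending].join(','));
    for (const { id, menu } of await res.json()) { // assumed shape: [{ id, menu }]
      postMessage(menu); // hand the finished menu to the page
      pending.delete(id); // remove that UUID from the list of open requests
    }
    if (pending.size === 0) clearInterval(timer);
  }, 2000);
}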
Also look up 'EventSource' (Server-Sent Events) on the internet.
A more modern version of this involves a similar implementation using WebSockets and a WebSocket server, if your host is very reliable.
I am working on a home automation hub -- a Raspberry Pi running locally that displays weather info, controls my lights, etc. It is "networked" (and I use that term loosely) to a website via a shared MongoDB. Both the site and the hub are running Node.js/Express servers.
Essentially, I am looking to be able to enter text into a field on my website and then display it on my hub.
I'm struggling to figure out how to pass data between them. I can think of a couple of ways that might get it done, but the only one I know I could get working is to implement some sort of Mongo watcher/listener that watches a specific collection for changes. Essentially: you enter the text on the site, which updates the document in Mongo; the watcher informs the locally running hub, which then fetches and displays the new content.
This seems hacky. Is there a better way? Is this something socket.io could manage? Maybe I'm overthinking it? Help!
You can use Socket.io, a raw WebSocket, or a plain TCP socket to connect the two servers and communicate that way. Or you can use a queue system like ZeroMQ or RabbitMQ. Or you can simply make an HTTP request from one server to the other every time you want it to grab new data; you could even send that data right in the request.
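The plain-HTTP option can be as small as this (the hub's address, the routes, and updateDisplay are assumptions; both sides assume app.use(express.json())):

const axios = require('axios');

// On the website's server: forward the submitted text straight to the hub.
app.post('/text', async (req, res) => {
  await axios.post('http://raspberry-pi.local:3000/display', { text: req.body.text });
  res.sendStatus(200);
});

// On the hub (Raspberry Pi): receive the text and show it.
app.post('/display', (req, res) => {
  updateDisplay(req.body.text); // hypothetical function that renders the text
  res.sendStatus(200);
});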
It would be much easier if you used Redis, which supports pub/sub; see:
https://redis.io/topics/pubsub
or CouchDB, which supports the changes feed:
http://docs.couchdb.org/en/2.0.0/api/database/changes.html
or RethinkDB, which supports changefeeds:
https://rethinkdb.com/docs/changefeeds/javascript/
As far as I know, Mongo doesn't support anything like that out of the box.
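A minimal Redis pub/sub sketch with the node redis client (v3-style API; the channel name is illustrative):

const redis = require('redis');

// On the website's server: publish whenever the text changes.
const pub = redis.createClient();
pub.publish('hub:text', JSON.stringify({ text: 'hello hub' }));

// On the hub: subscribe and react to changes as they arrive.
const sub = redis.createClient();
sub.subscribe('hub:text');
sub.on('message', (channel, message) => {
  updateDisplay(JSON.parse(message).text); // hypothetical display update
});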
I recently started to play around with Node.js; all I know is that it's a server-side technology. Here is what I did and what I want to accomplish:
I have MongoDB running on a remote server. Using the Node.js MongoDB driver, I can connect to the database and, say, create a document:
// main.js
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://remote_url:27017/mymongo', function (err, db) {
  if (err) throw err;
  var document = { a: '1', b: '2' };
  db.collection('collection').insert(document, function (err, records) {
    if (err) throw err;
  });
});
As you know, the code above has to be run from the console with node main.js. However, I have an HTML5 frontend with several text fields, and I want to send those field values to my database on a simple button-click event. My questions are:
Is it really stupid to connect directly to the remote MongoDB as above? Can I call the script from my HTML page? If I can, what are the drawbacks compared to redesigning it into a client-server structure?
Finally, I think the right way to accomplish the above is to create an HTTP server with Node.js on the remote server that passes the client's requests to the MongoDB driver. Am I right?
You could try building a REST API to interact with the MongoDB server(s) using vanilla NodeJS or your choice of quite a few additional frameworks. You might try baucis*.
*Disclaimer: I am the main author of baucis
Is it really stupid to connect directly to the remote MongoDB as above?
No, using the native MongoDB driver is pretty standard. However, instead of connecting and then immediately interacting with the database, I'd structure the application so that it connects first and then waits for HTTP calls (or some other trigger) to interact with the database.
Can I call the script from my HTML page?
Absolutely. In your Node.js application, I would build in a web server that listens for certain HTTP calls. In your HTML, provide links or forms that GET, POST, etc. to the web server your application is listening on.
For example, your front end might contain something like <form action="/document/add" method="post">. Then keep main.js as your back-end code running on Node.js, but modify it to listen for a POST to /document/add. When a call to that URL comes in, run the insert code with the POSTed form data.
You could also use an AJAX solution that listens for the form submission, submits the POST in the background, waits for the response, and updates the page accordingly.
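A minimal sketch of that back-end route, assuming Express and that db is the connected handle from the code above:

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: false })); // parse POSTed form fields

app.post('/document/add', (req, res) => {
  // req.body holds the submitted form fields
  db.collection('collection').insert(req.body, function (err, records) {
    if (err) return res.status(500).send(err.message);
    res.redirect('/'); // or res.json(records) for the AJAX variant
  });
});

app.listen(3000);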
What are the drawbacks compared to redesigning it into a client-server structure?
Advantages and drawbacks are going to be very specific to the type of application you want to create.
I think the right way to accomplish the above is to create an HTTP server with Node.js on the remote server that passes the client's requests to the MongoDB driver. Am I right?
You are correct; that is pretty standard practice for using Node.js with MongoDB. I recommend Express for building your API to interface with the database: it provides features such as URL routing right out of the box. However, you can build your own platform, or use any other one that works for your application/environment.
You should use a REST interface for MongoDB. I like sleepy.mongoose a lot. It runs on Python and can take a minute to set up, but it is well worth the effort.
Here is a blog by the author which helped get me started.
Here is a demo app deployed to Heroku: http://nodejs-crud.herokuapp.com/, and here is the tutorial: http://codeforbrowser.com/blog/crud-operations-in-node-js-and-mongodb/