Updating Messages with Browser Pulling Messages from Server - javascript

I am tasked with creating a web page (think Twitter) that updates when new messages are added to the database. When a message is removed from the database, it must also be removed from the client. It is possible that multiple clients are viewing the same messages at the same time. Other actions can occur, such as a stop command issued on the server; once this happens, all the messages stop being shown.
What I am looking for is an architecture for solving this problem.
Technologies that I am using are .NET 4.5, ASP.NET MVC and KnockoutJS. Node.js could be used, but I'd need to know the benefit of using Node.js over SignalR.
My current implementation uses a JavaScript timer that polls the server every 30 seconds for new messages. It works, but the polling feels dirty.

Can't comment on ASP.NET, but I have used Node.js together with Knockout for this. I have used both WebSockets (via the socket.io library) and Server-Sent Events (SSE) to push updates to the client model.
Sounds like SSE would be a good fit in this case. The key is that whatever database technology you use should be able to emit change events to your Node middleware, so that you can forward them to the browser.
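If you go the SSE route, a minimal sketch of the server side might look like this (assuming Node.js with Express; messageEvents is a hypothetical EventEmitter standing in for whatever change feed your database exposes):

const express = require('express');
const EventEmitter = require('events');

const app = express();
const messageEvents = new EventEmitter(); // hypothetical stand-in for your DB change feed

app.get('/messages/stream', (req, res) => {
  // Standard SSE headers: keep the connection open and stream events
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive'
  });
  res.flushHeaders();

  const onChange = (msg) => {
    // Each SSE frame is "data: <payload>" followed by a blank line
    res.write(`data: ${JSON.stringify(msg)}\n\n`);
  };
  messageEvents.on('change', onChange);

  // Stop pushing to this client once it disconnects
  req.on('close', () => messageEvents.off('change', onChange));
});

app.listen(3000);

On the browser side, an EventSource pointed at /messages/stream receives each event and can push it into a Knockout observable array.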

After more research, the polling method is the optimal solution for the technologies involved.
The crux of the problem is that there is no notification of a new message that would prompt a change in the system. Currently, a new message is received when it is committed to the database. SQL Server does not have a notification mechanism (this is not 100 percent true, but it is not a dependency I wish to take on). In the long run, the optimal system would implement a publish/subscribe model using SignalR or Node.js, which would deliver real-time messages to the client. For that to happen, it would require a complete re-architecture of the application.

Related

How to continuously send data from backend to frontend when something changes

So I have a simple vanilla frontend, no frameworks because the site is so small. The site is a small web interface so I can send dates to a database and load data into another database.
My coworker on the project has installed a bash script on another server, which I have to run to start loading data into a new database. The script then writes to a file about every six seconds with a date I need to display on the frontend.
The backend is in java, and the frontend is pure html, css and vanilla js.
I have stumbled upon WatchService in Java, which sounds like the thing I need. The problem is how do I send the data to the frontend when it changes?
I could make a hack around it, with a setInterval in js, but isn't there a more natural/dynamic way?
This is a fundamental problem that has many different solutions with different architectures. The simplest one is polling, where the client keeps sending requests to the server at pre-set intervals. Another is long polling: the client sends a request to the server, but the server doesn't reply until some event happens that the client needs to be notified of, so the server holds that request until it needs to use it to notify the client; then the client sends a new request, and so forth. Another solution is push notifications, and yet another is SSE (Server-Sent Events). So just search the web on the terms mentioned here: polling, long polling, SSE, push notifications. This is not a full list.
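As a concrete illustration of the SSE option with a plain vanilla-JS frontend, here is a minimal client sketch (the /updates endpoint is an assumption; your Java backend would need to serve it with Content-Type: text/event-stream):

// Browser-side SSE sketch (vanilla JS, no frameworks)
const source = new EventSource('/updates');

source.onmessage = (event) => {
  // event.data holds whatever the backend wrote, e.g. the latest date
  document.querySelector('#latest-date').textContent = event.data;
};

source.onerror = () => {
  // The browser reconnects automatically; log so failures are visible
  console.warn('SSE connection lost, retrying...');
};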
Use WebSockets:
socket.on('change', callback);
Hope it helps.

Continue processing in backend if request is interrupted

Basically I'm working on an application that posts huge amounts of data to a web API (a third-party API).
I am working in Node.js to connect to an MS SQL server, fetch data, process it and post it in the desired format to the web API.
Scenario is: in Node.js, I have a script that does the above job. It is currently initiated (or should I say triggered?) by a button on a web page using an axios POST with all the necessary parameters. E.g. below:
axios.post('/api/v1/fetch-new-labors', {
  startDate: 'somedate',
  endDate: 'someDate'
}).then(response => {
  // handles further processing & posting to the API
});
The process takes around 2-3 mins to finish. Meanwhile, if the page is refreshed, it obviously needs to restart the whole process by clicking the button.
Question: I am sure there is a way to let the process run (on the server side, I presume) that takes care of fetching, processing and posting even if the client-side page is refreshed/reloaded, and at the same time keep the client side informed with a progress bar or an "x out of y records posted" kind of thing. I thought of web sockets, but I wasn't sure if there's a preferred way to achieve this. I'm not looking for the whole code/process; I am looking for someone to guide me towards the overall concept/idea.
tl;dr: Long-running jobs usually avoid using the traditional request-response cycle, opting instead for some variation of the Pub/Sub pattern described below:
Accept request, then start processing
You should respond to the user immediately with an HTTP 202: Accepted, signalling you accepted the request, then start the processing.
You can perform some initial checks before responding (Does the user have other jobs? Does the request pass basic validation checks?) but you should not start processing the actual long-running job before responding.
You can use a simple HTTP request to create jobs.
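A minimal sketch of that accept-then-process flow in Node.js with Express might look like the following (runLaborSync is a hypothetical function doing the 2-3 minute work, and the in-memory jobs map is only for illustration):

const express = require('express');
const crypto = require('crypto');

const app = express();
app.use(express.json());

const jobs = new Map(); // jobId -> { status, progress }

app.post('/api/v1/fetch-new-labors', (req, res) => {
  const { startDate, endDate } = req.body;
  if (!startDate || !endDate) {
    return res.status(400).json({ error: 'startDate and endDate are required' });
  }

  const jobId = crypto.randomUUID(); // Node 16+
  jobs.set(jobId, { status: 'running', progress: 0 });

  // Respond immediately with 202 Accepted, then keep working in the background.
  res.status(202).json({ jobId });

  // runLaborSync is hypothetical: fetch from MS SQL, process, post to the web API,
  // reporting progress through the callback.
  runLaborSync(startDate, endDate, (progress) => {
    jobs.set(jobId, { status: 'running', progress });
  }).then(() => jobs.set(jobId, { status: 'done', progress: 100 }));
});

app.listen(3000);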
Push status updates from server to subscribed clients
On page load the client subscribes to updates from the server.
Using WebSockets, push server-to-client status notifications regarding the progress. Don't forget to also handle and display errors to the client.
At this point, you'll probably need a way to uniquely identify each client across refreshes. You can easily solve this by storing a UUID via LocalStorage when a user first visits your app/website. If your app requires logins, then you can use the logged-in user's ID instead.
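On the client, a sketch of this could look like the following (the subscribe and job:progress event names are assumptions, not a fixed API):

// Client-side sketch: stable client ID in localStorage plus a socket.io
// subscription for progress updates.
const socket = io();

let clientId = localStorage.getItem('clientId');
if (!clientId) {
  clientId = crypto.randomUUID(); // available in modern browsers (secure contexts)
  localStorage.setItem('clientId', clientId);
}

// Tell the server who we are, so it can route this client's job updates here.
socket.emit('subscribe', { clientId });

socket.on('job:progress', ({ done, total }) => {
  document.querySelector('#progress').textContent = `${done} of ${total} records posted`;
});

socket.on('job:error', (err) => {
  console.error('Job failed:', err.message);
});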
Check if user has already running jobs before accepting a new one
When the user refreshes, you can send an initial message via WebSockets again, notifying the user whether they have any running jobs and what their progress is.
Based on your OP, I think you might want to disable the "Create Job" button if there's a pending job.
You can use other mechanisms for bidirectional server-client communication (such as long-polling/Server-sent Events) if you want, although WebSockets should be the most straightforward and flexible solution.
I'd personally go for a batteries-included WebSocket library such as socket.io.

Send data from web to a local server

I am working on a home automation hub -- a Raspberry Pi running locally that displays weather info, controls my lights, etc. It is "networked" (and I use that term loosely) to a website via a shared MongoDB. Both the site and the hub are running Node.js/Express servers.
Essentially, I am looking to be able to enter text into a field on my website and then display it on my hub.
I'm struggling to figure out how to pass data between them. I can think of a couple ways that might get it done, but the only way I know I could get working is to implement some sort of Mongo watcher/listener to watch for changes on a specific collection. Essentially, you enter the text into the site, that updates the document in Mongo, the watcher informs the locally-running hub, which then fetches and displays the new content.
This seems hacky. Is there a better way? Is this something socket.io could manage? Maybe I'm overthinking it? Help!
You can use Socket.io, WebSocket or a TCP socket to connect the two servers together and communicate that way. Or you can use a queue system like ZeroMQ or RabbitMQ and communicate that way. Or you can even make an HTTP request from one server to the other every time you want it to grab new data - or you could even send that data right in the request.
It would be much easier if you used Redis that supports pub/sub, see:
https://redis.io/topics/pubsub
or CouchDB that supports the changes feed:
http://docs.couchdb.org/en/2.0.0/api/database/changes.html
or RethinkDB that supports changefeeds:
https://rethinkdb.com/docs/changefeeds/javascript/
I don't think Mongo supports anything like that.
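For example, a rough sketch of the Redis pub/sub option with the node redis client (the channel name is just an assumption):

const { createClient } = require('redis');

// On the website server: publish whenever the text field is saved.
async function publishMessage(text) {
  const pub = createClient();
  await pub.connect();
  await pub.publish('hub:display', text);
  await pub.quit();
}

// On the Raspberry Pi hub: subscribe and update the display.
async function listenForMessages() {
  const sub = createClient(); // dedicated connection for subscribing
  await sub.connect();
  await sub.subscribe('hub:display', (text) => {
    console.log('New text to display:', text);
  });
}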

Node.js chat without Socket.IO

I just started learning Node.js, and as I was learning about the fs.watchFile() method, I was wondering if a chat website could be efficiently built with it (and fs.writeFile()), as opposed to, for example, Socket.IO, which is stable, but I believe not 100% stable (several fallbacks, including Flash).
fs.watchFile() could perhaps also be used to keep histories of the chats quite simply (as JSON would be used on the spot).
The chat files could be formatted in JSON in such a way that only the last chatter's message is brought up to the DOM (or whatever makes it efficient to 'fetch' messages when the file gets updated).
I haven't tried it yet as I still need to learn more about Node, and even more to be able to compare it with Socket.IO, but what's your opinion about it? Could it be an efficient/stable way of doing chats?
fs.watchFile() can be used to watch changes to the file in the local filesystem (on the server). This will not solve your need to update all clients chat messages in their browsers. You'll still need web sockets, AJAX or Flash for that (or socket.io, which handles all of those).
What you could typically do in the client is to try to use WebSockets. If the browser does not support them, try to use XMLHttpRequest. If that fails, fall back to Flash. It's a lot of programming to do, and it has to be handled by the Node.js server as well. Socket.io does that for you.
Also, socket.io is pretty stable. The fallback to Flash is not due to its instability but due to lack of browser support for better solutions (like WebSockets).
Storing chat files in flat-file JSON is not a good idea, because if you are going to manipulate the files, you would have to parse and serialize entire JSON objects, which would become very slow as the size of the JSON object increased. The watch methods of the filesystem module also don't work on all operating systems.
You also can't compare Node.js to Socket.IO because they are entirely different things. Socket.IO is a Node module for realtime transport between the browser and the server. What you need depends on what you're doing. If you need chat history, then you should be using a database such as MongoDB or MySQL. Watching files for changes is not an efficient approach; you should just send messages as they are received.
In conclusion: no, using fs.watchFile() and fs.writeFile() is a very bad idea, because race conditions would occur due to concurrent file writes; besides that, fs.watchFile() uses polling to check whether a file has changed. You should instead use Socket.IO and push messages to other clients / store them in a database as they are received.
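A minimal sketch of that approach with Socket.IO might look like this (saveMessage is a hypothetical stand-in for your database write):

const { Server } = require('socket.io');

const io = new Server(3000);

io.on('connection', (socket) => {
  socket.on('chat message', async (msg) => {
    await saveMessage(msg);        // persist chat history in a real database
    io.emit('chat message', msg);  // push the message to every connected client
  });
});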
You can also use the long polling method, using JavaScript setTimeout and setInterval.
Basically, long polling works on the Ajax request/response cycle: the server responds only after a certain time (say, up to 50 seconds) if there is no notification or message; otherwise it responds immediately with data. On the client side, as soon as the client gets a response, its JavaScript makes another request for the next update and waits for the response. This process repeats endlessly for as long as the server is running.
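A rough client-side sketch of that long-polling loop (assuming a hypothetical /poll endpoint that the server holds open until a message arrives or about 50 seconds pass):

async function poll() {
  try {
    const res = await fetch('/poll');   // server holds this request open
    if (res.ok) {
      const data = await res.json();
      if (data.message) {
        showMessage(data.message);      // hypothetical UI update
      }
    }
    poll();                             // immediately wait for the next update
  } catch (err) {
    console.warn('Poll failed, retrying shortly', err);
    setTimeout(poll, 5000);             // back off briefly on errors
  }
}
poll();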

Advice on which technology to use for real time notifications

I have X amount of activity sensors connected to a server that inserts data into a database every time a sensor is triggered. What I'm trying to do is create a web interface with a blueprint of the facility (SVG) and, whenever a sensor is triggered, besides the DB insert, I want it to show some sort of alert on my blueprint. For that I think I need to keep an open connection to the server.
I was thinking of using web sockets, but it might be overkill since I only need to retrieve data from the server. But running an ajax call every second doesn't sound very efficient either. Are there any other alternatives?
Thank you
Some potential choices include:
WebSocket
Adobe® Flash® Socket
AJAX long polling
AJAX multipart streaming
Forever Iframe
JSONP Polling
Which actual transport you end up using will depend on your requirements for browser support and what technology you are using on the server to handle these requests. The transport choice may also depend on your network topology - what types of load balancers you need to integrate with, proxies, etc.
There are many libraries available on both the client and server sides, many of which support more than one of these transports.
For example (not an exhaustive list):
socket.io for nodejs
WebSocket
Adobe® Flash® Socket
AJAX long polling
AJAX multipart streaming
Forever Iframe
JSONP Polling
SignalR for an asp/.net backend
WebSockets
Server-Sent Events
ForeverFrame
Long Polling
Atmosphere for a java backend
WebSockets
Server Side Events (SSE)
Long-Polling
Forever frame
JSONP
IMO - WebSockets are NOT overkill for this type of problem and would lend themselves nicely to this type of application.
Without specifically discussing frameworks or knowing what is running in the backend of your server(s), we have a few options to consider for the frontend:
Websockets
Websockets are designed for bidirectional communication, although it is kind of shocking how many users are surfing the web in a browser that doesn't support websockets. I always recommend a fallback for this, such as the other methods listed below.
SSE
SSE is an HTML5 spec and is still shaky at best. Try scrolling on a page while an SSE event fires... It may be a little easier on the backend, but it sometimes hangs on the client side since it runs inside the same thread that the DOM is running in.
Long Polling
Keeps your connection open. It doesn't scale well with PHP, but performs swimmingly with Python+Twisted or Node.js on the backend.
Good Old Ajax
Keep your requests small, and you still have a scalable solution. Yes, a full GET request is the most expensive, but it is supported in just about every browser rolled out in the past ten years. It is also worth noting that GET requests are easy to scale horizontally with more hardware.
In a perfect world:
You would break up your application into a few components, operating behind a reverse proxy such as Nginx, then use Node.js + Socket.IO to handle the realtime aspects of your app.
Another option would be to use small Ajax requests, and offer websocket support for the browsers that support it. This is advice specifically for PHP in the backend.
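A rough sketch of that "WebSocket where supported, Ajax otherwise" idea on the frontend (the endpoints are assumptions):

// Feature-detection sketch: use a WebSocket when the browser supports it,
// otherwise fall back to small periodic Ajax requests.
function startUpdates(onData) {
  if ('WebSocket' in window) {
    const ws = new WebSocket('wss://example.com/updates');
    ws.onmessage = (event) => onData(JSON.parse(event.data));
  } else {
    setInterval(async () => {
      const res = await fetch('/updates/latest');
      onData(await res.json());
    }, 5000);
  }
}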
WebSocket is certainly not overkill. On the contrary. With websockets, you have a bi-directional communication channel; this means, that the server can initiate communication whenever it seems fit (e.g. when sensor data changes).
In a previous project, I have used node.js together with socket.io, to monitor 50+ sensors. Data was updated in real-time in a browser. The data was visualized using smoothie.js.
Whenever a sensor value was updated, it was communicated to the browser. Some sensors only updated once a minute, others once a second, ...
Polling would have been overkill, because it would retrieve all data for all sensors, even those that had not been updated yet.
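A small sketch of that "push only when a value changes" idea (sensorBus is a hypothetical EventEmitter fed by the sensor/database layer, and io is an already-created Socket.IO server):

const lastValues = new Map();

sensorBus.on('reading', ({ sensorId, value }) => {
  if (lastValues.get(sensorId) !== value) {
    lastValues.set(sensorId, value);
    io.emit('sensor:update', { sensorId, value }); // browsers update the SVG blueprint
  }
});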
I had a similar problem and did a lot of research on this. As I understand it, there are three main options:
Short polling: Have an endpoint that your javascript client pings every second. This is the worst option, because the pings add up to one second of latency to your communication, and depending on how you implement it, the endpoint could query the database every second, adding unnecessary overhead.
Long polling: Have an endpoint that your javascript client pings that holds the connection until a) the event occurs or b) the connection times out. If the endpoint returns a response, the client gets the event information. If the endpoint does not return a response, no event has occurred, and the client sends a new request. This is a good option because the events can immediately trigger the response to the client, assuming you have an asynchronous interprocess communication layer (like 0MQ) to send the message without any sort of polling.
Websocket: Have your javascript client connect to a websocket server, which will send a message to your client immediately upon the event trigger.
I think a websocket is your best option, because it accommodates immediate communication of the event without all the request/response overhead. And most importantly, this is exactly what websockets are designed to do! As such, you will probably have to write the least amount of custom code with this solution.
There are two great commercial services that might work for you.
Firebase - a JavaScript hierarchical database and realtime messaging/synchronization platform; uses websockets and has other fallbacks
PubNub - a real-time message passing and queue system; uses websockets
