realtime with non-event programming - javascript

I am currently trying to make a browser plugin using Crossrider that will sync bookmarks, but as there is no "on new bookmark" event, how do I upload new bookmarks to the server and update the bookmark lists on the other connected machines in realtime? I would be inclined to do this with a WebSocket, but as I said, nothing here is event based. So do I poll all the clients every n seconds? That seems like a lot of data being moved, and it seems taxing on the server, as the process would presumably involve requesting all of each client's bookmarks every n seconds, comparing them to the client's logged bookmarks and, where they differ, updating the client browser's bookmarks. So in short, what is the best solution to this conundrum?

Given the scenario you describe, I think that polling is the best solution.
If I correctly understand the intention of your extension, the one thing I would do differently is how you compare the bookmarks. In your solution, you mention sending the bookmarks data to the server, comparing, and sending data back to the client to update the browser bookmarks.
I think a better, more efficient solution would be to compare the bookmarks on the client side and send only the new bookmarks to the server. You can store a snapshot of the bookmarks for the comparison using our local database appAPI.db.async API, and use our appAPI.request API to send the new bookmarks to your server.
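A minimal sketch of that client-side comparison, assuming a periodic check inside the extension. getCurrentBookmarks() is a hypothetical helper standing in for however you enumerate the browser's bookmarks, and the appAPI.db.async / appAPI.request calls are written as I recall them from the Crossrider docs, so double-check the signatures against the API reference:

```javascript
// Sketch only: getCurrentBookmarks() is a hypothetical helper, and the sync
// endpoint is a placeholder. Verify the appAPI signatures before using.
appAPI.ready(function($) {
  setInterval(function() {
    var current = getCurrentBookmarks(); // e.g. [{url: '...', title: '...'}, ...]

    appAPI.db.async.get('bookmarksSnapshot', function(snapshot) {
      snapshot = snapshot || [];
      var known = {};
      snapshot.forEach(function(b) { known[b.url] = true; });

      // Only bookmarks we have not seen before get sent to the server.
      var newBookmarks = current.filter(function(b) { return !known[b.url]; });
      if (newBookmarks.length === 0) { return; }

      appAPI.request.post({
        url: 'https://example.com/api/bookmarks', // your sync endpoint
        postData: { bookmarks: newBookmarks },
        onSuccess: function() {
          // Update the snapshot only once the server has the new bookmarks.
          appAPI.db.async.set('bookmarksSnapshot', current);
        }
      });
    });
  }, 60 * 1000); // poll every minute; tune n to taste
});
```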
We at Crossrider are here to help you with any issues or questions you might have. Hence, if you need any assistance, please do not hesitate to contact our support (support#crossrider.com).
Disclaimer: I am a Crossrider employee.

Related

AngularJS and MySQL real-time communication

I have built a web application using AngularJS (front-end) and PHP/MySQL (back-end).
I was wondering if there is a way to "watch" the MySQL database (without Node.js), so if one user adds some data to it, the changes are synced to other users too.
E.g. I know Firebase does that, but it's an object-oriented database and I can't do the advanced queries there that I do with SQL.
I was thinking of using $interval and $http to do AJAX requests, so that way I could detect changes in the database. That's possible, but it would mean thousands of HTTP requests to the server every day, plus interpreting PHP on each request.
I believe nothing is impossible, I just need an idea of how to do this, which I don't have, so that's why I am asking for help here.
If you want a form of "real-time communication" you'll likely have to incorporate some form of long-polling from the client (unless you use WebSockets, but that's a big topic covering a bunch of different things). You're right to be concerned about bandwidth and demand on the DB, though. So here's my suggestion:
If you don't have experience with WebSockets, then log your events in a separate table/view and use the pub/sub method: subscribe entities to an event and broadcast that event to the table. Then long-poll against the watcher view to see whether changes may have occurred; if one did occur, query for the exact values.
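A minimal AngularJS sketch of that long-polling watcher, assuming an Angular module named app and two placeholder endpoints of my own invention (/api/last-event-id and /api/changes):

```javascript
// Poll a lightweight "watcher" endpoint; only fetch the full data once the
// last event id has advanced. Module and endpoint names are placeholders.
app.factory('changeWatcher', function ($http, $interval) {
  var lastEventId = 0;

  function check(onChanges) {
    $http.get('/api/last-event-id').then(function (res) {
      var latest = res.data.id;
      if (latest > lastEventId) {
        // Something changed: now query for the exact values we missed.
        $http.get('/api/changes', { params: { since: lastEventId } })
          .then(function (res) {
            lastEventId = latest;
            onChanges(res.data);
          });
      }
    });
  }

  return {
    // Returns the $interval promise so the caller can cancel it later.
    start: function (onChanges, intervalMs) {
      return $interval(function () { check(onChanges); }, intervalMs || 5000);
    }
  };
});
```

The first request stays tiny (a single id), so most polls cost almost nothing; the heavier query only runs when the watcher view actually reports a change.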
Another option would be to use a queue system with "deciders" that hold messages. Take a look at Amazon's SQS platform for a better explanation of how this could work. Basically you have a queue that holds messages, and a decider chooses where to store each message using some hash or sorting method (to reduce run time). When a client requests an update, the decider finds any messages that apply based on the hash/sort and returns them. Then you just have to decide how and when to delete the messages.
The second option would require a lot more tinkering though, so it's really about your preference. I think the difficulty you'll find is that most solutions have to deal with the fact that a message has to be delivered one or more times, so you'll need to track when someone has received the message and whether it can then be deleted from the queue/event table or whether you still need to wait. Otherwise you'll consume a lot of memory.
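To make that delivery-tracking point concrete, here is a toy in-memory sketch; it only illustrates the bookkeeping idea, it is not SQS or a production queue:

```javascript
// Each message remembers which subscribers still need it; a message is only
// dropped once every subscriber has received it, which keeps memory bounded.
function MessageQueue(subscriberIds) {
  this.subscriberIds = subscriberIds;
  this.messages = [];
  this.nextId = 1;
}

MessageQueue.prototype.publish = function (payload) {
  this.messages.push({
    id: this.nextId++,
    payload: payload,
    pending: new Set(this.subscriberIds) // who has not seen this message yet
  });
};

// A subscriber polls for its updates; delivered messages are marked as seen,
// and fully delivered messages are removed from the queue.
MessageQueue.prototype.poll = function (subscriberId) {
  var delivered = [];
  this.messages.forEach(function (msg) {
    if (msg.pending.has(subscriberId)) {
      delivered.push(msg.payload);
      msg.pending.delete(subscriberId);
    }
  });
  this.messages = this.messages.filter(function (msg) { return msg.pending.size > 0; });
  return delivered;
};

// Usage: var q = new MessageQueue(['alice', 'bob']);
//        q.publish({ table: 'orders', id: 42 });
//        q.poll('alice'); // -> [{ table: 'orders', id: 42 }]
```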

Node.js/Socket.io realtime webpage push updates

I am looking to implement/add realtime push notification updates from node.js server to browser (client).
I looked into socket.io (http://socket.io/docs/rooms-and-namespaces/)
The business requirement is: users will visit a page displaying customer info and their orders. There will be ~10,000 users visiting the page at any given time (all 10,000 could be for different customers, or sometimes a user may have opened the same page in 2 or 3 tabs).
When orders flow into Elasticsearch (my datastore) for a customer, I want to push a notification to the users who have that customer's page open.
Questions:
Is socket.io the correct framework for this case?
Am I correct in understanding that I have to use socket.io's rooms functionality to implement this? (each room identifier equals a customer ID?)
Is this implementation scalable and would it be memory intensive for 10k users on node.js server?
Thanks!
Yes, but you could consider SockJS as well and write your own simple back-end.
Yes, it's the easiest way if you need authentication (see the sketch below).
Worst case, you'd need to cluster your socket.io servers and use a back-end adapter. Redis should be fast enough for 10,000 connections.
EDIT: memory will depend on your specific implementation.
Also consider https://github.com/Automattic/socket.io/issues/1393
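As a minimal sketch of the rooms approach, with the event names, the room-key format and the notifyOrder hook being placeholders of mine rather than anything prescribed by socket.io:

```javascript
// server.js: one room per customer; browsers join the room for the customer
// page they are viewing, and new orders are emitted only to that room.
const http = require('http');
const server = http.createServer();
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  // The client tells us which customer page it has open (possibly several tabs).
  socket.on('watch-customer', (customerId) => {
    socket.join('customer:' + customerId);
  });
});

// Call this from wherever you detect a new order for a customer,
// e.g. right after indexing it into Elasticsearch.
function notifyOrder(customerId, order) {
  io.to('customer:' + customerId).emit('order-update', order);
}

server.listen(3000);
```

On the client, each page would emit 'watch-customer' with its customer ID after connecting and listen for 'order-update'; for the clustered case mentioned above, the socket.io Redis adapter lets io.to(...) reach sockets held by other nodes.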

How to handle this typical case of WebSocket usage?

I wrote a web page where there is a zone for user's comments.
Any authenticated users could post a comment.
As many users could post comments almost simultaneously, I want the comments list to be auto-refreshed.
Thus, I think about using WebSockets.
My thoughts are about a good/best practice for this use case:
Once a comment is posted, should the WebSocket process read the current comments list from the database and send a JSON response containing all the new comments? This would allow the client to directly append the new comments to the DOM (JS).
Or should the WebSocket just check the database (or the queue, if a message queue such as Redis or RabbitMQ is used) and simply say: "Hey, I have new comments, click here if you want to see them!"? This solution would only signal the presence of new comments, without bringing all of those comments to the client. Retrieving them would then be initiated by the client (by clicking on that sentence, for instance), e.g. using the traditional AJAX direction: client => server.
It is highly possible that a user posts a comment and then navigates to another page of the website. In that case, a WebSocket response containing all the new comments would be useless. A simple notification would then be enough, as most well-known websites do, for instance with a "+1" counter or, more relevant to the comments scenario, "1 new comment available".
Which way should I choose?
I think deciding which data to push is mostly a matter of UI usability / user experience, as opposed to which technology is used to interact with the server. We should avoid changing the UI with server-pushed data in a way that would surprise the user negatively, for example having the comment feed constantly grow without any intervention from them.
But in the case of a realtime chart, it's probably better to push the data directly into the chart, since that is what the user expects.
In the case of the comment feed, the reason most sites go with the 'click to load' approach is user experience, so I think that is probably the best approach here too.
I use a combination of both....
In some pages the websocket communication contains the actual data--sort of like a stock ticker update.
And in other cases, the WebSocket message just says: all users viewing xyz data, refresh it. The browser then performs an AJAX request to obtain the new data, and the grid is smartly refreshed so that only the changed cells are modified on screen using innerHTML, new rows are added, and deleted rows are removed.
In cases like stackoverflow, it makes sense to show a message, "Got new stuff to show--want to see it?"
When I establish the WebSocket in the browser, I pass a page ID in the URL, and the cookies are passed too. So the WebSocket server knows the user's cookie and the page being viewed.
Then the database (or middle-tier logic) communicates with the WebSocket server using messages such as: this message is for users viewing the 'xyz' page, smartly refresh grid 'abc'. The WebSocket server then broadcasts the message.
Because the protocol allows you to pass anything you like, you can build it any way you like.
My advice is to do what's best in each particular situation.
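As a rough sketch of the 'notify, then fetch' variant described above; the ws package on the server, the message shape, and the showNewCommentBanner helper are my own choices, not something prescribed by the answers:

```javascript
// Server sketch using the 'ws' package: push a lightweight notification when a
// comment is stored; clients fetch the actual comments over a normal AJAX call.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

// Call this after a comment has been saved for a given article.
function notifyNewComment(articleId) {
  const message = JSON.stringify({ type: 'new-comment', articleId: articleId });
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(message);
    }
  });
}
```

On the browser side, the notification only toggles a banner; the comments themselves are pulled with the usual client => server request when the user asks for them:

```javascript
// Browser sketch: show "1 new comment available" and load on click.
// currentArticleId and showNewCommentBanner are hypothetical page-level helpers.
const socket = new WebSocket('ws://example.com:8080');
socket.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'new-comment' && msg.articleId === currentArticleId) {
    showNewCommentBanner(); // clicking the banner triggers the AJAX fetch
  }
};
```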

How to guard against an irregular internet connection in a Backbone application? Is there a solution like in Meteor?

I am building applications that are used on a touch screen in an educational environment. The applications gather data from user input. The data is then sent to a server. There are multiple units, and whilst exact synchronisation is not paramount, the gathered data (along with other data collected from another source) will be combined and distributed back to the touch-screen applications.
The applications are being built in Backbone, with initial data loaded from a single JSON document. The JSON document is generated from a remote MySQL database and is downloaded (along with assets) on initialisation.
Whilst the app should send new data back to the remote MySQL DB as soon as it is gathered, this may not always be possible, and I need to collect the data so that I can send it when I can.
My first thought is that storing everything in localStorage and syncing whenever possible (clearing the localStorage each time a successful sync takes place) is the way to go.
Over the bank holiday weekend I have been playing with Meteor.js, and I think that if I write my own localStorage solution I will be reinventing the wheel, and a tricky wheel at that. It seems that Meteor.js has a way of mimicking a database offline, in order to fake instant updating.
My question is: How can I use a similar technique to add some offline protection? Is there a JS framework, or backbone plugin I can utilise, or a technique I can tap into?
You can use Backbone.localStorage to save your models and collections to local storage while the connection is offline.
Detecting whether your user is offline is as easy as noticing that your XHR requests are failing (https://stackoverflow.com/a/189443/1448860e).
To combine these two.
When you suspect the user is offline (an AJAX request to your backend gets no response), switch to Backbone.localStorage and store everything there. Inform the user!
When the user gets Internet connectivity again, save any changes from localStorage to the server. Inform the user again!
Voilà!
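A minimal sketch of that fallback, assuming the Backbone.localStorage plugin is loaded; the OfflineQueue name and the /api/readings endpoint are placeholders of mine:

```javascript
// Collections with a `localStorage` property are persisted locally by the
// plugin; everything else keeps using Backbone's normal AJAX sync.
var OfflineQueue = Backbone.Collection.extend({
  localStorage: new Backbone.LocalStorage('offline-queue')
});
var queue = new OfflineQueue();

function submitReading(attrs) {
  Backbone.ajax({ url: '/api/readings', type: 'POST', data: attrs })
    .fail(function () {
      // The request failed, so we are probably offline: keep the data locally.
      queue.create(attrs);
      // ...and inform the user here.
    });
}

// Call this once requests start succeeding again (i.e. we appear to be online).
function flushQueue() {
  queue.fetch({
    success: function () {
      queue.toArray().forEach(function (model) {
        Backbone.ajax({ url: '/api/readings', type: 'POST', data: model.toJSON() })
          .done(function () { model.destroy(); })  // clear it once the server has it
          .fail(function () { /* still offline; keep it and try again later */ });
      });
    }
  });
}
```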

How would I save form fields offline if connection drops?

I have a backend where people may take some time filling out a form. The form is written to a temporary table every 5 minutes to store the data. The problem I have is that some people's internet connections are not strong, so they drop at times without the person knowing, and when they go to submit, the form can't be submitted because the connection has been lost. I was going to prompt the user if the AJAX call fails every 5 minutes, but I want to go further and possibly allow them to store the data offline and then submit it once they reconnect.
The problem is, how would I start storing the data in a file on their local machine? As far as I know, client-side scripting can't create files, and we can't use AJAX to call a remote file that saves to a local file. I suppose I could prompt the user to save a file locally at the beginning, but the local machine would still need to support the language. We are using JSP with MySQL.
Does anyone know how I would accomplish saving data offline when a connection drops?
Implementing offline capabilities can become a tough exercise. Your question is too broad to give a final answer, but you should have a look at HTML5's offline capabilities (an overview is available here). Using these features requires your users to run a modern browser version. You also have to think about synchronisation/replication.
If the connection only drops for a limited amount of time, it may be enough to just use a rich client and keep the data in memory using JavaScript.
Something like this should do the trick:
First way:
- Use HTML5 localStorage/sessionStorage, or something like Gears, to save the data on the client side.
- Run an AJAX loop to ping the connection; ideally the server should be configured to reply with an HTTP code 100 (I presume).
- Send the data once the connection is back up (see the sketch below).
Second way:
- Use HTML5's offline capability or a substitute (see Modernizr and polyfills).
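A minimal sketch of the first way, using localStorage plus a periodic connectivity check; the form id, the storage key, and the /ping and /saveDraft endpoints are placeholders of mine:

```javascript
// Keep a local draft of the form and resubmit it once the server is reachable.
var FORM_KEY = 'draft-form';
var form = document.getElementById('myForm');

// 1. Save a draft to localStorage whenever a field changes.
form.addEventListener('input', function () {
  var data = {};
  for (var i = 0; i < form.elements.length; i++) {
    var el = form.elements[i];
    if (el.name) { data[el.name] = el.value; }
  }
  localStorage.setItem(FORM_KEY, JSON.stringify(data));
});

// 2. Every 5 minutes, ping the server; if it answers, submit the stored draft.
setInterval(function () {
  var draft = localStorage.getItem(FORM_KEY);
  if (!draft) { return; }
  fetch('/ping')                                  // lightweight "are we online?" check
    .then(function () {
      return fetch('/saveDraft', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: draft
      });
    })
    .then(function (res) {
      if (res.ok) { localStorage.removeItem(FORM_KEY); } // server has it, clear the draft
    })
    .catch(function () { /* still offline; keep the draft and try again later */ });
}, 5 * 60 * 1000);
```

This uses the modern fetch API for brevity; swap in XMLHttpRequest or jQuery's $.ajax if you need to support older browsers.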
