We have a mobile app where the client gets a bunch of data from the server, stores and uses that data in a local database, and then syncs the new/modified data back to the server. My team is very concerned about the reliability of the data and wants me to verify that syncing went correctly by first sending a "sync manifest". This would contain things like the number of rows that are about to be sent, so that it can be compared with what the client actually stores.
My question is: Is there a point to doing something like this? If there is an actual error, either when sending the request or when storing the data, we will get an error message. Is there a point to having this kind of extra verification when sending data, and if so, what would you look for?
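For illustration, something like the following is what I have in mind; every field name here is made up, not something my team has specified.

  // hypothetical shape of a sync manifest sent ahead of (or alongside) the payload
  const syncManifest = {
    batchId: "sync-2024-05-01-001",
    tables: [
      { name: "orders", rowCount: 1250 },
      { name: "customers", rowCount: 87 },
    ],
  };
  // after applying the batch, the receiving side counts the rows it actually stored
  // per table and compares them against rowCount before acknowledging the sync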
I am looking for a proper solution to implement the scenario below.
I have a large amount of JSON data to process. Previously the whole payload was sent to the server in one REST call, the server performed certain actions on each JSON row, and after a very long time it sent back a response with the processing status of each row. This approach always leaves the user confused about whether anything is actually being processed (they are looking at a loading screen for over ten minutes), and since I am using REST APIs I can't get any status while the data is being processed.
What I am looking for:
Is there a good way to send the data in small batches?
How can I get the status from the server while it is processing the data?
My frontend is ReactJS and the backend is Node.js.
For your situation, you can use pagination and send the data in limited-size packets.
If the user wants more data, they can click to the next or previous page. For this I would suggest the AntD library; it works well for pagination. Here are links for reference.
You can also refer to this CodeSandbox by Andrew Guo:
https://codesandbox.io/s/m4lj7l2yq8?file=/src/index.js
https://ant.design/components/pagination/
As you mentioned the data is taking too long, I would also suggest adding indexes in the database. And you can show a Suspense fallback (or a spinner) until the data is loaded, which helps the user understand that the data is being loaded.
Here is the link for your reference
https://ant.design/components/spin/
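To make the batching concrete, here is a rough sketch of fetching one page of rows at a time from React with the AntD Pagination component. The /api/rows endpoint and its page/pageSize parameters are assumptions for illustration only, not something from the question.

  import React, { useState } from "react";
  import { Pagination, Spin } from "antd";

  // Minimal sketch: request one page of rows at a time instead of the whole payload.
  function PagedRows() {
    const [rows, setRows] = useState([]);
    const [loading, setLoading] = useState(false);

    const loadPage = async (page, pageSize) => {
      setLoading(true);
      const res = await fetch(`/api/rows?page=${page}&pageSize=${pageSize}`);
      setRows(await res.json()); // assumes the endpoint returns an array of { id, name }
      setLoading(false);
    };

    return (
      <Spin spinning={loading}>
        <ul>{rows.map(r => <li key={r.id}>{r.name}</li>)}</ul>
        <Pagination defaultCurrent={1} pageSize={50} total={5000} onChange={loadPage} />
      </Spin>
    );
  }

Combined with a small per-page response from the server, the user sees progress instead of a ten-minute spinner.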
I've been scratching my head and trying to figure this out for about a week now, so I hope I can find some help here.
I'm making an application that provides real-time data to the client. I've thought about Server-Sent Events, but as far as I know that doesn't allow per-user responses.
WebSocket is also an option, but I'm not convinced about it. Let me sketch the scenario I built with WS:
Server fetches 20 records every second, and pushes these to an array
This array gets sent to all websocket connections every second, see this pseudo below:
let items = [ { ... some-data ... } ];

io.on("connection", socket => {
  // every connection starts its own interval, and each interval broadcasts
  // the same array to every connected socket via io.emit
  setInterval(() => {
    io.emit("all_items", items);
  }, 1000);
});
The user can select some items in the front end, and the websocket receives this per connection.
However, I'm convinced that the way I'm approaching this is not good and is enormously inefficient. Let me sketch the scenario of what I want the program to achieve:
There is a database with, let's say, 1,000 records.
A user connects to the back-end from a (React) front-end and gets attached to the main "stream" of about 20 fetched records (without filters), which the server fetches every second: SELECT * FROM Items LIMIT 20
Here comes the complex part:
The user clicks some checkboxes with custom filters in the front-end, e.g. location = Shelf 2. Now, what's supposed to happen is that the websocket ALWAYS shows 20 records for that user, no matter what the filters are.
I've imagined having a custom query for each user with custom options, but I think that's bad and will absolutely destroy the server if you have something like 10,000 users.
How would I be able to take this on? Please, everything helps a little, thank you in advance.
I have to do some guessing about your app. Let me try to spell it out while talking just about the server's functionality, without mentioning MySQL or any other database.
I guess your server maintains about 1k datapoints with volatile values. (It may use a DBMS to maintain those values, but let's ignore that mechanism for the moment.) I guess some process within your application changes those values based on some kind of external stimulus.
Your clients, upon first connecting to your server, start receiving a subset of twenty of those values once a second. You did not specify how to choose that initial subset. All newly-connected clients get the same twenty values.
Clients may, while connected, apply a filter. When they do that, they start getting a different, filtered, subset from among all the values you have. They still get twenty values. Some or all the values may still be in the initial set, and some may not be.
I guess the clients get updated values each second for the same twenty datapoints.
You envision running the application at scale, with many connected clients.
Here are some thoughts on system design.
Keep your datapoints in RAM in a suitable data structure.
Write JavaScript code to apply the client-specified filters to that data structure (a rough sketch follows this list). If that code is efficient you can handle millions of data points this way.
Back up that RAM data structure to a DBMS of your choice; MySQL is fine.
When your server first launches load the data structure from the database.
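A minimal sketch of that in-RAM structure and filter step, assuming each datapoint is a plain object such as { id: 1, location: "Shelf 2", value: 42 }; the field names and the selectForClient helper are illustrative, not prescriptive.

  // all datapoints kept in RAM, keyed by id
  const datapoints = new Map();

  // apply the client-specified filters and return at most `limit` matches
  function selectForClient(filters, limit = 20) {
    const out = [];
    for (const dp of datapoints.values()) {
      const matches = Object.entries(filters).every(([key, value]) => dp[key] === value);
      if (matches) {
        out.push(dp);
        if (out.length >= limit) break;
      }
    }
    return out;
  }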
To get to the scale you mention you'll need to load-balance all this across at least five servers. You didn't mention the process for updating your datapoints, but it will have to fan out to multiple servers, somehow. You need to keep that in mind. It's impossible to advise you about that with the information you gave us.
But, YAGNI. Get things working, then figure out how to scale them up. (It's REALLY hard work to get to 10K users; spend your time making your app excellent for your first 10, then 100 users, then scale it up.)
Your server's interaction with clients goes like this (ignoring authentication, etc).
A client connects, implicitly requesting the "no-filtering" filter.
The client gets twenty values pushed once each second.
A client may request a different filter at any time.
Then the client continues to get twenty values, chosen by the selected filter.
So, most client communication is pushed out, with an occasional incoming filter request.
This pattern of lots of downbound traffic and only a little upbound traffic is an ideal scenario for Server-Sent Events. WebSockets or socket.io are also fine. You could structure it like this:
New clients connect to the SSE endpoint at https://example.com/stream
When applying a filter they reconnect to another SSE endpoint at https://example.com/stream?filter1=a&filter2=b&filter3=b
The server sends data each second to each open SSE connection, applying that connection's filter. (Streams work very well for this in Node.js; take a look at the server-side code of the signalhub package for an example.)
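A minimal Express sketch of that flow, reusing the hypothetical selectForClient helper from the earlier sketch; the query-string filter handling here is an assumption for illustration, not part of the original answer.

  const express = require("express");
  const app = express();

  app.get("/stream", (req, res) => {
    // standard SSE headers; the response stays open and is written to repeatedly
    res.set({
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });

    const filters = { ...req.query }; // e.g. /stream?location=Shelf%202
    const timer = setInterval(() => {
      const items = selectForClient(filters); // hypothetical helper from the sketch above
      res.write(`data: ${JSON.stringify(items)}\n\n`);
    }, 1000);

    req.on("close", () => clearInterval(timer)); // stop pushing when the client disconnects
  });

  app.listen(3000);

On the React side, a plain EventSource("/stream?location=...") subscription is enough to receive these messages.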
The project I am working on receives a request whose body consists mostly of data coming from a database. Upon receiving it, my system parses all the data, concatenates the needed information to form a query, and then inserts that data into my local database using that query.
It works fine with no issues at all, except that it takes too long to process when the request has over 6,000,000 characters and over 200,000 lines (or maybe less, but still a large number).
I tested this with my system acting as the server (the intended production setup) and with Postman as well, but both drop the connection before the final response is built and sent. I have verified that although the connection drops, my system still continues processing the data, all the way through running the query and even sending its intended response. But since the connection dropped somewhere in the middle of processing, the response is ignored.
Is this about a connection timeout in Node.js?
Or the limit in app.use(bodyParser.json({limit: '10mb'}))?
I really only see one way around this; I have done something similar in the past. Allow the client to send as much as you need/want, but instead of having the client wait around for some undetermined amount of time (at which point the client may time out), send an immediate response that is basically "we got your request and we're processing it".
Now the not-so-great part, but it's the only way I've ever solved this type of issue: in your "processing" response, send back some sort of id. The client can then check once in a while whether its request has finished by sending you that id. On the server end you store the result for the client under the id you gave them. You'll have to make a few decisions about things like how long a response id is kept around and whether it can be requested more than once.
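A hedged sketch of that pattern in Express; the route names, the in-memory jobs map, and the processRows function are all assumptions for illustration.

  const express = require("express");
  const crypto = require("crypto");
  const app = express();
  app.use(express.json({ limit: "50mb" })); // raise the body limit as needed

  const jobs = new Map(); // jobId -> { status, result }; lost on restart, so persist if that matters

  app.post("/import", (req, res) => {
    const jobId = crypto.randomUUID();
    jobs.set(jobId, { status: "processing" });

    // kick off the long-running work without blocking the response
    processRows(req.body)
      .then(result => jobs.set(jobId, { status: "done", result }))
      .catch(err => jobs.set(jobId, { status: "failed", error: err.message }));

    res.status(202).json({ jobId }); // immediate "we got your request" response
  });

  // the client polls this once in a while with the id it was given
  app.get("/import/:jobId", (req, res) => {
    const job = jobs.get(req.params.jobId);
    if (!job) return res.status(404).end();
    res.json(job);
  });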
I am looking for a way to use AJAX and jQuery to store data from one form in another without losing the values. I want to be able to keep this data away from the front end user and allow them to remove the information should they wish to. I need to be able to get this information out when the user submits the data. I would like to be able to store the values in an associative PHP array if possible, for example:
<?php
$information = array(
    "first_information" => array(
        "name" => "Sam Swift",
        "age" => 21
    ),
    "second_information" => array(
        "name" => "Example Name",
        "age" => 31
    )
);
?>
I would have used a database for this, but because of the volume that will not be possible. I want to keep the data away from the user so that they have no access to it at all; the data should be held where the user has no way to see it, access it, or change it. This is due to the nature of the data, and all of it should be as secure as possible.
Any information that you store client-side is naturally going to be accessible and mutable by the client. If this sensitive data is data that the user is entering, then you really shouldn't worry about them manipulating the data (because that is what they are supposed to be doing). If however it is data that is being sent by the server - and never displayed or used in that form by the client - this is data that should never leave the server in the first place.
Ajax is not specifically a solution to this problem - whether you send the data asynchronously (i.e., piecemeal with Ajax) or as a full HTTP post is immaterial. You need to store the sensitive data on the server only along with a session ID to associate it with the client session.
Without knowing exactly what data you are storing nor what you are doing with it, it is difficult to advise you how to proceed. You should rethink how you are structuring your application if you are sending sensitive data for the client to work with. The client should only ever see the input and the results. The processing should be done on the server.
For example: perhaps your user is adding an amount to a bank balance. The user enters the amount on the client, but you don't want the client to see or be able to modify the actual balance. You could send the balance to the client, perform the addition there, and send the total back to the server, but far better would be for the client to send only the amount to add; the server then adds that value to the balance and returns a confirmation for the client to display.
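The question targets PHP, but to stay consistent with the rest of this thread, here is the same "keep the value server-side, keyed by the session" idea sketched with Node/Express and express-session; the /add route and the balance field are purely illustrative.

  const express = require("express");
  const session = require("express-session");

  const app = express();
  app.use(express.json());
  app.use(session({ secret: "change-me", resave: false, saveUninitialized: false }));

  // the balance lives only in the server-side session; the client never receives it
  app.post("/add", (req, res) => {
    const amount = Number(req.body.amount) || 0;
    req.session.balance = (req.session.balance || 0) + amount;
    res.json({ ok: true }); // confirmation only, no balance sent back
  });

  app.listen(3000);

In PHP the equivalent would be keeping the value in $_SESSION and exposing only a confirmation to the browser.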
I am very new to Angular and this one has been bugging me a lot. The scenario: suppose an Angular HTTP call returns a model containing an array of objects like:
[{name: "Ankur", lastName: "aggarwal", updation_date: "23-08-2014"}, {name: "xyz", lastName: "abc", updation_date: "29-08-2013"}]
Here updation_date is not required but comes along anyway. Is it right to add a third object to the array without that date, like {name: "def", lastName: "jbc"}? Is that a good practice, or should the array's object model be consistent?
Also, what should the approach be: update the model array first so the binding takes place instantly and then send it to the server, or send it to the server and get the updated object back? Might be a basic one, but I'm very new to Angular and JMVC.
Is it a good practice or should the array object model be consistent?
It depends. If the backend expects all array entries to contain updation_date then you have no choice and are forced to add some sensible default value. However, if possible, avoid sending unnecessary data from the backend in the first place, since it impacts application performance (data transfer, extra logic to generate sensible default values, etc.).
Update the model array first so the binding takes place instantly, and then
send it to the server, or send it to the server and get the updated object?
If the nature of your application permits reverting the model value when a save is unsuccessful, then just go ahead with:
0. Perform data validation and make sure valid data is supplied to the backend.
1. Update the model.
2. Send the data to the backend.
3. If something bad happens, execute error handling depending on the app's needs.
However, if presenting a consistent value in the GUI is of the utmost importance (e.g. in finance applications), then (see the sketch after this list):
0. Perform data validation and make sure valid data is supplied to the backend.
1. Show some message to the user, like "saving".
2. Perform the AJAX request.
3. If successful, update the model; otherwise execute error handling depending on the app's needs.
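A minimal AngularJS sketch of that second flow, assuming an illustrative $scope.items array and a hypothetical /api/people endpoint (neither comes from the question).

  // save first, and update the model only after the server confirms
  $scope.saving = false;

  $scope.addPerson = function (person) {
    $scope.saving = true; // e.g. shows a "saving" message in the template
    $http.post("/api/people", person)
      .then(function (response) {
        $scope.items.push(response.data); // update the model only on success
      })
      .catch(function () {
        $scope.error = "Could not save, please try again."; // error handling per app needs
      })
      .finally(function () {
        $scope.saving = false;
      });
  };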
It depends on your error handling.
Saving on the server side might not be successful, so you should take that into consideration.
My approach is to:
Update the Angular object immediately.
Then send the AJAX request to the server and
wait for the response. If an error happens during the server save, you should (see the sketch below):
revert the values,
repeat the AJAX request,
show information to the user.
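A sketch of that optimistic approach, again with the illustrative $scope.items array and /api/people endpoint used above.

  // update the model first so the binding reacts instantly, then save;
  // roll back and inform the user if the save fails
  $scope.addPersonOptimistic = function (person) {
    $scope.items.push(person);
    $http.post("/api/people", person)
      .catch(function () {
        $scope.items.splice($scope.items.indexOf(person), 1); // revert the value
        $scope.error = "Saving failed, your change was reverted."; // inform the user
        // optionally repeat the AJAX request here
      });
  };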