Our website provides various data services to our clients, one of which is gauge data. Some gauges log information every 15 minutes, some every minute. This data is sent to our SQL database.
All of this data is displayed via a graph (generated server side via PHP and JpGraph), with each individual log entry displayed as a row in a collapsible table (jQuery 1.10.2).
When a client wants to view the data, they select a date range and which gauges they would like to view. If they want to view the last 3 days of a gauge that logs every minute, it loads pretty quickly. If they want to view 2 of those, it takes around 15-30 seconds to load. The real problem comes when they want to view a month's worth of data, especially for more than one gauge. This can take upwards of 15-20 minutes to load, and the browser repeatedly asks if we want to stop the script from populating the collapsible table rows (jQuery).
Obviously this is a problem since clients want a relatively fast response (1-5 min max). Ideally, we would also like to be able to pull gauge data from several months at a time. The only way we can do that now is to pull data 2 weeks at a time and compile the total manually.
For reference: if I wanted to pull a month's data for 2 of our once-a-minute-logging gauges, there would be 86,400 rows added via jQuery to a collapsible table. The page takes approx. 5 minutes to load, and the browser is terribly slow during this time.
My question is: what is the best way to pull/graph/populate data using a PHP-based server (Symfony 1.4 framework) and JavaScript?
Should we look into upgrading our allotted processing power/RAM (we are hosted by GoDaddy)? Is there a faster way to populate collapsibles than with jQuery? All of our calculations are done server side. Should we just pull the raw data and let the client side do the data processing? Or should we split the data processing between client and server?
Here's a screenshot of the web page. It's cropped so that client-sensitive information is not displayed:
In response to my comment.
Since you need the entire data-set only on the server side (you create your graph on the server), this means that you don't actually need to send the entire data-set to the client.
Instead, send a small portion to the client, say the first 200 results. Then cache the rest of the result set in a JSON file (a lite database, whatever you want really) and create an interface where the user can request more data. Infinite scroll is nice but has its own problems; maybe just a button that says "Load more data". As people have said, having more than a few hundred data points in a table at once is pointless, because people won't look at them anyway. When the user hits the button for more data, you send an AJAX request to the server with the correct parameters for the data you want.
For example, the first time they click, getMoreData() should fetch the next 200 data points, so you send getMoreData(start=200, length=200). Your server picks up the AJAX request and finds the correct data in the JSON file or the lite database, wherever you have cached the results. The user can keep requesting more data (making sure you update your start parameter), and you only ever return a small subset. The user doesn't even realize they don't have the whole data set in front of them, because it looks like they do.
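The windowing logic described above can be sketched as plain functions (names like getDataWindow are illustrative, not from the original answer; the real endpoint would read from the cached JSON file or lite database):

```javascript
// Server-side idea expressed as a plain function: return one window
// of a cached result set given a start offset and a page length.
function getDataWindow(cachedRows, start, length) {
  // Clamp so a request past the end returns an empty array
  // instead of throwing.
  if (start >= cachedRows.length) return [];
  return cachedRows.slice(start, start + length);
}

// Client-side bookkeeping: after each successful load, advance the
// offset by the number of rows actually received.
function nextOffset(currentStart, rowsReceived) {
  return currentStart + rowsReceived.length;
}
```

The first request would be getDataWindow(rows, 0, 200), the next getDataWindow(rows, 200, 200), and so on until an empty window comes back.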
One thing that is complicated about this is sorting and searching. If you want to implement those, you need to make sure you sort/search through the cached results on the server side.
So basically you have a system where you create the entire graph on the server side, which shouldn't take long. What does take long is loading the entire data set on the client side, so you break that up into small chunks. You can even easily add pagination and the like with this method.
Related
I am looking a proper solution to implement the below scenario.
I have a large amount of JSON data to process. Previously, the whole data set was sent to the server using REST APIs, the server performed certain actions on each JSON row, and after a very long time it sent back a response with the processing status of each row. But this approach always leaves the user confused about whether anything is actually processing (they see a loading screen for over 10 minutes), and since I am using REST APIs I can't get any status while the data is being processed.
I am looking for:
Is there a good way to send the data in small batches?
How can I get the status from the server while it is processing the data?
My frontend is ReactJS and my backend is Node.js.
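The batching part of the question can be sketched as plain functions (names are illustrative; the real code would POST each batch and update a progress indicator after each response):

```javascript
// Hypothetical helper: split a large payload into fixed-size batches
// so each request stays small and the UI can report progress per batch.
function splitIntoBatches(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// Progress after n completed batches, as a whole-number percentage,
// suitable for driving a progress bar instead of a blank spinner.
function progressPercent(completedBatches, totalBatches) {
  if (totalBatches === 0) return 100;
  return Math.round((completedBatches / totalBatches) * 100);
}
```

Sending the batches sequentially and bumping the percentage after each one gives the user visible status without changing the REST API itself.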
Given your situation, you can use pagination and send the data in a limited number of packets.
If the user wants more data, they can click next or previous. In this case, I would suggest the AntD library; it works perfectly for pagination. For your reference, here is a link to try it out.
You can also refer to this codeSandbox by Andrew Guo.
https://codesandbox.io/s/m4lj7l2yq8?file=/src/index.js
https://ant.design/components/pagination/
As you mentioned, the data is taking too long to load; for this I would suggest indexing in the database. You can also show a Suspense fallback (a spinner) until the data is loaded, which helps the user understand that the data is being loaded.
Here is the link for your reference
https://ant.design/components/spin/
I've been doing some research on infinite scrolling and came across what people call "lazy loading". I have implemented it on one of my website's elements (a chatbox), and I now have 2 ways to do it, but I can't decide which one is more efficient. Here are the two ways:
Let's say I have 1,000,000 rows of data in the database, all of which I need to fetch.
1st way:
Load the content from the database, truncate it in the server-side code (PHP), and only show the first 50.
Upon user scroll, another request is sent to fetch the next 50 results and display them, and so on and so forth.
2nd way:
Load the content from the database, render it in my HTML as hidden elements displaying only the first 50; then, upon user scroll, show 50 more hidden elements.
The 1st way requests from the server whenever more results need to be displayed.
The 2nd way makes just 1 request to the server, then hides the results except for the first few that should be shown.
I have no problem doing either of the two.
Now the dilemma is that the 1st way sends more HTTP requests, while the 2nd way (though sending only 1 HTTP request) fetches a huge amount of data in a single request, which can be slow.
Which method is "better", and if there are other options, please let me know.
I know this is an old question, but I want to give my opinion:
1st way
This is always preferred, especially with that number of rows. It is much more efficient to request only a set number of rows; if the user wants to see more (e.g., clicks to the next page), another request is made to the server to fetch the next page. The response time will be much better, and it will also be easier for the client to manipulate the list if other processing needs to be done before it is returned to the user.
You also need to make sure that you apply the limits in your DB query; otherwise you will load all the objects into memory, which is not efficient.
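A minimal sketch of applying that limit in the query itself; the table and column names here are placeholders, and real code should use parameterized queries rather than string interpolation:

```javascript
// Compute OFFSET from a 1-based page number so the database only
// returns the rows for that page, instead of the whole table.
function pagedQuery(pageNumber, pageSize) {
  const offset = (pageNumber - 1) * pageSize;
  // Shown inline only for clarity; parameterize in real code.
  return {
    sql: `SELECT * FROM readings ORDER BY logged_at LIMIT ${pageSize} OFFSET ${offset}`,
    offset: offset,
  };
}
```

With this, page 1 maps to OFFSET 0, page 3 with a page size of 50 maps to OFFSET 100, and the server never holds more than one page in memory.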
2nd way
If you fetch 1,000,000 rows at once, the user has to wait until the response comes back, which can result in a bad user experience. Also, as the number of rows returned keeps growing, the response time will keep increasing, and you may eventually hit a timeout. Consider as well that you will be loading all those objects into your server's memory before they are returned.
The only use case I see for this approach is a list that doesn't grow over time, or one with a set number of items small enough not to affect response time.
We are adding a Google Map plugin to our homepage, which updates a single marker dynamically whenever there is a new product search on our site (which we read from our database). So: "Has there been a new search via our site? If yes, reposition the marker based on the new search's coords."
Currently, every "n" seconds (we haven't settled on a value yet) an Ajax call is made (using setInterval) to determine whether there has been a new search, and if there has, it returns a small JSON response. The script run via the Ajax call is a PHP script, which queries the database for the last row in our searches table (order by desc limit 1).
So, my question is (not being a sysadmin): could this setup put an undesirable strain on our server? Should I incorporate a timeout or something that turns off the Ajax call after 100 tries, or after 15 minutes (I mean, who sits for 15 minutes watching markers dynamically appear on a Google map?!)?
Our homepage only receives roughly 200 visits a day.
Given your statistics, your website gets 200 visits per day and your server is returning JSON that you extract and display in the UI. It is normal practice to have a setup like this one. You could even ping the server via AJAX every 5 seconds to get more up-to-date data, and it won't cause any performance issues at this level.
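A bounded polling loop can be sketched like this; the numbers and names are illustrative, and the commented-out part shows where the real page would call the PHP endpoint:

```javascript
// Guard that stops polling after a maximum number of attempts,
// e.g. 15 minutes at 5 seconds per poll = 180 polls.
function makePollGuard(maxPolls) {
  let polls = 0;
  return function shouldPoll() {
    polls += 1;
    return polls <= maxPolls;
  };
}

// Usage in the page (browser side):
// const guard = makePollGuard(180);
// const timer = setInterval(() => {
//   if (!guard()) { clearInterval(timer); return; }
//   // $.getJSON('/latest-search.php', updateMarker);
// }, 5000);
```

This addresses the "who watches for 15 minutes" concern directly: the interval clears itself once the guard trips, so an abandoned tab stops hitting the database.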
Just be sure that you don't have servers that are separated geographically; otherwise you would need some other synchronization mechanism to track users' locations based on their searches.
For AJAX JQuery implementation details please see the following page.
For project implementation as a tutorial please visit this tutorial.
I'm developing a web application that is entirely in one page and is based on displaying a bunch of table data in grids. There are about 30 different tables in the database, any one of which can be requested by the user to display in a grid on their screen at any time. Most, but not all, of these tables have fewer than 1000 rows. The data for these grids is being called in and loaded via ajax.
Right now, I display the login screen and immediately preload a few of the main tables, so while the user is typing their username and password, the initial grids are loading. This has really improved the user experience, since they don't have to wait for an ajax call after clicking to see one of these datagrids. As a result, I'm thinking of taking it a step further.
I am considering making ajax calls to load all 30 tables in the background. They won't be added to the DOM until needed, but will instead go into an array. The downside is that I don't know whether the user will use even half of these tables in their session, but the ones they do use will normally show immediately upon request, creating a better user experience.
So, my questions are: is it a good idea to store 30 full data tables (mostly about 50 to 1000 rows per table) in arrays via ajax calls, and if so, what's the best way to do it for the best performance (keeping in mind I am just putting them in arrays and not adding them to the DOM after preloading)? Which of the following would be best:
Make 30 ajax calls for each table on page load
Make 1 ajax call on page load that returns all the tables
Make like 5 ajax calls on page load that each return like 6 tables
Stagger the ajax calls, so I make a few and then, once they complete, make a few more
Some other method...
I'd suggest loading the main tables in one call, and then the remaining tables another call. The main issue is that if you batch anything together, none of that information will be available to the application until the entire AJAX request completes. So if you load everything in one call, that may take a while, and the main tables won't be ready when the user finishes logging in.
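A rough sketch of that two-call approach, assuming a hypothetical fetchTables helper that takes a list of table names and returns a promise of { tableName: rows }:

```javascript
// Two-phase load: fetch the main tables first so they are ready
// right after login, then pull the remaining tables in the background.
async function preloadAll(fetchTables, mainNames, otherNames, cache) {
  // Phase 1: block only on the tables the user sees first.
  Object.assign(cache, await fetchTables(mainNames));
  // Phase 2: fire and forget; handle errors properly in real code.
  fetchTables(otherNames)
    .then(rest => Object.assign(cache, rest))
    .catch(() => { /* background load failed; could retry later */ });
  return cache;
}
```

The key property is that preloadAll resolves as soon as the main tables land, so the login flow is never waiting on the long tail of rarely used tables.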
Are these tables going to be edited, or only viewed?
If they will only be viewed, it may make sense to load all 30 tables (assuming they don't have any fields containing BLOB data) and store them in the application's memory rather than session memory.
If the tables are going to be edited, then figure out which tables are used often by that user (or group?) and preload those.
Again, I think your initial milestone is good enough; I'm not sure I like the idea of loading a possible 30,000 rows of data into the session for the user, especially since much of it may never be used. What happens if the data changes in the database? Are you going to sync it? How would you know it changed? Are you going to poll the database? You can see the issues that start to arise.
I am working on an ajax application which will display about a million records in an HTML table. A web service returns the records from the server; I build a long string by concatenating data and tags, and then insert this string using innerHTML (not using the DOM, for better performance).
For testing, I have put 6000 records in the database (the stored procedure takes about 4 seconds to execute).
While testing on my local system (database and application on the same machine), it took about 5 minutes to display the records on the page. After deploying to the web server, it did not respond even after more time. The performance looks very low. I put the records in a CSV file and its size was less than 2 MB. I can't understand why string concatenation to build the HTML table and assigning the string to innerHTML takes such a huge amount of time (if that is the issue). The requirement is to show about a million records in the web page, but performance on just 6000 records is disappointing. I don't know what to do to increase performance.
Kindly guide me and help me.
You're trying to display a million records on a single page? No matter how you optimize your server code, that's a LOT of html to parse/render, especially if it's in a table.
Even using .innerHTML isn't going to "save" you any time. The rendering engine still has to parse/style/render/position millions of table rows and cells, and you WILL have to wait while it's working.
If you absolutely HAVE to show all those records on a single page, try breaking things up into manageable chunks. Have the AJAX call return (say) 100 records at a time, put those into the table, then fetch another 100 records, and so on. At least that way you'll see the content of the page growing, rather than sitting there waiting for 1,000,000 table rows to be displayed in a single shot.
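A sketch of that chunking; field names and the chunk size are placeholders, and each (start, end) pair would become one AJAX request whose rows get appended to the table:

```javascript
// Build HTML for one chunk of records at a time, so the page grows
// incrementally instead of blocking on one giant string.
function buildRowsHtml(records) {
  // Join once per chunk instead of concatenating in a loop.
  return records.map(r => `<tr><td>${r.id}</td><td>${r.value}</td></tr>`).join('');
}

// The (start, end) pairs to request, e.g. [[0,100],[100,200],...].
function chunkRanges(total, chunkSize) {
  const ranges = [];
  for (let start = 0; start < total; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, total)]);
  }
  return ranges;
}
```

The browser gets a chance to render (and stay responsive) between chunks, which is exactly what the single million-row innerHTML assignment denies it.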
A better option would be pagination, where only 100 records are shown at a time and you present standard navigation with << first / prev / next / last >> buttons to page through the data.
As Marc stated, you need pagination. See if this helps - How do I do pagination in ASP.NET MVC?
In addition, you could optimize the result by employing the master-detail pattern: fetch only the summary of each record (the master), and on some action in the master, fetch the details and display them on screen. This reduces the size of the data being transferred from the server.
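The master-detail idea can be sketched as follows; all names here are hypothetical, and fetchDetail stands in for the server call that returns one full record:

```javascript
// The list view only carries the summary fields the grid shows;
// everything else stays on the server until requested.
function toSummary(record) {
  return { id: record.id, name: record.name };
}

// Fetch the full row at most once per record, when the user expands it.
async function expandRecord(fetchDetail, id, detailCache) {
  if (!detailCache[id]) {
    detailCache[id] = await fetchDetail(id);
  }
  return detailCache[id];
}
```

The initial page load transfers only the small summary objects, and detail payloads are amortized across user actions instead of front-loaded.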