I'm developing a single-page web application based on displaying a lot of table data in grids. There are about 30 different tables in the database, any of which the user can ask to display in a grid on screen at any time. Most, but not all, of these tables have fewer than 1,000 rows. The data for these grids is loaded via ajax.
Right now, I display the login screen and immediately preload a few of the main tables, so while the user is typing their username and password, the initial grids are loading. This has really improved the user experience, since they don't have to wait for an ajax call after clicking to see one of these data grids. As a result, I'm thinking of taking it a step further.
I am considering making ajax calls to load all 30 tables in the background. They won't be added to the DOM until needed; instead, they'll be stored in an array. The downside is that I don't know whether the user will use even half of these tables in a session, but the ones they do use will normally show immediately upon request, creating a better user experience.
So, my questions are: is it a good idea to store 30 full data tables (mostly about 50 to 1,000 rows per table) in arrays via ajax calls, and if so, what's the best way to do it for the best performance (keep in mind I am just putting them in arrays and not adding them to the DOM after preloading)? Which of the following would be the best way:
Make 30 ajax calls on page load, one for each table
Make 1 ajax call on page load that returns all the tables
Make about 5 ajax calls on page load that each return about 6 tables
Stagger the ajax calls: make a few, and once those complete, make a few more
Some other method...
I'd suggest loading the main tables in one call, and then the remaining tables in another call. The main issue is that if you batch anything together, none of that information will be available to the application until the entire ajax request completes. So if you load everything in one call, that may take a while, and the main tables won't be ready when the user finishes logging in.
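For what it's worth, a minimal sketch of that two-call approach, assuming a hypothetical GET /tables endpoint that accepts a comma-separated list of table names and returns a map of table name to rows (the table names here are illustrative):

```javascript
// Hypothetical endpoint: GET /tables?names=a,b,c returns { tableName: [rows] }.
var tableCache = {};
var otherTableNames = [/* the remaining ~27 table names */];

function loadTables(names) {
  return $.getJSON('/tables', { names: names.join(',') }).done(function (data) {
    // Keep the rows in memory only; nothing touches the DOM yet.
    $.each(data, function (name, rows) { tableCache[name] = rows; });
  });
}

// Phase 1: the handful of tables most users open first.
loadTables(['orders', 'customers', 'products']).done(function () {
  // Phase 2: everything else, requested only after the critical data is in.
  loadTables(otherTableNames);
});
```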
Are these tables going to be edited, or only viewed?
If the answer is "only viewed", then it may make sense to load all 30 tables (assuming they don't have any fields that contain BLOB data) and store them in the application's memory rather than session memory.
If the tables are going to be edited, then figure out which tables that user (or user group) uses often and pre-load those.
Again, I think your initial milestone is good enough. I'm not sure I like the idea of loading a possible 30,000 rows of data into the session for each user, especially since much of it may never be used. What happens if the data changes in the database, are you going to sync it? How would you know it changed? Are you going to poll the database? You can see the issues that start to arise.
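To make the sync problem concrete, here is a rough sketch of the kind of polling you'd be signing up for, assuming a hypothetical /tables/versions endpoint that returns a last-modified stamp per table and a hypothetical reloadTable() helper:

```javascript
// Poll a hypothetical versions endpoint and re-fetch any cached table
// whose server-side stamp has changed since we last saw it.
var tableVersions = {};

setInterval(function () {
  $.getJSON('/tables/versions', function (versions) {
    $.each(versions, function (name, stamp) {
      if (tableVersions[name] && tableVersions[name] !== stamp) {
        reloadTable(name); // hypothetical: invalidate and re-fetch the cached copy
      }
      tableVersions[name] = stamp;
    });
  });
}, 60000); // once a minute; every preloaded table adds to this ongoing cost
```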
I've been doing some research on infinite scrolling and came across what people call "lazy loading". I have implemented it on one of my website elements (a chatbox), and I now have 2 ways to do it, but I can't decide which one is more efficient. Here are my ways:
Let's say I have 1,000,000 rows of data in the database, all of which I need to fetch:
1st way:
Load the content from the database, truncate it in the server-side code (PHP), and show only the first 50.
When the user scrolls the page, another request is sent to fetch and display the next 50, and so on.
2nd way:
Load the content from the database, render all of it in my HTML as hidden elements while displaying only the first 50, then upon user scroll, show 50 more hidden elements.
The 1st way is requesting from the server whenever the need to display more results arises.
The 2nd way just does 1 request from the server, then hides the result except for the first few that should be shown.
I have no problem doing either of the two.
Now the dilemma is that the 1st way sends more HTTP requests, while the 2nd way (though sending only 1 HTTP request) fetches a huge amount of data in a single request, which can be slow.
Which method is "better", and if there are other options, please let me know.
I know this is an old question, but I want to give my opinion:
1st way
This is almost always preferred, especially with that number of rows. It is much more efficient to request only a set number of rows; if the user wants to see more (e.g., clicks to the next page), another request is made to the server to fetch the next page. The response time will be much better, and it also makes it easier to manipulate the list if other processing needs to be done before the data is returned to the user.
You also need to make sure you are applying the limits in your DB query; otherwise, you will load all the objects into memory, which is not efficient.
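A minimal sketch of the 1st way, assuming a hypothetical /items.php endpoint that accepts offset and limit parameters (and applies them in its SQL) and an existing appendRows() render function:

```javascript
// Hypothetical endpoint: GET /items.php?offset=0&limit=50 returns a JSON array.
// The server must apply the same limit in SQL, e.g. ... LIMIT 50 OFFSET 0.
var offset = 0;
var pageSize = 50;
var loading = false;

function loadNextPage() {
  if (loading) return; // guard against duplicate requests while scrolling
  loading = true;
  $.getJSON('/items.php', { offset: offset, limit: pageSize }, function (rows) {
    appendRows(rows);       // hypothetical: your existing render function
    offset += rows.length;
    loading = false;
  });
}

$(window).on('scroll', function () {
  // Fetch the next page when the user is within 200px of the bottom.
  if ($(window).scrollTop() + $(window).height() >= $(document).height() - 200) {
    loadNextPage();
  }
});

loadNextPage(); // initial 50 rows
```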
2nd way
If you fetch 1,000,000 rows at once, the user will have to wait until the response comes back, which can result in a bad user experience. As the number of rows returned keeps growing, the response time will keep increasing, and you can eventually hit a timeout. Also consider that you will be loading all those objects into memory on your server before they are returned.
The only use case I see for this approach is a list that doesn't grow over time, or one with a set number of items small enough not to affect response time.
This is a performance-related question.
I have a requirement to show all the orders made by a customer during the day (there can be at most 10 orders).
The screen looks like this (screenshot omitted).
Right now, on click of the order row, I am making an ajax call, getting the data from the server, and showing it on the front end.
The end result looks like this (screenshot omitted).
I am thinking of another approach: during page start-up (document ready), load all data related to that customer for that day and store it in a JavaScript variable (a global array),
and on click of the order row, show the data by looping through the array.
Could anybody please tell me which is the best approach?
If you know that everyone opening that page will go ahead and toggle all the rows, then go ahead and preload everything. Otherwise, it is much better to load only the data you need, making small ajax calls when the user requests data for a specific row.
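A rough sketch of that on-demand approach, with a small cache so a row is never fetched twice (the /orderDetails endpoint and showDetails() function are hypothetical):

```javascript
// Hypothetical endpoint: GET /orderDetails?orderId=123 returns the detail rows.
var detailCache = {};

$('#orders').on('click', 'tr.order-row', function () {
  var orderId = $(this).data('order-id');
  if (detailCache[orderId]) {
    showDetails(orderId, detailCache[orderId]); // second click is instant
    return;
  }
  $.getJSON('/orderDetails', { orderId: orderId }, function (details) {
    detailCache[orderId] = details;  // cache so we never fetch a row twice
    showDetails(orderId, details);   // hypothetical render function
  });
});
```

With at most 10 orders per day, caching after the first click gets you most of the benefit of preloading without ever fetching data the user doesn't ask for.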
The answer will depend on the specifics of your application, and how it is used.
How expensive is it to obtain the full list of orders? How often does a user need to see all the orders? Will your users tolerate short pauses while retrieving data from the server, or are they more likely to complain about page load time?
Neither approach is always better or worse; it just depends.
Our website provides various data services to our clients; one of which is gauge data. Some gauges log information every 15 minutes, some every minute. This data is sent to our SQL database.
All of this data is displayed via a graph (generated server-side with PHP and JpGraph), with each individual log entry displayed as a row in a collapsible table (jQuery 1.10.2).
When a client wants to view the data, they select a date range and which gauges they would like to view. If they want to view the last 3 days of a gauge that logs every minute, it loads pretty quickly. If they want to view 2 of those, it takes around 15-30 seconds to load. The real problem comes when they want to view a month's worth of data, especially for more than 1 gauge. This can take upwards of 15-20 minutes to load, and the browser repeatedly asks if we want to stop the script that is populating the collapsible table rows (jQuery).
Obviously this is a problem since clients want a relatively fast response (1-5 min max). Ideally, we would also like to be able to pull gauge data from several months at a time. The only way we can do that now is to pull data 2 weeks at a time and compile the total manually.
For reference: if I wanted to pull a month's data for 2 of our once-a-minute-logging gauges, there would be 86,400 rows added via jQuery to a collapsible table. The page takes approx. 5 minutes to load, and the browser is terribly slow during this time.
My question is: What is the best way to pull/graph/populate data using a PHP based server (Symfony 1.4 framework) and javascript?
Should we look into upgrading our allotted processing power/RAM (we are hosted by GoDaddy)? Is there a faster way to populate collapsibles than with jQuery? All of our calculations are done server-side. Should we just pull the raw data and let the client side do the data processing? Should we split the data processing between client and server?
Here's a screenshot of the web page, cropped so that client-sensitive information is not displayed (screenshot omitted).
In response to my comment.
Since you need the entire data set only on the server side (you create your graph on the server), you don't actually need to send the entire data set to the client.
Instead, send a small portion to the client, say the first 200 results, and cache the rest of the result set in a JSON file (or a lite database, whatever you want really). Then create an interface where the user can request more data. Infinite scroll is nice but has its own problems; maybe just a button that says "load more data". As people have said, anything more than a few hundred data points in a table at one time is crazy to have, because people won't look at it anyway. Then, when they hit the button to get more data, you send an ajax request to the server with the correct parameters for the data you want.
For example, the first time they click getMoreData(), you want to get the next 200 data points, so you send getMoreData(start=200, length=200). Your server picks up the ajax request and finds the correct data in the JSON file or the lite database, wherever you have cached the results. The user can keep requesting more data (making sure you update your start parameter), and you only ever return a small subset. The user doesn't even realize they don't have the whole data set in front of them, because it looks like they do.
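A minimal sketch of that getMoreData() flow, assuming a hypothetical /gaugeData endpoint backed by the server-side cache and an existing appendToTable() helper:

```javascript
var start = 200;  // the first 200 points were sent with the initial page
var length = 200;

function getMoreData() {
  // Hypothetical endpoint backed by the result set cached on the server.
  $.getJSON('/gaugeData', { start: start, length: length }, function (rows) {
    appendToTable(rows);   // hypothetical: your existing row-building code
    start += rows.length;  // advance the window for the next click
  });
}

$('#load-more').on('click', getMoreData); // hypothetical button id
```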
One thing that is complicated about this is sorting and searching. If you want to implement those, you need to make sure you sort/search through the cached results on the server side.
So basically you have a system where you can create the entire graph on the server side, which shouldn't take long. What does take long is loading the entire data set on the client side, so you break that up into small chunks. You can even easily create pagination and the like with this method.
The problem I'm having is that I have a table where, when a user clicks a button, a row is added via JavaScript, which works fine. The problem is that if the user needs to update other data, they click another button that refreshes the page, and all the rows the user created in the table are deleted. My question is: what can I do to keep the rows from being deleted once the page is refreshed? I know some might think "just don't refresh the page", but there is too much data that has to be displayed, and a new query has to be generated to grab the data. Any help would be very much appreciated.
Thanks for all the comments. I thought I would edit my question because people are asking why I don't add the row server-side or via ajax, and I will explain the problem. When I added rows server-side in another application it worked great, but I started to notice that some users would add up to 100 rows, and it would get very slow and even time out, because every time a user added a row, all of the existing rows had to be re-created. That's why I wanted to add rows via JavaScript (client-side): you don't have to re-create all those rows every time a user adds another one. If there is another way of handling this without slowing down the page or potentially timing it out, please let me know. It's kind of driving me crazy! I have been an ASP programmer for years and am newer to .NET, and it seems like there is no way around this.
Use HTML5 localStorage to keep the values of your rows on the client, and on every refresh recreate the rows from localStorage using JavaScript.
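A minimal sketch of that, assuming a hypothetical #myTable element and rows stored as arrays of cell text:

```javascript
// Save the user-added rows whenever one is added...
function saveRows(rows) {
  localStorage.setItem('addedRows', JSON.stringify(rows));
}

// ...and rebuild them from localStorage after every refresh.
function restoreRows() {
  var rows = JSON.parse(localStorage.getItem('addedRows') || '[]');
  var tbody = document.querySelector('#myTable tbody'); // hypothetical table id
  rows.forEach(function (cells) {
    var tr = document.createElement('tr');
    cells.forEach(function (text) {
      var td = document.createElement('td');
      td.textContent = text;
      tr.appendChild(td);
    });
    tbody.appendChild(tr);
  });
}

document.addEventListener('DOMContentLoaded', restoreRows);
```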
As a side note: unless you have a LOT of rows or a lot of data in the rows, the server side shouldn't be particularly slow, and either of those cases would make localStorage unusable anyway.
Alternatively, store the data server-side in a session variable, but do not return it as part of your main page; use ajax to retrieve it separately on the client side and put it into your page (with paging if there are a lot of rows).
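A quick sketch of the client side of that alternative, assuming a hypothetical /sessionRows endpoint and an existing addRowToTable() function:

```javascript
// On each page load, pull the session-stored rows back down separately
// from the main page render.
$(function () {
  $.getJSON('/sessionRows', { page: 1, pageSize: 50 }, function (rows) {
    rows.forEach(addRowToTable); // hypothetical: same function the add button uses
  });
});
```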
You need to make the server aware that you've added a row to the table. There are many ways to do this.
Each time you post back to the server (because HTTP is stateless), it starts from scratch, so it generates a fresh table for you.
To fix this you need to execute some code on the server; the easiest way is to add the row on the server rather than on the client as you currently are.
If you post how your table is being generated, we'll be able to point you in the right direction.
There are 2 realistic methods here:
1) Don't reload the page. If you do a GET from a script, you can simply return JSON or XML and parse it in the JavaScript (see the sketch after this list).
2) Post to the server when you add a new row, so that it is saved and can be reused later when the page is refreshed.
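A minimal sketch of method 1, where the button id, the /refreshData endpoint, and updateOtherSections() are all hypothetical:

```javascript
// Method 1: skip the full-page refresh and update the page in place,
// so the rows the user added via JavaScript are never wiped out.
$('#refresh-button').on('click', function (e) {
  e.preventDefault();
  $.getJSON('/refreshData', function (data) {
    updateOtherSections(data); // hypothetical: re-render only the non-table data
  });
});
```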
I am working on an ajax application that will display about a million records in an HTML table. A web service returns records from the server; I build a long string by concatenating data and tags and then insert this string using innerHTML (not using DOM methods, for better performance).
For testing, I have put 6,000 records in the database (the stored procedure takes about 4 seconds to complete).
While testing on a local system (database and application on the same machine), it took about 5 minutes to display the records on the page. After deploying to the web server, it did not respond even after waiting longer. That is very low performance. I put the records in a CSV file, and it weighed less than 2 MB. I can't understand why the string concatenation to build the HTML table and the assignment to innerHTML take such a huge amount of time (if that is the issue). The requirement is to show about a million records on the web page, but performance on just 6,000 records is already disappointing. I don't know what to do to increase performance.
Kindly guide me and help me.
You're trying to display a million records on a single page? No matter how you optimize your server code, that's a LOT of html to parse/render, especially if it's in a table.
Even using .innerHTML isn't going to "save" you any time. The rendering engine is still going to have to parse/style/render/position many millions of table rows/cells and you WILL have to wait while it's working.
If you absolutely HAVE to show all those records on a single page, try to break things up into manageable chunks. Have the AJAX call return (say) 100 records at a time, put those into the table, then fetch another 100 records, etc... At least that way you'll see the content of the page growing, rather than having to sit there and wait for 1,000,000 table rows to get displayed in a single shot.
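A rough sketch of that chunked approach, assuming a hypothetical /records endpoint and example id/value fields:

```javascript
var chunkSize = 100;

function loadChunk(offset) {
  // Hypothetical endpoint: GET /records?offset=0&count=100 returns a JSON array.
  $.getJSON('/records', { offset: offset, count: chunkSize }, function (rows) {
    if (rows.length === 0) return; // no more data
    var html = rows.map(function (r) {
      return '<tr><td>' + r.id + '</td><td>' + r.value + '</td></tr>'; // example fields
    }).join('');
    $('#results tbody').append(html); // append only the new chunk
    loadChunk(offset + rows.length);  // then fetch the next one
  });
}

loadChunk(0);
```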
A better option would be pagination, where only 100 records are shown at a time and you present standard navigation with << first / prev / next / last >> buttons to move through "pages" of data.
As Marc stated, you need pagination. See if this helps: How do I do pagination in ASP.NET MVC?
In addition to this, you could optimize the result by employing the master-detail pattern: fetch only the summary of each record (master), and on some action in the master, fetch the details and display them on the screen. This will reduce the amount of data being transferred from the server.