Server-side lazy loading vs. Client-side lazy loading - javascript

I've been doing some research about infinite scrolling and came across what people call "lazy loading". I've implemented it on one of my website elements (a chatbox), and now I have two ways to do it, but I can't decide which one is more efficient. Here are my ways:
Let's say I have 1,000,000 rows of data in the database, all of which I need to fetch.
1st way:
Load the content from the database, truncate it in the server-side code (PHP), and show only the first 50.
Upon user scroll on the page, another request is sent to fetch the next 50 results and display them, and so on and so forth.
2nd way:
Load the content from the database, render all of it in my HTML as hidden elements but display only the first 50; then upon user scroll, reveal 50 more hidden elements.
The 1st way requests more data from the server whenever more results need to be displayed.
The 2nd way makes just one request to the server, then hides every result except the first few that should be shown.
I have no problem doing either of the two.
The dilemma: the 1st way sends more HTTP requests, while the 2nd way (though it sends only one HTTP request) fetches a huge amount of data in a single request, which can be slow.
Which method is "better", and if there are other options, please let me know.

I know this is an old question, but I want to give my opinion:
1st way
This is always preferred, especially with that number of rows. It is much more efficient to request only a set number of rows; if the user wants to see more (e.g. clicks to the next page), another request is made to the server to fetch the next page. Response time will be much better, and it also makes it easier for the client to manipulate the list if other processing needs to be done before it is returned to the user.
You also need to make sure you apply the limits in your database query; otherwise you will be loading all the objects into memory, which is not efficient.
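As a sketch of that limit, the page number can be mapped to LIMIT/OFFSET values before the query is built (the query text, table name, and helper names here are illustrative, not from the question):

```javascript
// Sketch: translate a 1-based page number into LIMIT/OFFSET values so
// the database only ever returns one page of rows.
const PAGE_SIZE = 50;

function pageToLimitOffset(page) {
  // page 1 -> rows 0..49, page 2 -> rows 50..99, and so on
  return { limit: PAGE_SIZE, offset: (page - 1) * PAGE_SIZE };
}

// The values would then be bound into a parameterized statement,
// e.g. with a driver such as node's mysql (names are assumptions):
function buildQuery(page) {
  const { limit, offset } = pageToLimitOffset(page);
  return {
    sql: 'SELECT id, body FROM messages ORDER BY id LIMIT ? OFFSET ?',
    params: [limit, offset],
  };
}
```

Binding the limit into the query keeps the other 999,950 rows out of server memory entirely, instead of fetching everything and truncating in PHP.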
2nd way
If you fetch 1,000,000 rows at once, the user has to wait until the full response comes back, which makes for a bad user experience. As the number of rows returned keeps growing, the response time keeps increasing, and you can eventually hit a timeout. Also consider that your server will load all of those objects into memory before anything is returned.
The only use case I see for this approach is a list that doesn't grow over time, or one with a fixed number of items small enough not to affect response time.
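For completeness, the scroll-triggered fetching of the 1st way can be sketched like this; the endpoint, threshold, and helper names are assumptions for illustration:

```javascript
// Pure decision: load the next page once the user is within
// `threshold` px of the bottom of the content.
function shouldLoadMore(scrollTop, viewportHeight, contentHeight, threshold = 200) {
  return scrollTop + viewportHeight >= contentHeight - threshold;
}

// Browser wiring (hypothetical endpoint and appendRows() renderer):
let page = 1;
let loading = false;

function onScroll() {
  if (loading) return; // avoid firing duplicate requests while one is in flight
  if (shouldLoadMore(window.scrollY, window.innerHeight,
                     document.body.scrollHeight)) {
    loading = true;
    fetch('/chatbox/messages?page=' + (++page))
      .then(r => r.json())
      .then(rows => { appendRows(rows); loading = false; });
  }
}
// window.addEventListener('scroll', onScroll);
```

The `loading` flag matters: scroll events fire rapidly, and without it a single scroll gesture can trigger the same page request several times.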

Related

How can I avoid getting the same posts twice

I'm writing the code for a web application using PHP, MySQL, and JavaScript.
It's a very simple social network where users can create posts and see, like, and comment on other users' posts. The main page loads posts and orders them by an index based on the number of likes, the number of comments, and when the post was created.
Since I can't load every post at once (ideally there can be millions of them), I load the top N posts, and when the user scrolls down it loads more posts (with an Ajax request) and adds them at the bottom of the page. The problem is that since the posts are ordered dynamically, if I just limit the number of posts and then offset them in later requests, I sometimes get the same post twice, and some posts never get shown. How can I solve this?
Right now my JavaScript checks the id of every new post and discards the ones that are already on the page (by checking the ids of the posts on the page), but I don't like this approach: every time more posts load, it has to check whether every single post is already on the page, and as the number of posts grows this will get very slow.
If you have a lot of processing power on the server and you're able to compute the scoring for any time in the past (e.g. taking into account only likes before that time), you can make your page send: (1) the timestamp of the first load, and (2) the number of posts already there (or, better, the score of the worst post it received, because posts could have been deleted since then).
Let the server recompute the list as of that time and send the next posts.
If you have more memory on the server than processing power, you could instead save a copy of the scored list for some time (1h, 24h, …) whenever someone requests it.
And if you want the server to send the current best posts: if you know the present listing and what the user already got (their timestamp and the last score they received), you can make the server remove what they already have and give them the rest.
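One sketch of that idea is to use the score of the worst post already received as a cursor (field names and the id tie-breaking rule are illustrative assumptions, not from the question):

```javascript
// Keyset-style paging: instead of an OFFSET, the client sends the
// (score, id) of the worst post it already has, and the server returns
// only posts ranked strictly after that point.
function nextPage(posts, pageSize, cursor) {
  // posts: all candidates; sort best-first by score desc, id desc as tie-break
  const sorted = [...posts].sort(
    (a, b) => b.score - a.score || b.id - a.id
  );
  const after = cursor
    ? sorted.filter(p =>
        p.score < cursor.score ||
        (p.score === cursor.score && p.id < cursor.id))
    : sorted;
  return after.slice(0, pageSize);
}
```

Because the cursor pins the position by value rather than by row offset, posts that move up or down in the ranking between requests can no longer cause duplicates or gaps, and the client-side id de-duplication loop becomes unnecessary.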

How to refresh screen data in JavaScript without making the page slow?

I have a question in terms of code, NOT user experience. I have this JS:
$(document).on("click", "input:radio,input:checkbox", function() {
    getContent($(this).parent(), 0);
});
The above JS gets the contents of radios and checkboxes, and it refreshes the page to show dependencies. For example, if I check "yes" and the dependency for "yes" is to show a text box, the above works!
What I want to know is whether there is a better way to do the same thing, in a more friendly way, as this sometimes makes the pages slow. Especially if I do a lot of ticks/checks in one go, I miss a few because the parent refreshes!
If you have to hit your server in getContent(), then it will inevitably be slow.
However, you can save a lot if you send all the elements once instead of hitting the server each time a change is made.
Yet, if creating one super-large page is not an option, you need to keep your getContent() function. There is one possible improvement, in case you haven't already implemented it: cache all the data you queried earlier.
So you could have an object (a map) whose keys identify the data you're interested in. If a key is present, the data is already available and you return and use it directly from the cache. Otherwise, you have to hit the server.
One thing to do about the slowness you mention when 'ticking' things back and forth is to never send more than one request at a time to the server (with a timeout in case the server never replies). The process is:
1. You need data 'xyz'.
2. Is that data already cached? If yes, skip steps 3 and 4.
3. Is a request already in flight? If yes, push this request onto the request queue and return.
4. Send a request to the server, blocking any further request until the answer for 'xyz' is received.
5. Receive the answer, cache the data in an object (a map), and release the request queue.
6. Make use of the data as required.
7. Check the request queue; if it is not empty, pop the next request and resume from step 2.
The request processing should run on a timer because (1) it can time out and (2) it needs to run in the background (without pre-empting the GUI).
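A minimal sketch of that cache-plus-queue process, assuming a `fetchFn` that stands in for the real getContent() server call (all names are illustrative):

```javascript
// Wraps a server call with a cache and a "one in-flight request per
// key" rule: repeated requests for the same key reuse the pending
// promise instead of hitting the server again.
function makeContentLoader(fetchFn) {
  const cache = new Map();   // key -> data already received
  const pending = new Map(); // key -> in-flight promise

  return function getContent(key) {
    if (cache.has(key)) {                 // cache hit: skip the server
      return Promise.resolve(cache.get(key));
    }
    if (pending.has(key)) {               // already requested: join the queue
      return pending.get(key);
    }
    const p = fetchFn(key).then(data => { // hit the server
      cache.set(key, data);               // cache the answer
      pending.delete(key);                // release the queue for this key
      return data;
    });
    pending.set(key, p);
    return p;
  };
}
```

A timeout can be layered on top of `fetchFn` (for instance with `Promise.race` against a timer) so a dead server doesn't leave the queue blocked forever.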

Make More Server calls OR load all Data from server at once?

This is a performance-related question.
I have a requirement to show all the orders made by a customer during that day (there can be at most 10 orders).
The screen looks this way (screenshot omitted).
Right now, on click of an order row, I make an Ajax call, get the data from the server, and show it on the front end.
The end result looks this way (screenshot omitted).
I am thinking of another approach: during page start-up (document ready), load all the data related to that customer for that day and store it in a JavaScript variable (a global array),
then, on click of an order row, show the data by looping through the array.
Could anybody please tell me which is the best approach?
If you know that everyone opening that page will toggle all the rows, then go ahead and preload everything. Otherwise it is much better to load only the data you need, i.e. make small Ajax calls when the user requests data for a specific row.
The answer will depend on the specifics of your application, and how it is used.
How expensive is it to obtain the full list of orders? How often does a user need to see all the orders? Will your users tolerate short pauses while retrieving data from the server, or are they more likely to complain about page load time?
Neither approach is always better or worse, it just depends.
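If you do decide to preload, one small sketch (the endpoint and field names are assumptions) is to index the preloaded orders by id, so a row click becomes a constant-time lookup instead of a loop over the array:

```javascript
// One request at page load, then an index by order id.
const orderDetails = new Map();

function indexOrders(orders) {
  for (const o of orders) orderDetails.set(o.id, o);
  return orderDetails;
}

// On page load (document ready), something like:
// fetch('/orders/today').then(r => r.json()).then(indexOrders);

function onRowClick(orderId) {
  const order = orderDetails.get(orderId); // O(1), no array loop
  return order; // render into the detail panel here
}
```

With at most 10 orders the loop would be harmless, but the Map costs nothing extra and the click handler stays trivial.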

Preloading multiple large datasets in the background via ajax?

I'm developing a web application that is entirely in one page and is based on displaying a bunch of table data in grids. There are about 30 different tables in the database, any one of which can be requested by the user to display in a grid on their screen at any time. Most, but not all, of these tables have fewer than 1000 rows. The data for these grids is being called in and loaded via ajax.
Right now, I display the login screen and immediately preload a few of the main tables, so while the user is typing their user name and password, the initial grids are loading. This has really improved the user experience, since users don't have to wait for an Ajax call after clicking to see one of these data grids. As a result, I'm thinking of taking it a step further.
I am considering making Ajax calls to load all 30 tables in the background. They won't be added to the DOM until needed; instead they'll go into an array. The downside is that I don't know whether the user will use even half of these tables in their session, but the ones they do use will normally show immediately upon request, creating a better user experience.
So, my questions are: is it a good idea to store 30 full data tables (mostly about 50 to 1000 rows per table) in arrays via Ajax calls, and if so, what's the best way to do it for the best performance (keep in mind I am just putting them in arrays, not adding them to the DOM after preloading)? Which of the following would be best:
Make 30 ajax calls for each table on page load
Make 1 ajax call on page load that returns all the tables
Make like 5 ajax calls on page load that each return like 6 tables
Skew the ajax calls so I make a few and then once they complete, make a few more
Some other method...
I'd suggest loading the main tables in one call, and then the remaining tables another call. The main issue is that if you batch anything together, none of that information will be available to the application until the entire AJAX request completes. So if you load everything in one call, that may take a while, and the main tables won't be ready when the user finishes logging in.
Are these tables going to be edited, or only viewed?
If they will only be viewed, then it may make sense to load all 30 tables (assuming they don't have any fields containing BLOB data) and store them in the application's memory rather than session memory.
If the tables are going to be edited, then figure out which tables that user (or group?) uses often and preload those.
Again, I think your initial milestone is good enough; I'm not sure I like the idea of loading a possible 30,000 rows of data for the user, especially since much of it may never be used. What happens if the data changes in the database: are you going to sync it? How would you know it changed? Are you going to poll the database? You can see the issues that start to arise.
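For the skewed option in the question ("make a few and then once they complete, make a few more"), a rough sketch, with `loadTable` standing in for the real Ajax call:

```javascript
// Fetch tables in small batches: each batch runs in parallel, and the
// next batch only starts once the previous one has completed, so early
// tables become usable before the rest arrive and the server never
// sees 30 simultaneous requests.
async function preloadTables(tableNames, loadTable, batchSize = 5) {
  const results = {};
  for (let i = 0; i < tableNames.length; i += batchSize) {
    const batch = tableNames.slice(i, i + batchSize);
    const rows = await Promise.all(batch.map(loadTable));
    batch.forEach((name, j) => { results[name] = rows[j]; });
  }
  return results;
}
```

Putting the main tables in the first batch gives the same "ready by the time the user logs in" effect the question already achieves, without a single giant response blocking everything.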

Multiple AJAX Requests - Troublesome?

Right now I'm using Ajax to pull in a list of active streams (TwitchTV) and their viewers, and I'm requesting this every second. At times the list of streams to check can get quite lengthy, so I plan on splitting the Ajax requests into 2 or 3 parts:
1) Get Number of Viewers for Current Stream (Check every 1 Second)
2) Split Stream in Half and Check 1st Half of List for Active Streamers (Check every 5 Seconds)
3) Check 2nd Half of List for Active Streamers (Check every 5 Seconds)
So I would have 3 requests running simultaneously, but I'm worried about what the load will come to. Since the page is constantly pulling in data, would it become slower? Would the user be likely to notice? Is it better to keep one Ajax request for a big amount of data, or to use multiple Ajax requests for smaller pieces of data? And is Ajax really the best way to pull in constantly changing live data?
The answer to your various questions is probably "It depends":
The Ajax requests by themselves shouldn't make anything slower. They are asynchronous, so they only place any significant (and probably still not noticeable) load on the user's browser when a request completes.
One thing that could potentially slow your app down (or cause the user to notice in an unpleasant way) is the DOM manipulation when a request completes. Changing the current number of streaming viewers in place probably won't hurt, but depending on the number of streams and how you display them in a list, redrawing could be expensive and cause lag.
An alternative to Ajax (depending on which browsers you need to support) is WebSockets. That way you keep a connection open and the server tells the application when the data changes, instead of the client having to poll for it.
Why do you need to break your list up into a first half and a second half?
One way to cut down on the amount of data you send back and forth is to include a marker for the most recent piece of data you received. For example, when your timeline on twitter.com updates every few seconds, the Ajax request sends along the id of the most recent tweet it received, so the server knows not to waste time sending anything older than that. Depending on your use case, this might be effective.
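A minimal sketch of that last-id idea (Twitter's public API calls the parameter since_id, but the names and shapes below are illustrative, not their code):

```javascript
// Server-side filter: return only items newer than the id the client
// already has.
function newerThan(items, sinceId) {
  return items.filter(it => it.id > sinceId);
}

// Client-side poller: remembers the newest id seen so each poll asks
// only for the delta. Here the "server" is called directly for brevity;
// in practice sinceId would be sent as a request parameter.
function makePoller() {
  let sinceId = 0;
  return function poll(allItems) {
    const fresh = newerThan(allItems, sinceId);
    if (fresh.length) sinceId = Math.max(...fresh.map(it => it.id));
    return fresh;
  };
}
```

For a viewer count that changes every second this helps less, but for the active-streams list it means most one-second polls return an empty (and tiny) response.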
