Handling large data in a Java web application

I have been assigned a task to handle a large amount of data and show it on a web page in tabular form. I'm using HTML/JSP and JavaScript for the frontend and Java for the backend.
The business logic is to query the database (Oracle, in my case) and fetch the data. The query looks something like:
SELECT field1, field2, ... FROM table WHERE field1 = 'SearchString'
LIMIT 30
The search string is supplied by the user.
So each time the query executes I get 30 rows and store them in a bean.
Then, with the field2 data from iteration 1, I execute the query again, which gives another 30 rows; I append those to the bean, and the loop continues until there are no matching records. After that I need to display the bean's data in the UI in tabular form.
The problem arises when the data is huge. For example, the iteration runs 1000 times, yielding 30k records. The code then stays stuck in this loop for a long time while the UI just shows a loading screen.
Is there a better approach to my situation?
Note: I can't change the query in any way; that's forbidden.
Also, the query above is pseudo-code, not the actual query. Even if the first search has 30k matching rows, I have to take 30 per iteration.
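For reference, here is a minimal sketch of the loop described above (plain JDBC; table and column names are hypothetical, and Oracle 12c+ FETCH FIRST syntax stands in for the pseudo LIMIT):

    import java.sql.*;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch of the batched loop described above; names are hypothetical.
    public class BatchedFetch {
        public List<String[]> fetchAll(Connection conn, String searchString) throws SQLException {
            List<String[]> bean = new ArrayList<>();
            String sql = "SELECT field1, field2 FROM my_table "
                       + "WHERE field1 = ? FETCH FIRST 30 ROWS ONLY";
            String key = searchString;
            while (key != null) {
                String nextKey = null;
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setString(1, key);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            bean.add(new String[] { rs.getString(1), rs.getString(2) });
                            nextKey = rs.getString(2); // field2 seeds the next iteration
                        }
                    }
                }
                key = nextKey; // loop ends when a query returns no rows
            }
            return bean; // with 1000 iterations this blocks the request for a long time
        }
    }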

I agree with the comments that this is not best practice when you are trying to present thousands and thousands of rows in the UI...
It really sounds like you should implement pagination in your UI. This is done with queries that fetch one page at a time... I don't know what DB system you are using, but here is a guide on pagination for SQL Server.
You can explain to the business that pagination is better for the user. Use the example of how Google search gives you pages of search results instead of showing you millions of websites of cat pictures all on one page.
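As a hedged illustration of that advice: since the question mentions Oracle, with 12c+ you can page directly in SQL with OFFSET/FETCH, so each request pulls only one screenful. A minimal JDBC sketch, with hypothetical table and column names:

    import java.sql.*;
    import java.util.ArrayList;
    import java.util.List;

    // Minimal server-side pagination sketch for Oracle 12c+ (names are hypothetical).
    public class PageDao {
        private static final int PAGE_SIZE = 30;

        public List<String[]> fetchPage(Connection conn, String search, int page) throws SQLException {
            String sql = "SELECT field1, field2 FROM my_table WHERE field1 = ? "
                       + "ORDER BY field2 OFFSET ? ROWS FETCH NEXT ? ROWS ONLY";
            List<String[]> rows = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, search);
                ps.setInt(2, page * PAGE_SIZE);  // skip earlier pages
                ps.setInt(3, PAGE_SIZE);         // fetch just one page
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        rows.add(new String[] { rs.getString(1), rs.getString(2) });
                    }
                }
            }
            return rows;
        }
    }

The UI then requests page 0, 1, 2, ... as the user navigates, instead of blocking while the whole result set is assembled.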

Related

Show tables faster using ajax and jquery

I have thousands of records in the database, and the user interface is just one textbox where the user enters a date; records are searched based on that date. The search logic works like this: an AJAX call goes to a servlet, where all records are fetched by firing a query against the database, put into a Map, and sent back to the AJAX call in JSON format.
The AJAX code then iterates over the records with an each function, matches the records on date equality, and shows the matches using .html(header+data).
How can this whole process be made faster, or is there another way to show tables with thousands of records quickly at the click of a button? I need help on this. Sorry for the big question.
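One way to tighten the flow described above is to let the database do the date matching, so only the matching rows travel to the browser. A hypothetical servlet sketch (assuming Gson on the classpath and a DataSource wired up elsewhere; all table and column names are made up):

    import com.google.gson.Gson;
    import java.io.IOException;
    import java.sql.*;
    import java.util.*;
    import javax.servlet.ServletException;
    import javax.servlet.http.*;
    import javax.sql.DataSource;

    // Hypothetical sketch: push the date filter into the SQL WHERE clause
    // instead of shipping every record to the browser and filtering there.
    public class SearchByDateServlet extends HttpServlet {
        private DataSource dataSource; // assumed to be injected or looked up via JNDI

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String date = req.getParameter("date"); // e.g. "2015-06-01"
            List<Map<String, Object>> rows = new ArrayList<>();
            try (Connection conn = dataSource.getConnection();
                 PreparedStatement ps = conn.prepareStatement(
                         "SELECT id, name FROM records WHERE record_date = ?")) {
                ps.setDate(1, java.sql.Date.valueOf(date));
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        Map<String, Object> row = new LinkedHashMap<>();
                        row.put("id", rs.getInt("id"));
                        row.put("name", rs.getString("name"));
                        rows.add(row);
                    }
                }
            } catch (SQLException e) {
                throw new ServletException(e);
            }
            resp.setContentType("application/json");
            resp.getWriter().write(new Gson().toJson(rows)); // Gson assumed on classpath
        }
    }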

Full Text Indexing After Modifying a Record

I have an application that uses viewPanels to display data. One viewPanel displays unprocessed records and the other displays processed records. The user chooses an unprocessed record (using the "show values in this column as links" option) and is directed to a page where they input information. They then click a button that updates the documents using doc.replaceItemValue statements in JavaScript. The user is then directed back to the viewPanel that displays the unprocessed records. In order for the just-processed record not to show up among the unprocessed records, I have to reindex the database. I am using database.updateFTIndex(false) to accomplish this.
Is there a better way to accomplish this? If two or more users are submitting records, will their individual index updates step on each other?
I never had to worry about this when using MySQL.
Thanks for any advice.
I've used that technique in production for a while and have not been notified of any issues. Updating an index via the Database Properties or a View gives the message that it has been queued for update on the server, but I'm not sure whether the same happens with the programmatic call. It may well do.
In my scenario, I'm consolidating a lot of data into individual documents, so although use is intensive periodically, it's not a huge number of documents being updated at any one time.
I'm also running the update to the index via sessionAsSigner; I had assumed that would be needed for authority purposes.
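For reference, the equivalent programmatic call in the Domino Java API looks roughly like this (a sketch as a Java agent; in XPages you would invoke the same method on the database obtained via sessionAsSigner):

    import lotus.domino.AgentBase;
    import lotus.domino.Database;
    import lotus.domino.NotesException;
    import lotus.domino.Session;

    // Sketch of the programmatic full-text index update (Domino Java API).
    public class UpdateIndex extends AgentBase {
        @Override
        public void NotesMain() {
            try {
                Session session = getSession();
                Database db = session.getAgentContext().getCurrentDatabase();
                // false = update the existing index; true would create a new one
                db.updateFTIndex(false);
            } catch (NotesException e) {
                e.printStackTrace();
            }
        }
    }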

Way to access huge content from DB

I need to fetch a huge amount of data (maybe some 10K records) from the DB and show it as a report (I use DataTable), with data filter/search and pagination.
Question: which of the options below is the best/recommended way?
1. Fetch all the records at once and store them on the frontend (as an object); if a filter is applied, filter within the object and display the result. Likewise, I won't interact with the DB for pagination (since I already have all the records).
2. Contact the DB every time a filter/search is applied. Likewise for pagination: for example, if I select page 5, I send a query to the DB to get only that page's data and display it. Note: the number of records per page is also selectable.
If there is any other better way, please guide me.
Thanks,
I am not familiar with DataTable, but it appears to be similar to jqGrid, which I'm familiar with.
I prefer your proposed solution #2. You are better off fetching only what you need. If you're only displaying, say, 100 rows, it's wasteful (both in terms of bandwidth and local memory usage) to fetch 10k rows at once.
Use LIMIT on the MySQL side to fetch only the records you need. If you want, say, records 201 through 300 for page 3, you'd add LIMIT 200, 100 to the end of your query (the first parameter to LIMIT says "skip 200 rows" and the second says "fetch 100 rows"). If DataTable works like jqGrid, you should be able to re-query the database and repopulate your table when the user changes pages, and this fetch will be done in the background with AJAX, which conserves bandwidth. Your query will be identical except for the range specified by the LIMIT at the end.
Think of it this way: say you use GMail and you never archive your messages, so your inbox contains 20,000 emails, but only shows 100 per page. Do you think Google has designed the GMail front-end so that all 20k subject and from lines are fetched at once and stored locally, or is the server queried again when the user changes pages? (It's the latter.)
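A minimal JDBC sketch of that per-page fetch, assuming MySQL and hypothetical table/column names (a DataTable in server-side mode would pass the offset and page size with each request):

    import java.sql.*;
    import java.util.ArrayList;
    import java.util.List;

    // Fetch a single page of rows with MySQL's LIMIT offset, count (names hypothetical).
    public class ReportPageDao {
        public List<String[]> fetchPage(Connection conn, int offset, int pageSize) throws SQLException {
            String sql = "SELECT id, name, created FROM report_rows ORDER BY id LIMIT ?, ?";
            List<String[]> page = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setInt(1, offset);    // e.g. 200 -> start at row 201
                ps.setInt(2, pageSize);  // e.g. 100 rows per page
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        page.add(new String[] { rs.getString(1), rs.getString(2), rs.getString(3) });
                    }
                }
            }
            return page;
        }
    }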

ajax performance on loading massive list from backend

What I did for the CRUD in my app is select all the items from the backend, load them into the frontend, and loop over the items with JS; to be specific, I used AJAX.
Think of my app as a todo list. Even if a user inserts a new item, I suppose I still need to select all the items from the DB again after the insert query, right? The same goes for delete; I could use remove(), but I still need to reload so that my item IDs don't get messed up. Correct?
I'm using AngularJS ng-repeat, so I can't just do id++; I bind the id in ng-repeat to the object I got as JSON from the DB.
If I have a thousand items, that will cause a problem because I trigger the load function on the backend too often. How do I solve that?
Loading all the items from the back end is an invitation for disaster. It will kill both the back end and the front end, and it becomes a serious usability problem if you dump thousands of rows of data into the UI. How will the user wade through that data and act on it? Provide some way to filter the items. For example, if it is a todo list, display one day at a time (the default being today). For any other use case you can provide a similar filtering mechanism. That way you query limited data from the back end, take it to the UI, and display it. If you cannot filter like this, at least provide some way of pagination to limit the data you query and transport to the UI.
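A minimal sketch of that day-at-a-time filter, assuming a JDBC-backed todo table with hypothetical names:

    import java.sql.*;
    import java.time.LocalDate;
    import java.util.ArrayList;
    import java.util.List;

    // Query only one day's todo items instead of the whole table (names hypothetical).
    public class TodoDao {
        public List<String> findByDay(Connection conn, LocalDate day) throws SQLException {
            String sql = "SELECT title FROM todos WHERE due_date = ? ORDER BY id";
            List<String> items = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setDate(1, java.sql.Date.valueOf(day)); // default: LocalDate.now()
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        items.add(rs.getString(1));
                    }
                }
            }
            return items;
        }
    }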

Ajax issue: delay in getting data from web service using innerHTML, please guide

I am working on an AJAX application which will display about a million records in an HTML table. A web service returns the records from the server; I build a long string by concatenating data and tags, and then insert that string with innerHTML (not using the DOM, for better performance).
For testing I have put 6,000 records in the database (the stored procedure takes about 4 seconds to execute).
While testing on my local system (database and application on the same machine) it took about 5 minutes to display the records on the page. After deploying to the web server it did not respond for even longer. That looks like very poor performance. I put the records in a CSV file and it weighed less than 2 MB. I couldn't understand why string concatenation to build the HTML table and assigning the string to innerHTML takes such a huge amount of time (if that is the issue). The requirement is to show about a million records on a web page, but performance on just 6,000 records is disappointing. I don't know what to do to increase performance.
Kindly guide me.
You're trying to display a million records on a single page? No matter how you optimize your server code, that's a LOT of html to parse/render, especially if it's in a table.
Even using .innerHTML isn't going to "save" you any time. The rendering engine is still going to have to parse/style/render/position many millions of table rows/cells and you WILL have to wait while it's working.
If you absolutely HAVE to show all those records on a single page, try to break things up into manageable chunks. Have the AJAX call return (say) 100 records at a time, put those into the table, then fetch another 100 records, etc... At least that way you'll see the content of the page growing, rather than having to sit there and wait for 1,000,000 table rows to get displayed in a single shot.
A better option would be to do pagination, where only 100 records are shown at a time and you present standard navigation with << first / prev / next / last >> buttons to move through "pages" of data.
As Marc stated, you need pagination. See if this helps - How do I do pagination in ASP.NET MVC?
In addition to this, you could optimize the result by employing the master-detail pattern: fetch only the summary of the record (master) and, on some action in the master, fetch the details and display them on the screen. This will reduce the size of the data being transferred from the server.
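A rough sketch of that master-detail split, with hypothetical table and column names: the list query returns lightweight summaries for one page, and the heavy columns are fetched for a single record only on demand.

    import java.sql.*;
    import java.util.ArrayList;
    import java.util.List;

    // Master-detail sketch (names hypothetical): list summaries first,
    // fetch a single record's full details only when the user asks for it.
    public class MasterDetailDao {
        // Master: lightweight summaries for the visible page only.
        public List<String[]> listSummaries(Connection conn, int offset, int pageSize) throws SQLException {
            String sql = "SELECT id, title FROM records ORDER BY id LIMIT ?, ?";
            List<String[]> summaries = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setInt(1, offset);
                ps.setInt(2, pageSize);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        summaries.add(new String[] { rs.getString(1), rs.getString(2) });
                    }
                }
            }
            return summaries;
        }

        // Detail: the heavy columns for one record, fetched on demand.
        public String loadDetail(Connection conn, int id) throws SQLException {
            String sql = "SELECT body FROM records WHERE id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setInt(1, id);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString(1) : null;
                }
            }
        }
    }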
