search autocomplete performance with many records - javascript

I'm developing an application that is essentially a search bar. The source is a SQL table that has about 300,000 records.
Ideally, I would like to have some sort of autocomplete feature attached to this search bar. I've been looking into several options, such as jQuery autocomplete.
However, as one can imagine, loading all of these records as the source for the autocomplete is impossible. The performance would be abysmal.
So my question is, what is an efficient way to implement a search autocomplete feature for a source that contains thousands and thousands of records?
I thought of something like the snippet below: essentially, I query the database each time the user types something and use the results. However, firing a database query via AJAX on every keystroke doesn't seem optimal.
$( "#search" ).keyup(function( event ) {
$.ajax({
//query the database when the user begins typing, get first 1000 records
//set the source of the autocomplete control to the returned result set
});
});

You should not start querying the DB on the very first keyup (not even on the third or fourth).
For example, suppose the user is typing "Albatross". When they hit 'A', a query would match almost all 300,000 records right away, because nearly every row contains the letter "A".
So ignore the first few (3-5) letters. It is even better if you can store the search keywords: cache the top results, and for the first 1-3 keystrokes show the top search keywords instead of querying. Autocomplete might not be a good feature for searching a DB that big.
A last tip for the problem: your users use Google and Facebook every day, and there are more than 300,000 results for almost any search in either of those applications, yet neither shows 1,000 results at once. That would be bad for the UI and for your server's bandwidth. Think about how you can categorize and present the data so that users get what they want while you keep your server's bandwidth and processing cost optimal.
Always remember the context.
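As a minimal sketch of that advice, assuming the jQuery UI Autocomplete widget: its minLength and delay options hold requests back until enough characters have been typed and the user pauses (search.php is a placeholder for your own endpoint).

$('#search').autocomplete({
    minLength: 3, // don't query until at least 3 characters are typed
    delay: 300,   // wait 300 ms after the last keystroke before querying
    source: function (request, response) {
        // request.term holds the current input; the server should cap the rows returned
        $.getJSON('search.php', { q: request.term }, response);
    }
});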

Do not bind any events yourself. jQuery Autocomplete already performs bindings.
The proper way to implement this is to set the source option to an AJAX callback:
$('#searchBox').autocomplete({
    source: function (request, response) {
        $.ajax({
            url: 'yourQueryUrl.php', // <- this script/URL should limit the number of records returned
            async: true,
            cache: false,
            type: 'GET',
            data: {q: request.term}, // request.term is the text currently in the input
            dataType: 'json',
            success: function (data) {
                response(data); // hand the results to the widget
            }
        });
    }
});

I am assuming you have added indexes to your table; if not, that should be your first step. Then, if performance is still insufficient and your queries often repeat, you might want to look at
http://memcached.org/
or some other caching mechanism.
Upon the first request for some key you would query the database, return the result, and add it to the cache; upon subsequent requests for the same key the data would be read from the cache instead of hitting the database. That reduces the load and increases the speed.
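The same idea works on the client side too. Below is a minimal sketch that memoizes autocomplete responses per term in a plain object, so repeating a term never hits the server twice (the endpoint name is taken from the answer above):

var cache = {}; // term -> previously returned suggestions

$('#search').autocomplete({
    source: function (request, response) {
        if (request.term in cache) {
            response(cache[request.term]); // serve from cache, no request made
            return;
        }
        $.getJSON('yourQueryUrl.php', { q: request.term }, function (data) {
            cache[request.term] = data; // remember for next time
            response(data);
        });
    }
});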

Related

Can we split big ajax calls into multiple smaller calls to load data faster?

I used the AJAX call below to retrieve data from the database and show it on my page.
$.ajax({
    type: "POST", url: "MyPage.aspx/LoadGrid",
    data: "{idyear:'2020'}",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (response) {
        $(".gridBody").html(response.d);
    },
    error: function (response) { // note: $.ajax has no "failure" option; the handler is called "error"
        alert(response.d);
    }
});
Currently this operation returns 1026 records and takes around 12 seconds.
Since this process is time consuming and there will be more records in the future, I have to find an alternative way to load the data faster.
So, I tried another approach. I decided to get the total number of records first. For example, I now have 1026 records, and if I want to load my data in bundles of 100 records, I need to make 11 AJAX calls simultaneously and combine the results at the end of all the calls.
I thought that with this method I could start all the calls together and wouldn't have to wait for one call to finish before starting a new one.
var pagesize = 100;
getQty(function () { // getQty is assumed to set the global `qty` before invoking the callback
    var pagesqty = Math.ceil(qty / pagesize);
    var parts = []; // collect each page's HTML by index instead of eval'ing variable names
    var control = 0;
    for (let i = 0; i < pagesqty; i++) { // `let` keeps the right `i` in each success closure
        $.ajax({
            type: "POST", url: "MyPage.aspx/LoadGridMtoN",
            data: "{idyear:'2020' , page:'" + i + "' , pagesize:'" + pagesize + "'}",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                parts[i] = response.d;
                control += 1;
                if (control == pagesqty) { // all pages have arrived; join them in order
                    $(".gridBody").html(parts.join(""));
                }
            },
            error: function (response) { // $.ajax uses "error", not "failure"
                alert(response.d);
            }
        });
    }
});
But now I am getting a timeout error while executing the AJAX calls.
Does anyone know a better way?
P.S.: I wanted to try a web worker, but it seems that I can't use jQuery in a web worker.
P.S.: I don't want to use paging. I need to load all the data together.
Please note that simultaneously calling endpoints from the client side (instead of making a single call) has more overhead and is not a good solution. For example, your server has to handle more connections from the client, your DB has to handle more connections from your ORM in the back end, and your table (and correspondingly your disk and IO) is put under more pressure, etc.
That said, assuming all the parts of your system are well designed, incrementally loading the data is a good solution from both a UX and a technical point of view: load the first 100 records, and while the user is looking at them load the second 100 records and append them to the end of the list (or load more when the user scrolls down).
Ultimately, though, you have to consider pagination. You can't load 1M records on a web page, and no one reads all the records on a page anyway. So you have to limit the number of records fetched and use server-side pagination, or provide another approach: let users submit their request, process it on the server, write the result to a raw text file or an Excel file, and email them the download link.
The answer you don't want to hear is that you need to fix your server side. 1026 records really should not take 12 seconds, especially for data being retrieved for the client. Consider reducing the number of JOINs in your SQL statement and aggregating the data in server-side logic instead; also try running EXPLAIN against your queries and add indices where appropriate.
To answer your question about splitting AJAX calls...
It looks like you have implemented pagination, so perhaps create an asynchronous recursive function that obtains 5-10 records at a time, incrementing the page number and recursing after each promise response, as sketched below. You can populate the page as results arrive, so the user sees data without waiting as long. However, you must understand that this increases the volume of requests to your server, and it will probably end up taking longer to obtain all of the data.
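A hedged sketch of that recursive approach, reusing the question's LoadGridMtoN endpoint and parameter names (everything else is an assumption):

var pagesize = 100;

function loadPage(page, pagesqty) {
    if (page >= pagesqty) return; // all pages loaded
    $.ajax({
        type: "POST", url: "MyPage.aspx/LoadGridMtoN",
        data: "{idyear:'2020' , page:'" + page + "' , pagesize:'" + pagesize + "'}",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (response) {
            $(".gridBody").append(response.d); // show this page immediately
            loadPage(page + 1, pagesqty);      // then request the next one
        }
    });
}

loadPage(0, Math.ceil(qty / pagesize)); // qty obtained beforehand, e.g. via getQty()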
I feel the way you are trying to accomplish this goal is bad practice.
Assuming you can make changes to the server side:
create a new table with all the fields that you are going to need on the front end;
write a stored procedure to update this table on a regular basis;
use this table in your AJAX call to fetch the records;
use pagination (a sketch follows below); no one is going to use 1000+ records at a time;
give a search option at the top, in case you feel the user must have access to all the records.
As suggested in other answers, don't create multiple AJAX calls. You will only end up regretting it and creating a bottleneck for yourself in later stages.
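A minimal sketch of such a paged fetch over the precomputed table, assuming a hypothetical GetPage endpoint that accepts a page number, a page size, and an optional search term:

var pagesize = 100;

function showPage(page, search) {
    $.ajax({
        type: "POST", url: "MyPage.aspx/GetPage", // hypothetical endpoint over the new table
        data: JSON.stringify({ page: page, pagesize: pagesize, search: search || "" }),
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (response) {
            $(".gridBody").html(response.d); // show only the requested page
        }
    });
}

showPage(0); // first page on load; call again with a new page number or search term as the user navigates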

Datatables colReorder saving WITHOUT statesave

I am using DataTables with colReorder and need to save the state of the columns without using stateSave (I am not allowed to use the local cache). I do, however, have a preference table in my database for storing this kind of information in JSON format.
I have looked at colReorder.order(), which looks like what I need to get the order.
What I'm thinking so far: on a column change, call colReorder.order(), store the returned array in my preferences table, and then on re-initialization use it to re-order the table.
So my question/what I need help on is this: on a change of the column order, I need to save the order the columns are in and update my preferences. How do I do this? I can't seem to find "where" to place the colReorder.order() call. I haven't seen an onChange() for DataTables, and I'm not even sure that would be the best way to approach this.
EDIT: David's answer is the ideal solution, however it is not applicable in my situation due to existing code and laziness.
My solution/workaround was to stringify details.mapping from within the column-reorder event handler, save it to my preferences, and on initialization of my table call colReorder.order(savedArray, true).
I'm leaving this here in case anyone finds themselves in the situation I was in.
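A hedged sketch of that workaround, assuming colReorder's column-reorder event and a hypothetical savePrefs endpoint:

var table = $('#example').DataTable({ colReorder: true });

// fires whenever the user drags a column to a new position
table.on('column-reorder', function (e, settings, details) {
    $.post('savePrefs', { order: JSON.stringify(details.mapping) }); // hypothetical endpoint
});

// on re-initialization, restore the saved order:
// table.colReorder.order(savedArray, true);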
Actually, DataTables provides methods for storing and retrieving state to and from an alternative location. Look at stateSaveCallback and stateLoadCallback.
I do however have a preference table in my database for storing this
kind of information in JSON format
Then you just need to fill in the blanks. Let's assume you have a server-side script called statesave that can store and retrieve the state via 'set' and 'get' actions using a unique userId. The skeleton would look like this:
$('#example').DataTable({
    stateSave: true,
    stateSaveCallback: function(settings, data) {
        $.ajax({
            url: 'statesave',
            dataType: 'json',
            data: {
                action: 'set',
                userId: aUserId,
                state: data
            }
        });
    },
    stateLoadCallback: function(settings) {
        var state;
        $.ajax({
            url: 'statesave',
            dataType: 'json',
            async: false, // the callback must return the state synchronously
            data: {
                action: 'get',
                userId: aUserId
            },
            success: function(data) {
                state = data;
            }
        });
        return state;
    }
});
It sounds like you are using the server-side processing option. If that's the case, you can add the column reorder array to the parameters sent with each request and save it that way.
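If you do use server-side processing, a hedged sketch of sending the order with every request via the ajax.data callback (the endpoint name and parameter name are assumptions):

var table = $('#example').DataTable({
    serverSide: true,
    colReorder: true,
    ajax: {
        url: 'data-source', // hypothetical server-side processing endpoint
        data: function (d) {
            // append the current column order to the request parameters
            // (guarded because the first draw runs before `table` is assigned)
            if (table) { d.colOrder = JSON.stringify(table.colReorder.order()); }
        }
    }
});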

What do these AJAX Parameters do with Wikipedia's API?

I'm analyzing this CodePen's code, which lets the user search for any item on Wikipedia (through Wikipedia's API); the search engine shows the first 10 results with brief summaries. Analyzing other people's code is (IMO) one of the best ways to learn, along with reading guidebooks and finishing tutorials.
The AJAX code I couldn't understand is this:
$.ajax({
    url: "https://en.wikipedia.org/w/api.php",
    jsonp: "callback",
    dataType: 'jsonp',
    data: {
        action: "query",
        list: "prefixsearch",
        pssearch: $(".searchbox").val(),
        pslimit: "10",
        format: "json"
    },
    xhrFields: {
        withCredentials: true
    },
    success: updateSuggest,
    error: function(err) {
        console.log(err);
    }
});
I don't understand what these four data parameters (action, list, pssearch, pslimit) do. What exactly are these parameters' functions? Can someone explain them? For example, what do pssearch, list, and pslimit do?
I tried looking these terms up in the API docs, on the jQuery website, and in Google searches, but to no avail.
These properties are for searching by title prefix:
action: "query": performs a query action to fetch data.
list: "prefixsearch": "Perform a prefix search for page titles." (docs, prefixsearch)
pssearch: the search string. (docs)
pslimit: limits the number of entries returned. (docs)
The Prefixsearch documentation has a short explanation of most of these parameters.
They become the URL query parameters, so the actual request ends up looking like:
https://en.wikipedia.org/w/api.php?action=query&list=prefixsearch.....&format=json
The API documentation provides the specifics for each option.

JQuery Autocomplete - multiple data post not working

$("#textjawatan" + noid).autocomplete({
source: function( request, response ) {
$.ajax({
url: "pendaftar_table.php",
dataType: "json",
data: { term : "pe" } ,
success: function( data ) {
response( data );
}
});
},
minLength: 2,
select: function(event, ui) {
$("input#jawatan" + noid).val(ui.item.no);
}
});
When I use
data: { term : "pe" }
it works, but when I use
data: { term : "pe", id : "jawatan" }
it doesn't work. What is the problem?
I actually had this problem not too long ago. Let me explain in a few words.
As the jQuery UI site explains (http://api.jqueryui.com/autocomplete/), jQuery Autocomplete only sends term to the server. Let me quote the page:
Function: The third variation, a callback, provides the most flexibility and can be used to connect any data source to Autocomplete. The callback gets two arguments: A request object, with a single term property, which refers to the value currently in the text input. For example, if the user enters "new yo" in a city field, the Autocomplete term will equal "new yo". A response callback, which expects a single argument: the data to suggest to the user. This data should be filtered based on the provided term, and can be in any of the formats described above for simple local data. It's important when providing a custom source callback to handle errors during the request. You must always call the response callback even if you encounter an error. This ensures that the widget always has the correct state.
In other words, even if you add 250 extra keys to the JSON dict, only term will be sent to the server side.
How did I fix this?
What I did was this: based on the jQuery docs, the source can also be an ARRAY, so I made an AJAX call to my server BEFORE setting up the jQuery autocomplete and then fed the result to the autocomplete plugin, as sketched below.
Note: This is not a very good fix, and I'm aware of it. But I had to do it due to the details of my job, and it is just one option.
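A hedged sketch of that workaround, reusing the question's endpoint and field names (the response is assumed to be a plain array of suggestions):

// fetch the suggestions up front, with any extra parameters you need...
$.ajax({
    url: "pendaftar_table.php",
    dataType: "json",
    data: { term: "pe", id: "jawatan" }, // extra keys are fine here, outside the widget
    success: function (data) {
        // ...then feed the returned array to the widget as a static source
        $("#textjawatan" + noid).autocomplete({
            source: data,
            minLength: 2
        });
    }
});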

Using multiple ajax calls in one javascript function

Thanks in advance for any help.
Is it bad practice and/or inefficient to use multiple $.ajax calls in one JavaScript function? I've been working on one, and I have a simple testing environment set up on my computer (an Apache server with PHP/MySQL); I've noticed that the server will crash (and restart) if I have multiple AJAX calls.
I have two AJAX calls currently: one passes four pieces of data to a PHP file and returns about three lines of code (pulling info from my SQL database); the other simply gets the total number of rows from the table I'm working with and assigns that number to a JavaScript variable.
Is it just that my basic testing setup is too weak, or am I doing something wrong? See below for the two AJAX calls I'm using:
$.ajax({
    type: "GET",
    url: "myURLhere.php",
    cache: false,
    data: {img : imageNumber, cap : imageNumber, width : dWidth, height : dHeight}
})
.done(function(htmlpic) {
    $("#leftOne").html(htmlpic);
});

$.ajax({
    type: "GET",
    url: "myotherURLhere.php",
    cache: false,
    success: function(data) {
        lastImage = data;
    }
});
Short answer: two AJAX requests on a page are absolutely fine.
Longer answer:
You have to find the balance between minimizing the number of AJAX calls to the back end, which reduces traffic and communication overhead, and still keeping the architecture maintainable (so do not pass dozens of parameters in one call to retrieve everything, unless you do it in a well-designed way that collects every parameter to send).
Also, most likely there's something wrong with your back-end setup; try looking into the web server logs.
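If the two calls are independent, as in the question, they can at least be coordinated with $.when so the follow-up work runs once both have finished. A minimal sketch using the question's URLs:

$.when(
    $.ajax({ type: "GET", url: "myURLhere.php", cache: false,
             data: {img : imageNumber, cap : imageNumber, width : dWidth, height : dHeight} }),
    $.ajax({ type: "GET", url: "myotherURLhere.php", cache: false })
).done(function (picResult, countResult) {
    // for multiple Deferreds, each argument is a [data, statusText, jqXHR] array
    $("#leftOne").html(picResult[0]);
    lastImage = countResult[0];
});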
