JSON vs multiple connections - javascript

I'm currently building a website which searches an external database and brings up records that match the given search string. The search is live, so results appear as the user types.
The first (and current) approach I took is that the page connects to the MySQL server and retrieves content via AJAX with EVERY letter the user types in the search box.
Now I'm starting to look at JSON objects (I only very recently started building websites), and I was wondering if it would be a good idea to load the entire database into a JSON object at the beginning and then search through that instead.
Is this a good idea? Would it be faster? Thanks in advance.

It totally depends on the size of the data and the complexity of the query. If you can reasonably send the data to the client in advance and then search it locally, then sure, that's useful because it's all local and you don't have the latency of querying the server. But if you have a large amount of data, or the query is complex, it may well make more sense to do the query on the server.
There's no one-size-fits-all solution, it's data-dependent.
...and retrieves content via AJAX, with EVERY letter the user types in the search box.
That's usually overkill. Normally, you want to wait until there's a pause in the user's typing before firing off the ajax call, so that if they type "james" in rapid succession, you search for "james" rather than searching for "j", then "ja", then "jam", then "jame", and then "james".
For instance, let's say your search trigger is a keypress event. This would be a fairly common approach:
var keypressTimer = 0;

function handleKeypress() {
    if (keypressTimer) {
        clearTimeout(keypressTimer); // note: clearTimeout, not cancelTimeout
    }
    keypressTimer = setTimeout(doSearch, 100); // 100ms = 1/10th of a second
}

function doSearch() {
    var searchValue;
    keypressTimer = 0;
    searchValue = /*...get the search value...*/;
    doAjaxCallUsing(searchValue);
}
This is called "debouncing" the input (from hardware engineering, related to the mechanical and electrical "bouncing" of a key as it's pressed).
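For completeness, here is a rough sketch of wiring that handler up; the #search id, the addEventListener call, and the fetch-based doAjaxCallUsing stub are illustrative assumptions, not part of the question:

// Hypothetical markup: <input type="text" id="search">
var searchBox = document.getElementById('search');
searchBox.addEventListener('keypress', handleKeypress); // 'input' would also catch paste and delete

// Stub for the AJAX call used above; '/search' is a made-up endpoint.
function doAjaxCallUsing(searchValue) {
    fetch('/search?q=' + encodeURIComponent(searchValue))
        .then(function (response) { return response.json(); })
        .then(function (results) {
            // render the matching records here
        });
}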

Related

Attempting to use a global array inside of a JS file shared between 2 HTML files and failing

So I have one HTML page which consists of a bunch of form elements for the user to fill out. I push all the selections that the user makes into one global variable, allTheData[] inside my only Javascript file.
Then I have a 2nd HTML page which loads in after a user clicks a button. This HTML page is supposed to take some of the data inside the allTheData array and display it. I am calling the function to display allTheData by using:
window.onload = function () {
    if (window.location.href.indexOf('Two') > -1) {
        carousel();
    }
};

function carousel() {
    console.log("oh");
    alert(allTheData.toString());
}
However, I am finding that nothing gets displayed on my 2nd HTML page, and the allTheData array appears to be empty despite it being filled out previously on the 1st HTML page. I am pretty confident that I am correctly pushing data into the allTheData array, because when I use alert(allTheData.toString()) while I'm still on the 1st HTML page, all the data gets displayed.
I think something happens during the transition from the 1st to the 2nd HTML page that causes the allTheData array to empty, but I am not sure what it is. Please help a newbie out!
Web Storage: Each page load gets a fresh JavaScript context, so a global variable filled on the 1st page is gone by the time the 2nd page's script runs. This sounds like a job for the window.sessionStorage object, which along with its cousin window.localStorage allows data-as-strings to be saved in the user's browser for use across pages on the same domain.
However, keep in mind that they are both Cookie-like features, and therefore their effectiveness depends on the user's Cookie preferences for each domain.
A simple condition will determine if the web storage option is available, like so...
if (window.sessionStorage) {
    // continue with app ...
} else {
    // inform user about web storage
    // and ask them to accept Cookies
    // before reloading the page (or whatever)
}
Saving to and retrieving from web storage requires conversion to-and-from String data types, usually via JSON methods like so...
// save to...
var array = ['item0', 'item1', 2, 3, 'IV'];
sessionStorage.myApp = JSON.stringify(array);
// retrieve from...
var array = JSON.parse(sessionStorage.myApp);
There are more specific methods available than these (setItem, getItem, removeItem, etc.). Further details and compatibility tables are in Using the Web Storage API on MDN.
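Applied to the two pages in the question, a minimal sketch might look like this (the 'allTheData' storage key is just an illustrative name):

// Page 1: after filling the global array, persist it before navigating away
sessionStorage.setItem('allTheData', JSON.stringify(allTheData));

// Page 2: rebuild the array from storage before using it
var allTheData = JSON.parse(sessionStorage.getItem('allTheData') || '[]');

function carousel() {
    alert(allTheData.toString());
}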
Hope that helps. :)

performance advice regarding multiple facebook Json requests

I'm developing a Facebook app which searches for Facebook events near your position. The only way to do so is to search for all the place IDs in your zone and then, for each of those, check if there is an event today. The problem I have is that the computation takes around 1-1:30 min, which is quite long. This is the code I use (it might not be the best, I know):
foreach (var item in allPlacesIds)
{
    RunOnUiThread (() => loading.Text = string.Format ("Loading {0} possible events out of {1}", count, allPlacesIds.Count));
    string query = string.Format ("{0}?&fields=id,name,events.fields(id,name,description,start_time,attending_count,declined_count,maybe_count,noreply_count).since({1}).until({2})", item, dateNow, dateTomorrow);
    JsonObject result = (JsonObject)fb.Get (query, null);
    try
    {
        JsonArray allEvents = (JsonArray)((JsonObject)result ["events"])["data"];
        foreach (var events in allEvents)
        {
            Events theEvent = new Events(((JsonObject)events) ["id"].ToString(),
                ((JsonObject)events) ["name"].ToString(),
                ((JsonObject)events) ["description"].ToString(),
                ((JsonObject)events) ["start_time"].ToString(),
                int.Parse(((JsonObject)events) ["attending_count"].ToString()),
                int.Parse(((JsonObject)events) ["declined_count"].ToString()),
                int.Parse(((JsonObject)events) ["maybe_count"].ToString()),
                int.Parse(((JsonObject)events) ["noreply_count"].ToString()));
            todaysEvents.Add(theEvent);
        }
    }
    catch (Exception ex)
    {
    }
    count++;
}
Where the try block starts, I used to have an if check, but that made it take even longer, so I replaced it with a try/catch, since the result sometimes comes back as null.
I know this isn't exactly a technical issue, but I felt maybe you guys know a faster and better implementation of this. My only other option is to create and host a web service and use that just to query the data; the problem with that is that I would need to invest a lot of money in a server/real IP, and then I would need to create a scheduled job to update the data daily.
Each API call takes some time, so the only way to make it faster is to use Batch Requests. Here's the documentation about those: https://developers.facebook.com/docs/graph-api/making-multiple-requests
Keep in mind that this will not count as one API call; it's still the same amount, so be careful with API limits.
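The general shape of a batch call, sketched here in JavaScript rather than the C# SDK used above (ACCESS_TOKEN is a placeholder and the field list is trimmed for brevity):

// Build one batch entry per place ID; the Graph API allows up to 50 per batch.
var batch = allPlacesIds.slice(0, 50).map(function (id) {
    return {
        method: 'GET',
        relative_url: id + '?fields=id,name,events.fields(id,name,start_time)'
    };
});

// One round trip instead of one request per place.
// ACCESS_TOKEN stands in for a valid user/app token.
fetch('https://graph.facebook.com', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: 'access_token=' + ACCESS_TOKEN +
          '&batch=' + encodeURIComponent(JSON.stringify(batch))
})
    .then(function (res) { return res.json(); })
    .then(function (responses) {
        responses.forEach(function (r) {
            if (r && r.code === 200) {
                var place = JSON.parse(r.body); // each batch response body is a JSON string
                // inspect place.events here
            }
        });
    });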

search items that load data from DB in Knockout JS

In my application, I have an observableArray that loads data from the DB. This observableArray is filled with the first 25 items from the DB, and when the list is scrolled down it loads another 25, and so on.
Now, I want to implement a search that returns results from the whole dataset in the DB, not just from the 25 items currently displayed.
I tried to get the search result by sending the whole search text to the DB on clicking the search button, but there is a lot of data in the DB, so loading takes a long time.
Please let me know how I can get the desired result from the DB within milliseconds. Thanks in advance.
To get a well behaving search with Knockout, you should extend your searchText observable that is bound to the input with a rate-limiter
this.searchText = ko.observable('').extend({ rateLimit: { timeout: 500, method: "notifyWhenChangesStop" } })
This will call any subscribers after the input has remained unchanged for 500ms (i.e. when the user stops typing). You could also use the default method notifyAtFixedRate to call the API at least once every X seconds.
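A minimal sketch of acting on that rate-limited observable (the /api/search endpoint, the take parameter, and the use of jQuery's $.getJSON are illustrative assumptions):

function ViewModel() {
    var self = this;
    self.items = ko.observableArray([]);
    self.searchText = ko.observable('').extend({
        rateLimit: { timeout: 500, method: "notifyWhenChangesStop" }
    });

    // Fires at most once per pause in typing, so the DB is not hit on every keystroke.
    self.searchText.subscribe(function (text) {
        $.getJSON('/api/search', { q: text, take: 25 }, function (results) {
            self.items(results);
        });
    });
}

ko.applyBindings(new ViewModel());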
And to top it off, a fiddle!
Note: with this being said, if your query is taking 40 seconds, that sounds like a problem with your database query. It's possible that it's taking that long because you're flooding the server with requests, but that still seems awfully slow. This is the strategy we use and it works well, but our API response time is <200ms.

Parsing a large JSON array in Javascript

I'm supposed to parse a very large JSON array in JavaScript. It looks like:
mydata = [
    {'a':5, 'b':7, ... },
    {'a':2, 'b':3, ... },
    ...
]
Now the thing is, if I pass this entire object to my parsing function parseJSON(), then of course it works, but it blocks the tab's process for 30-40 seconds (in case of an array with 160000 objects).
During this entire process of requesting this JSON from a server and parsing it, I'm displaying a 'loading' gif to the user. Of course, after I call the parse function, the gif freezes too, leading to bad user experience. I guess there's no way to get around this time, is there a way to somehow (at least) keep the loading gif from freezing?
Something like calling parseJSON() on chunks of my JSON every few milliseconds? I'm unable to implement that though being a noob in javascript.
Thanks a lot, I'd really appreciate if you could help me out here.
You might want to check this link. It's about multithreading.
Basically :
var url = 'http://bigcontentprovider.com/hugejsonfile';

// The worker body has to be a single string; concatenation keeps it valid JS.
var f = '(function() {' +
        '    send = function(e) {' +
        '        postMessage(e);' +
        '        self.close();' +
        '    };' +
        '    importScripts("' + url + '?format=json&callback=send");' +
        '})();';

var _blob = new Blob([f], { type: 'text/javascript' });
var _worker = new Worker(window.URL.createObjectURL(_blob));
_worker.onmessage = function(e) {
    // Do what you want with your JSON (it arrives as e.data)
};
_worker.postMessage(''); // postMessage needs an argument; the worker starts on creation anyway
Haven't tried it myself to be honest...
EDIT about portability: Sebastien D. posted a comment with a link to MDN. I just added a reference to the compatibility section there.
I have never encountered a complete page lock down of 30-40 seconds, I'm almost impressed! Restructuring your data to be much smaller or splitting it into many files on the server side is the real answer. Do you actually need every little byte of the data?
Alternatively, if you can't change the file, @Cyrill_DD's answer of a worker thread will be able to parse the data for you and send it to your primary JS. This is not a perfect fix, as you might guess. Passing data between the two threads requires the information to be serialised and reinterpreted, so you could see a significant slowdown when the data crosses between the threads, and you'd be back to square one if you try to pass all the data across at once. Building a query system into your worker thread for requesting chunks of the data when you need them, and using the message callback, will prevent the slowdown from parsing on the main thread and give you complete access to the data without loading it all into your main context.
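A rough sketch of that chunk-request idea, assuming the worker already holds the parsed array (the message names and renderRows are made up for illustration):

// Inside the worker: keep the parsed array and answer chunk requests.
var parsed = []; // filled once, after the JSON has been fetched and parsed in the worker

self.onmessage = function (e) {
    if (e.data.type === 'getChunk') {
        var start = e.data.start, size = e.data.size;
        // Only the requested slice crosses the thread boundary and gets serialised.
        postMessage({ type: 'chunk', start: start, items: parsed.slice(start, start + size) });
    }
};

// On the main thread: ask for 1000 records at a time, as they are needed.
_worker.postMessage({ type: 'getChunk', start: 0, size: 1000 });
_worker.onmessage = function (e) {
    if (e.data.type === 'chunk') {
        renderRows(e.data.items); // hypothetical rendering function
    }
};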
I should add that worker threads are relatively new, main browser support is good but mobile is terrible... just a heads up!

Speeding up an app that makes many Facebook API calls

I've got a simple app that fetches a user's complete feed from the Facebook API in order to tally the number of words he or she has written total on the site.
After he or she authenticates, the page makes a Graph call to /me/feed?limit=100 and counts the number of responses and their dates. If there is a "next" cursor in the response, it then pings that next URL, which looks something like this:
https://graph.facebook.com/[UID]/feed?limit=100&until=1386553333
And so on, recursively, until we reach the time that the user joined Facebook. The function looks like this:
var words = 0;
var posts = function(callback, url) {
    url = url || '/me/posts?limit=100';
    FB.api(url, function(response) {
        if (response.data) {
            response.data.forEach(function(status) {
                if (status.message) {
                    words += status.message.split(/ /g).length;
                }
            });
        }
        if (response.paging && response.paging.next) {
            posts(callback, response.paging.next);
        } else {
            alert("You wrote " + words + " on Facebook!");
        }
    });
};
This works just fine for people who have posted a total of up to 4,000 statuses, but it really starts to crawl for power users with 10,000 lifetime updates or more. Each response from the API is only about 25Kb, but I cannot figure out what's straining the most.
After I've added the number of words in each status to my total word count, do I need to specifically destroy the response object so as not to overload memory?
Alternatively, is the recursion depth a problem? We're realistically talking about a total of 100 calls to the API for power users. I've experimented with upping the limit on each call to fetch larger chunks, but it doesn't seem to make a huge difference.
Thanks.
So, you're doing this with the JS SDK I guess, which means this runs in the browser... Did you try running this in Chrome and watching the network monitor to see the response times, etc.?
With 100 requests, this also means that the data object/JSON must be about 2.5 MB in size, which for some browsers/machines could be quite challenging, I guess. Also, it must take quite a while to fetch the data from FB. What does the user see in the meantime?
Did you think of implementing this in the backend on the server side, and then just passing the results to the frontend?
For example, use NodeJS together with SocketIO to do it on the server side and dynamically update the word count.
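A very rough sketch of that server-side idea (the event names and token handling are assumptions, and it needs Node 18+ for the built-in fetch):

// server.js - count words on the server and stream progress to the browser
const express = require('express');
const { Server } = require('socket.io');

const app = express();
const httpServer = app.listen(3000);
const io = new Server(httpServer);

io.on('connection', (socket) => {
    socket.on('countWords', async (accessToken) => {
        let words = 0;
        let url = 'https://graph.facebook.com/me/posts?limit=100&access_token=' + accessToken;
        while (url) {
            const page = await (await fetch(url)).json();
            (page.data || []).forEach((status) => {
                if (status.message) words += status.message.split(/ /g).length;
            });
            socket.emit('progress', words); // push the running total to the client as we go
            url = page.paging && page.paging.next;
        }
        socket.emit('done', words);
    });
});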
