Please note: this is not exactly the same as this question. Although it looks similar, I have some very specific requirements.
In my application, I have two panels to update with a single AJAX request.
Currently, I am sending JSON, constructing two DOM fragments in JS, and appending them in the corresponding places.
This is becoming difficult to maintain, and a lot of processing happens on the client side. So, I am thinking of sending HTML snippets instead.
The problem is that I have two panels, so with a single AJAX request, how should I bring back two different HTML snippets for the two panels?
Also:
1) I am sending the AJAX request periodically, every 30 seconds, and the JSON includes a CRC-32 of the data. So, if there is no change in the data, I don't have to recreate the DOM (see the sketch after this list).
2) Replacing the old HTML with new HTML makes the page flicker.
3) Each panel can have a dynamic number of rows, and I have to attach some events from the JavaScript side to each row. This is easier if I have the data in JSON format. But if I switch to HTML snippets, how do I do this?
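To make the setup concrete, here is a rough sketch of what I have in mind (the /panels endpoint, the checksum field name, and the panel IDs are just placeholders; jQuery is assumed):

// Minimal sketch: one request returns a checksum plus two server-rendered snippets,
// e.g. { checksum: "...", panel1: "<tr>...</tr>", panel2: "<div>...</div>" }.
var lastChecksum = null;

function refreshPanels() {
  $.getJSON('/panels', function (res) {
    if (res.checksum === lastChecksum) {
      return; // nothing changed: skip the DOM work (and the flicker)
    }
    lastChecksum = res.checksum;
    $('#panel1').html(res.panel1); // snippet for panel 1
    $('#panel2').html(res.panel2); // snippet for panel 2
  });
}

// Attach row events once via delegation, so rows inserted later still work.
$('#panel1').on('click', 'tr', function () {
  // row-specific behaviour here
});

setInterval(refreshPanels, 30000); // poll every 30 seconds
refreshPanels();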
Taconite is something you should look into. It updates multiple HTML blocks with a single request.
http://malsup.com/jquery/taconite/
Let's say we have a group of HTML elements, like this JS fiddle, with certain behaviors. Having one of these is easy on an HTML page.
But what is the best practice for having 1000 of them (the same JS fiddle) on a page? This is not only about dynamically generating the required HTML elements but, more importantly, about generating the behavior/JS code for each individual element (with different values and IDs).
An Example:
document.getElementById("ID1").onchange = function () {
    // some complicated behaviors
};
Generate ID2/ID3/ID4...ID1000 versions of the above code at run time when needed. Note that it must run in a plain HTML environment with no server. Select options/data are stored in the same HTML file (as an array or something similar).
Any advice or suggestions are appreciated.
Ideally, rather than loading up all the possible options in the DOM, you would use AJAX to fetch the next set of options from the server. When you select the state, the selected state is sent back to the server, which returns the counties for that state; when you select the county, that is sent back so the server can return the cities for that county.
You can use the server side language of your choice of course (PHP, ASP.net, Ruby, Python, etc.)
Otherwise, you're injecting a ton of data into your DOM that will never get used, which would vastly increase load times, whereas AJAX is generally pretty fast.
There are lots of tutorials covering AJAX, so I'm not going to repeat them in this answer, but that is your best solution.
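To illustrate, here's a rough sketch of that flow with jQuery (the /counties and /cities endpoints and the element IDs are made up for this example):

// Sketch assuming jQuery and hypothetical endpoints that return JSON arrays,
// e.g. GET /counties?state=WA -> ["King", "Pierce", ...]
function fillSelect($select, values) {
  $select.empty();
  $.each(values, function (i, value) {
    $select.append($('<option>').val(value).text(value));
  });
}

$('#state').on('change', function () {
  $.getJSON('/counties', { state: this.value }, function (counties) {
    fillSelect($('#county'), counties);
  });
});

$('#county').on('change', function () {
  $.getJSON('/cities', { county: this.value }, function (cities) {
    fillSelect($('#city'), cities);
  });
});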
You tagged the question only with JavaScript, so I assume that you have the data locally and don't retrieve it from a server. If you do take the data from a server, use AJAX and fetch only the data that you need at that moment.
With plain JavaScript you can use
document.createElement('element'); // this is faster than jQuery if you generate a lot of elements
and with jQuery you can use
$(document.createElement('element'));
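As a rough sketch of generating many elements this way (the container ID, the options array, and the count are placeholders for this example):

// Sketch: generate N selects with plain DOM APIs and give each its own handler.
var options = ['a', 'b', 'c'];
var container = document.getElementById('container');

for (var i = 1; i <= 1000; i++) {
  var select = document.createElement('select');
  select.id = 'ID' + i;

  options.forEach(function (opt) {
    var option = document.createElement('option');
    option.value = opt;
    option.textContent = opt;
    select.appendChild(option);
  });

  select.onchange = (function (id) {
    return function () {
      // some complicated behaviour, parameterised by id
      console.log('changed', id, this.value);
    };
  })(i);

  container.appendChild(select);
}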
I would like to know what the best method is to retrieve and use a large amount of dynamic data.
For example:
I have a big website with a lot of fields that dynamically create popups. The popups are created with a JavaScript template engine, which needs JSON-encoded data.
Now what I can do:
Every time I request a popup, the client fetches the JSON data via AJAX
I can create a JavaScript variable via PHP that stores the data for all possible popups in the HTML code
Or I can fetch the data via AJAX and cache it in a JavaScript variable
So which one of these is the best one?
What are the disadvantages of them?
Or how would you attach/load the data for these popups?
By the way, does anybody know why all the Facebook popups are so smooth? They seem to be created asynchronously, but they are so fast that it's as if they were already embedded.
Pre-emptive caching.
Basically your 'pop-ups' (god knows why you have so many - there must be a better way :-D hehe) will have a pattern or logical order or whatever.
Using a combination of:
Load the data for the main / most-likely-to-be-used-first popups and store it in a variable.
I would highly recommend doing this with JSON or similar and storing data for 10-20 popups together. The downside is performance: you have to parse the whole file for one popup (though with modern browsers / PCs that's not much of an issue). The plus side is fewer HTTP requests, which are the killer of site speed.
You COULD start loading data for a button etc. on hover (as well as click); milliseconds make prizes, you know!
Finally, just AJAX the data in and keep it small: the more you can strip out of the AJAX call and pre-load (image sprites on page load, etc.), the faster your site will respond.
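As a rough sketch of that combination (jQuery assumed; the /popup-data endpoint and the data-popup attribute are invented for the example):

// Simple in-memory cache of popup data, keyed by popup id.
var popupCache = {};

function getPopupData(id, callback) {
  if (popupCache[id]) {
    callback(popupCache[id]); // already fetched: no HTTP request
    return;
  }
  $.getJSON('/popup-data', { id: id }, function (data) {
    popupCache[id] = data;
    callback(data);
  });
}

// Prefetch on hover so the data is usually there before the click.
$(document).on('mouseenter', '[data-popup]', function () {
  getPopupData($(this).data('popup'), function () {});
});

$(document).on('click', '[data-popup]', function () {
  getPopupData($(this).data('popup'), function (data) {
    // render the popup from the cached JSON here
  });
});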
However without knowing:
how often the data will update
what sort of data you are sending (is it all graphs, all text etc.)
how many of these pop-ups you have
how often a new pop-up will be loaded
what device(s) your users will be using
etc.
I can only give wild stabs in the dark!
I am working on the same thing now and found a good introductory blog post: http://blog.mariusschulz.com/2014/02/05/passing-net-server-side-data-to-javascript. I hope it gives you some suggestions.
The title phrases it badly, so here's a longer description:
I have an application that exports data in HTML format (500 rows, 20 columns).
It looks terrible with lots of useless columns.
I want to use something like DataTables to make a more usable table, i.e. paging/sorting/filtering/hiding columns.
The option I'm trying first is to insert the table from the exported HTML file using jQuery's .load() function. Then I loop through the table deleting/modifying columns.
This seems very slow (I suspect my looping and searching) so I'm looking for improvements.
One idea is to pre-convert my exported HTML file to JSON (using Notepad++ macros or something like that) and then build the table I want from that JSON file.
Any opinions on whether I can expect a large performance boost, or potential problems to look out for?
Many thanks / Colm
JSON should be faster: when it's loaded it's ready to go, without all of the text parsing you would need to do with a text file. There are also lots of jQuery add-ons available to make things easy for you once the data is in JSON.
I think this is not about which loads data faster but about which solution is better for your problem. DataTables is very flexible and can load from different sources. Take a look at "Data Sources" and "Server-side processing" in the examples: http://datatables.net/examples/
DataTables mostly uses the JSON format. To process your data you need to find the best approach: convert your exported HTML file beforehand, process the file with JavaScript to convert the data (jQuery can help you here), etc.
This page gives some real-world examples of loading data as JSON vs. data in an HTML table. It's fairly conclusive; see the post from sd_zuo in July 2010: a fourfold increase in speed loading from JSON and then building just the table that you want to display.
Granted, the page deals specifically with the slowness of innerHTML in IE8, but I think I'll give it a go with JSON and see how it compares across a couple of browsers.
P.S. This page gives good advice on fast creation of HTML using raw JavaScript, and then only using jQuery to insert one full row at a time.
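For what it's worth, that advice boils down to something like this (the JSON shape, column names, and table ID are made up for the example):

// Sketch: build each row as a plain string, then hand the markup to jQuery once.
var rows = [
  { name: 'Alpha', value: 1 },
  { name: 'Beta', value: 2 }
];

var html = [];
for (var i = 0; i < rows.length; i++) {
  html.push('<tr><td>' + rows[i].name + '</td><td>' + rows[i].value + '</td></tr>');
}

// One DOM insertion for the whole batch instead of one per cell.
$('#myTable tbody').append(html.join(''));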
What is your opinion (pros and cons) about returning HTML code as the result of an AJAX call? That is, if the app creates a new item in a list and it needs some extra parameters or some pattern customization, instead of modifying it through JS, we can send it already templated through an AJAX call.
The point is that HTML snippets are sent from the server to the client and integrated into the document's DOM. Any problem with this approach?
No problem with it at all, perfectly normal and reasonable thing to do.
There is sometimes a use-case for sending data rather than markup and expanding it with client-side templating, but that's mostly for situations where you're sending a lot of data and so want to keep the size on the wire down. (E.g., a large table where the HTML representation of it is 100k but the raw data in, say, JSON format would only be 10k.) Or when the templating varies depending on client-side conditions. But by and large, perfectly fine to send HTML you then incorporate into the DOM via innerHTML (or any of several libraries' wrappers for it that help you with the odd niggle).
This is a common approach.
If you're adding items to a list or replacing the contents of a pod with something completely different this is fine.
This also makes it easier to apply AJAX to existing sites (for example overlays or something) because you can make requests to existing pages and then strip out the bits you don't want.
However, for updates where only a value is changing, you should perhaps use JSON instead.
Personally, I almost always choose to receive a JSON response with no markup or formatting applied, but that's just because I like having a really flexible, granular response so I can do whatever I want with the returned data, without having to possibly strip it out of HTML. This is NOT necessarily the easiest or most elegant solution in a lot of cases! :)
What are some of the better solutions for handling large datasets (100K rows) on the client with JavaScript? In particular, if you have multi-column sort and search capabilities, how do you handle fetching (and pre-fetching) the data, client-side model binding (for display), and caching the data?
I would imagine a good solution would do some thoughtful work in the background. For instance, initially, if the table was displaying N items, it might fetch 2N items, return the data for the user, and then go fetch the next 2N items in the background (even if the user hasn't requested them). As the user makes search/sort changes, it would throw out the prefetched data (or maybe cache the initial base case) and repeat the same approach.
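To illustrate, a bare-bones sketch of that idea (the /rows endpoint, the page size, and the render stub are placeholders; jQuery assumed):

// Sketch of the "fetch 2N, show N, prefetch the next 2N in the background" idea.
var PAGE = 50;
var cache = [];      // rows fetched so far
var nextOffset = 0;  // where the next background fetch starts

function render(rows) {
  // stand-in for the real display logic
  console.log('displaying', rows.length, 'rows');
}

function fetchRows(offset, count, done) {
  $.getJSON('/rows', { offset: offset, count: count }, done);
}

function prefetchNext() {
  fetchRows(nextOffset, PAGE * 2, function (rows) {
    cache = cache.concat(rows);
    nextOffset += rows.length;
  });
}

function loadFirstPage() {
  fetchRows(0, PAGE * 2, function (rows) {
    cache = rows;
    nextOffset = rows.length;
    render(cache.slice(0, PAGE)); // show the first N immediately
    prefetchNext();               // quietly fetch ahead
  });
}

loadFirstPage();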
Can you share the best solutions you have seen?
Thanks
Use a jQuery table plugin like DataTables: http://datatables.net/
It supports server-side processing for sorting, filtering, and paging. And it includes pipelining support to prefetch the next x pages of records: http://www.datatables.net/examples/server_side/pipeline.html
Actually the DataTables plugin works 4 different ways:
1. With an HTML table, so you could send down a bunch of HTML and then have all the sorting, filtering, and paging work client-side.
2. With a JavaScript array, so you could send down a 2D array and let it create the table from there.
3. Ajax source - which is not really applicable to you.
4. Server-side, where you send data in JSON format to an empty table and let DataTables take it from there.
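For reference, a minimal setup for option 4 looks roughly like this (DataTables 1.10+ option names; the URL is a placeholder, and the endpoint has to implement the DataTables server-side protocol):

// Sketch: server-side processing against an empty <table id="grid">.
$('#grid').DataTable({
  serverSide: true,   // sorting, filtering and paging happen on the server
  processing: true,   // show the "Processing..." indicator during requests
  ajax: '/api/records',
  columns: [
    { data: 'name' },
    { data: 'date' },
    { data: 'amount' }
  ]
});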
SlickGrid does exactly what you're looking for. (Demo)
Using the AJAX data store, SlickGrid can handle millions of rows without flinching.
Since you tagged this with Ext JS, I'll point you to Ext.ux.LiveGrid if you haven't already seen it. The source is available, so you might have a look and see how they've addressed this issue. This is a popular and widely-used extension in the Ext JS world.
With that said, I personally think (virtually) loading that much data is useless as a user experience. Manually pulling a scrollbar around (jumping hundreds of records per pixel) is a far inferior experience to simply typing what you want. I'd much prefer some robust filtering/searching instead of presenting that much data to the user.
What if you went to Google and instead of a search box, it just loaded the entire internet into one long virtual list that you had to scroll through to find your site... :)
It depends on how the data will be used.
For a large dataset, where the browser's Find function was adequate, just returning a straight HTML table was effective. It takes a while to load, but the display is responsive on older, slower clients, and you never have to worry about it breaking.
When the client did the sorting and searching, and the entire table wasn't shown at once, I had the server send tab-delimited tables through XMLHttpRequest, parsed them in the browser with something like list = responseText.split('\n'), and updated the display with repeated calls to $('node').innerHTML = 'blah'. The JS engine can store long strings pretty efficiently, and that ran a lot faster on the client than showing, hiding, and rearranging DOM nodes; creating and destroying DOM nodes on the fly turned out to be really slow. Splitting each line into fields on demand seems to work; I haven't experimented with that degree of freedom.
I've never tried the obvious pre-fetch & background trick, because these other methods worked well enough.
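Sketched out, the tab-delimited approach looks something like this (plain XMLHttpRequest; the URL and element IDs are placeholders, and getElementById stands in for a library's $):

// Sketch: fetch a tab-delimited table and update cells via innerHTML,
// keeping the data as strings instead of building DOM nodes per cell.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/data.tsv');
xhr.onload = function () {
  var rows = xhr.responseText.split('\n'); // one long string per row
  for (var i = 0; i < rows.length; i++) {
    var fields = rows[i].split('\t');      // split a row only when it is displayed
    var cell = document.getElementById('row' + i);
    if (cell) {
      cell.innerHTML = fields.join(' | ');
    }
  }
};
xhr.send();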
Check out this comprehensive list of data grids and spreadsheets.
For filtering/sorting/pagination purposes you may be interested in the excellent Handsontable, or DataTables as a free alternative.
If you simply need to display a huge list without any additional features, Clusterize.js should be sufficient.