Maximum json size for response to the browser - javascript

I am creating a tree with a custom control built with JavaScript/jQuery.
To build the tree, we supply a JSON object as input to the JavaScript, which iterates through it and creates the tree.
The volume of data may go up to 25K nodes, and during a basic load test we found that the browser crashes at that volume.
The alternative is to load only the first level of nodes and load the rest on demand via AJAX requests; the first level can vary from 500 to 1K nodes.
What is the maximum size a JSON response from the server should have? What is the best approach to process such a volume of data in the browser?

There is no inherent maximum size for an HTTP response, though the browser or server may be configured with limits.
The best approach is to use AJAX to load parts of the data only when they need to be shown.

An HTTP response has no size limit, and JSON arrives as an HTTP response, so it has no size limit either.
There may be a problem if the object parsed from the JSON response consumes too much memory; that will crash the browser. So it is best to test with different data sizes and check that your app still works correctly.
I think lazy loading is the best approach for such large amounts of data, especially when dealing with object literals.
See the High Performance Ajax Applications presentation from Yahoo.
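The lazy-loading idea above can be sketched in plain JavaScript: keep only the first level in memory and merge children into the tree when a node is expanded. The helper names and the `/tree/children` endpoint are illustrative, not from the question.

```javascript
// Attach lazily fetched children to their parent node and mark it
// loaded so a second expand does not refetch.
function insertChildren(node, children) {
  node.children = children;
  node.loaded = true;
  return node;
}

// Depth-first lookup of a node by id in the partially loaded tree.
function findNode(root, id) {
  if (root.id === id) return root;
  for (const child of root.children || []) {
    const hit = findNode(child, id);
    if (hit) return hit;
  }
  return null;
}

// Hypothetical jQuery wiring on node expand:
// $.getJSON('/tree/children?parent=' + id)
//   .done(children => insertChildren(findNode(tree, id), children));
```

Because only expanded branches are ever materialized, the browser never holds all 25K nodes at once.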

Well, I think I am too late to give my two cents. Complementing shiplu.mokadd.im's answer: browser memory is a limitation, and an HTTP response can carry any amount of data according to the TCP spec.
But I have an application using Google Chrome (version 29.0.xx) and a Jetty server where the response from Jetty has a 335 MB payload. While receiving a response of that sheer size, Chrome stops with the message "IPC message is too big". Though this is specific to Google Chrome (I'm not sure about other browsers), there appears to be a threshold on the maximum response size.

There is no maximum size limit, but the practical limit depends on the client system's (the system the browser runs on) RAM, CPU, and network bandwidth for parsing large JSON data. On a low-end system with large JSON data, the browser hangs.

Related

AngularJS - Like "real time" http requests

I am using AngularJS 1.3, and I have a backend that supports only HTTP requests (no WebSockets).
What is the best option for the most "real time" data updates?
Right now I am using $interval and sending an HTTP request every second, but I am not satisfied; I keep thinking there may be a better option.
Thank you!
Based on your description, there's not really an alternative, but you may be able to optimize the behavior based on the characteristics of your data and/or the user interface.
To minimize the resource consumption, for example, pause any requests when the relevant aspects of the interface aren't visible (e.g. in a different "page" or even if the user has switched to a different browser tab).
If the data is high volume but doesn't change frequently, you could set up your server to return a 304 Not Modified until the data actually changes.
You could also send only diffs instead of the full data set if that results in a significant bandwidth savings.
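The diff suggestion above can be sketched as a small helper pair: the server (or a shared module) computes only the keys that changed since the last snapshot, and the client applies the diff. The key names are illustrative.

```javascript
// Return only the entries of `next` that differ from `prev`.
// JSON.stringify is used as a cheap deep-equality check; fine for
// small plain-data objects, not for functions or cyclic structures.
function computeDiff(prev, next) {
  const diff = {};
  for (const key of Object.keys(next)) {
    if (JSON.stringify(prev[key]) !== JSON.stringify(next[key])) {
      diff[key] = next[key]; // changed or newly added entry
    }
  }
  return diff;
}

// Merge a received diff into the client's last snapshot.
function applyDiff(snapshot, diff) {
  return Object.assign({}, snapshot, diff);
}
```

When the data changes slowly, each poll response shrinks to just the modified fields, which pairs well with the 304 Not Modified approach for the no-change case.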

Fast database for JavaScript data visualization vs minifying CSV?

I have a 10 MB CSV file that is the fundamental data source for an interactive JavaScript visualization.
In the GUI, the user will typically make a selection of Geography, Gender, Year and Indicator.
The response is an array of 30 4-digit numbers.
I want the user experience to be as snappy as possible, and I am considering either delivering the full CSV file (compressed by various means ...) or providing a backend service whose responses are almost as fast as locally hosted data.
What are my options and what steps can I take to deliver the query response with maximum speed?
To deliver the full file, maybe a combination of string compression algorithms such as http://code.google.com/p/jslzjb/ combined with HTML5 Web Storage would work.
However, if it's not really necessary to have the full database on the user's client (which might lead to further problems regarding DB updates, security, etc.), I would use a backend service with query caching, etc.
I wouldn't transfer it to the client; who knows how fast their connection is? Maximum speed would come from creating an API and querying it from your client. That way only a request for data is transferred from the client (small in size) and a response is returned to the client (only 30 four-digit numbers).
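Since each response is just 30 four-digit numbers, even a plain comma-separated string stays around 150 bytes, so no clever encoding is strictly needed. A minimal sketch of such a wire format (the endpoint name and query parameters are assumptions, not from the question):

```javascript
// Serialize a series of small integers as a comma-separated string
// and parse it back on the client.
function encodeSeries(values) {
  return values.join(',');
}

function decodeSeries(body) {
  return body.split(',').map(Number);
}

// Hypothetical client call:
// fetch('/api/series?geo=SE&gender=F&year=2010&indicator=GDP')
//   .then(r => r.text())
//   .then(body => render(decodeSeries(body)));
```

With responses this small, latency (one round trip per selection) dominates, so a nearby server or a CDN-cached API matters more than the payload format.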

High traffic solution for simple data graph website

I'm building a single page website that will display dynamic data (updating once a second) via a graph to its users. I'm expecting this page to receive a large amount of traffic.
My data is stored in Redis, and I'm displaying the graph using Highcharts. I'm using Ruby/Sinatra as my application layer.
My question is how best should I architect the link between the data store and the JavaScript graph solution?
I've considered connecting directly to Redis, but that seems the least efficient. I'm wondering whether an XML solution, where Ruby builds an XML file every second and Highcharts pulls data from there, would be best, since then the load is only on hitting that XML file.
But I wanted to see whether anyone on here might have solved this previously or had any better ideas?
If the data is not user-specific, you should cache it in a representation that is easily read by the client; for web browsers, JSON may be the better choice.
You can cache it using Redis itself (Memcached and Varnish are other options). You should cache it every time the data arrives and avoid transforming the data on each request. Requests should simply serve pre-computed information from the cache (as you would with static content).
For a better experience on the client side, you should minimize the amount of data you are downloading from the server. JSON serves this purpose better than XML.
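The "serve pre-computed data" idea above can be sketched as a write-once-read-many cache: the ingest path serializes the snapshot when the data arrives, and every request just returns the cached string. A real deployment would use Redis as the answer suggests; a Map stands in here, and the key name is illustrative.

```javascript
// In-memory stand-in for the Redis cache.
const cache = new Map();

// Called once per second when new data arrives: serialize once,
// not on every request.
function updateSnapshot(key, data) {
  cache.set(key, JSON.stringify(data));
}

// Called per request: return the pre-computed JSON string verbatim.
function serveSnapshot(key) {
  return cache.get(key) || '[]';
}
```

Because the request path does no serialization or database work, it scales roughly like serving a static file, which is what high-traffic polling needs.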

Size limit for the return value of a jQuery or JavaScript ajax get method?

When I use an ajax GET method, is there a size limit for the return value?
If so, is this limit due to JavaScript/jQuery? To the browser? to the server?
I have seen older articles (5+ years old) that seem to indicate that there was a size limit, but I couldn't find any recent information on this topic.
Yes, there is a limit. It depends on the architecture and available memory. On an x86_64 machine with 2 GB of RAM and a 64-bit build, it will fail with InternalError: script stack space quota is exhausted, while on an x86_64 machine with 4 GB and a 64-bit build it will pass.
So it ultimately depends on the machine that is running and hosting the script whether it can efficiently serve the request.
If you receive an InternalError: script stack space quota is exhausted message, it just means you need upgraded hardware to run the request(s).
One way to control this is called "throttling". See another Stack Overflow question for reference:
How to rate-limit ajax requests?
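A minimal sketch of client-side throttling, independent of the linked question: a rate limiter that allows at most one request per interval. The clock is injectable so the logic is easy to test; all names are illustrative.

```javascript
// Returns a function that answers "may I fire a request now?".
// `minIntervalMs` is the minimum gap between allowed requests;
// `now` defaults to the wall clock but can be replaced in tests.
function makeRateLimiter(minIntervalMs, now = Date.now) {
  let last = -Infinity;
  return function allowed() {
    const t = now();
    if (t - last >= minIntervalMs) {
      last = t;
      return true;  // caller may fire the AJAX request
    }
    return false;   // too soon: skip or queue the request
  };
}

// Hypothetical usage guarding a jQuery call:
// const allow = makeRateLimiter(1000);
// if (allow()) $.get('/api/data', render);
```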
The HTTP protocol does not limit the size of any response, and neither does jQuery itself. So theoretically speaking, you can make your GET response as big as you want.
In practice, however, there are some constraints:
- the web server may enforce a maximum response size (especially when you're caching entire responses);
- server code may need intermediate processing steps that require additional storage;
- the client needs to store the response data somewhere before processing it;
- the browser's JavaScript implementation may impose extra memory constraints (you don't want a rogue script eating up all your physical RAM: that would be a severe security problem).
How much exactly is too much is hard to pinpoint, though: a few megabytes are probably OK; a terabyte certainly isn't.
Note that the request method is pretty uninteresting in this context; the above issues are just as valid for responses to POST requests, or any other HTTP verb.

Checking for updates using AJAX - bandwidth optimization possibilities?

I'm creating a simple online multiplayer game in which two players (clients) play against each other. Data is sent to and fetched from a server, which manages everything related to this.
The problem I'm facing is how to fetch updates from the server efficiently. Data is fetched using AJAX: every 5 seconds the client checks the server for updates. This is done over HTTP, which means all the headers are sent every time as well, while the data itself is kept to an absolute minimum.
I was wondering if anyone has tips on how to save bandwidth in this server/client scenario. Would it be possible to fetch using a custom protocol or something similar, to avoid all the headers (like 'Server: Apache') being sent every single time? I basically only need the data itself (only 9 bytes), not all the headers (which are around 100 bytes, if not more).
Thanks.
Comet or Websockets
HTML5's WebSockets (as mentioned in other answers here) may have limited browser support at the moment, but using long-lived HTTP connections to push data (a.k.a. Comet) gives you similar "streaming" functionality in a way that even IE6 can cope with. Comet is rather tricky to implement, though, as it is something of a hack that takes advantage of how browsers happened to be implemented at the time.
Also note that both techniques will require your server to handle a lot more simultaneous connections than it's used to, which can be a problem even if they're idle most of the time. This is sometimes referred to as the C10K problem.
This article has some discussion of websockets vs comet.
Reducing header size
You may have some success reducing the HTTP headers to the minimum required to save bytes. You will need to keep Date, as it is not optional according to the spec (RFC 2616). You will probably also need Content-Length to tell the browser the size of the body; you might be able to drop it and close the connection after sending the body bytes, but that would prevent the browser from taking advantage of HTTP/1.1 persistent connections.
Note that the Server header is not required, but Apache doesn't let you remove it completely: the ServerTokens directive controls it, and the shortest setting still results in Server: Apache, as you already have. I don't think other web servers usually let you drop the Server header either, but if you're on a shared host you're probably stuck with Apache as configured by your provider.
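Under those constraints, a stripped-down response might look roughly like this (the Date value and Content-Type are illustrative; Date and Content-Length are kept for the reasons above):

```
HTTP/1.1 200 OK
Date: Mon, 01 Aug 2011 12:00:00 GMT
Content-Type: application/octet-stream
Content-Length: 9

<9 payload bytes>
```

Even trimmed this far, the headers still outweigh a 9-byte payload several times over, which is why the push-based techniques above tend to win for this use case.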
HTML5 WebSockets will be the way to do this in the near future.
http://net.tutsplus.com/tutorials/javascript-ajax/start-using-html5-websockets-today/
This isn't possible in all browsers, but it is supported in newer ones (Chrome, Safari). You should use a framework that uses WebSockets and gracefully degrades to long polling (you don't want to poll at fixed intervals unless there are always events waiting). That way you get the benefit of the newer browsers, and that pool will continue to expand as people upgrade.
For Java, the common solution is Atmosphere: http://atmosphere.java.net. It has a jQuery plugin as well as an abstraction at the servlet container level.
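The "WebSockets with long-poll fallback" selection can be sketched as a simple feature check. The global object is a parameter so the check is testable outside a browser; real frameworks (such as Atmosphere or socket.io) do this negotiation for you, and the names here are illustrative.

```javascript
// Pick a transport based on what the environment supports.
function chooseTransport(global) {
  if (typeof global.WebSocket === 'function') return 'websocket';
  return 'longpoll'; // older browsers: Comet-style fallback
}

// Hypothetical usage in a browser:
// if (chooseTransport(window) === 'websocket') {
//   const ws = new WebSocket('ws://example.com/game');
//   ws.onmessage = e => handleUpdate(e.data);
// } else {
//   startLongPolling('/game/updates');
// }
```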