ajax request browser limit - javascript

A more generic question to start: is there a limit to the response size of an ajax request, if it is a JSON request?
I am passing large amounts of data through a JSON request and running into a 'script stack quota is exhausted' message in FF3. In FF2 the quota was 4MB, but in FF3 it is 640KB. I am wondering if this is somehow JSON specific. Do normal ajax requests have a response size limit, perhaps one imposed by the browser? If a non-JSON request doesn't hit the same script stack quota issue, how should I categorize the data coming back? XML perhaps... I'm not sure whether my data would stay within the bounds of the W3C spec if I did.

IIRC this was a bug in FF3 last year, but I believe (yes, checked it here) it's fixed. Looking down the comments, though, there's this note:
Note: this test is dependent upon architecture and available memory. On a x86_64 machine with 2G and a 64bit build, it will fail with "InternalError: script stack space quota is exhausted"; however, on a x86_64 with 4G and a 64bit build it will pass.
The comments also say this is a pure JS problem, which means that although the data format strictly won't matter, very large chunks of JSON might blow the JS stack where XML strings might not. I think you just have to try.
OTOH, it is marked fixed, so there's a question of making sure you're on the latest version of FF too.

I suspect the limits are different for sending vs. receiving data, so I am going to assume this is about sending data to the client. JSON is just a data format, really. What you are really doing, I suspect, is making a GET request for a JavaScript script, which should be limited to a reasonable size. The JSON wiki also says to use the XMLHttpRequest method, which might get around your limit, though you would still need a proxy to avoid cross-domain scripting limitations, and a more sensible MIME type, like html, xml, binary, etc. If you are putting any images in the JSON, remember that they can be links, as there are no cross-domain issues with those requests.
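A rough sketch of the XMLHttpRequest route (the '/data' endpoint is a placeholder, not anything from the question):

    // Minimal sketch: fetch JSON with XMLHttpRequest instead of a <script> tag.
    // '/data' is a placeholder for your own same-origin endpoint; older browsers
    // without native JSON need json2.js for JSON.parse.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/data', true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // The plain-text response is parsed as data; nothing is executed as script.
            var data = JSON.parse(xhr.responseText);
            console.log(data);
        }
    };
    xhr.send();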
Double check as well that it is not the number of requests causing you trouble; browsers have limits there too, sometimes as low as 2 concurrent requests.

I think annakata is right.
The text of the error message also suggests that the problem is due to the depth of your JSON structure, not its size in KB.
What it means is that when you eval your JSON, the JavaScript engine uses a stack while parsing it. That stack hits its maximum limit because of the depth (the number of nested elements) of your JSON structure.
You might want to check whether a somewhat flatter structure is feasible for your requirements.
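A contrived illustration, with structures invented for the example:

    // Contrived example: depth, not byte size, is what strains the stack.
    // A parser or eval descends one level per nested object, so hundreds of
    // levels like this can exhaust the quota even at a tiny byte size:
    var deep = {"a": {"a": {"a": {"a": {} /* ...hundreds more levels... */ }}}};

    // The same information as a flat list of records parses iteratively:
    var flat = [
        {"id": 1, "parent": null},
        {"id": 2, "parent": 1},
        {"id": 3, "parent": 2}
    ];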

As a general rule I try to keep my AJAX data small. If I have to pass a large amount of data, I will retrieve it with multiple calls. So if I am loading a table, I will have one method that tells me how many records are going to be returned, and another method to return the records in groups of # (usually 20 for me).
The good part about doing this is that I can load the page as I retrieve data, and the user is not waiting for one large payload.
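A minimal sketch of that pattern, assuming jQuery and two hypothetical endpoints:

    // Hypothetical endpoints getRowCount.php and getRows.php; assumes jQuery.
    $.getJSON('getRowCount.php', function (count) {
        var pageSize = 20;
        var pages = Math.ceil(count / pageSize);
        for (var page = 0; page < pages; page++) {
            // Responses may arrive out of order; include the offset in the
            // response (or request pages sequentially) if ordering matters.
            $.getJSON('getRows.php',
                      { offset: page * pageSize, limit: pageSize },
                      function (rows) {
                          appendRowsToTable(rows); // your own rendering function
                      });
        }
    });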
Also, it is better to use JSON rather than XML. JSON is usually a smaller payload than XML, and many tests have shown that it is easier for the browser to parse.

I haven't encountered any tangible limit, but for the sake of user interactivity, break the large data up into multiple calls. Large tables take forever to get transferred through ajax, especially if the user is running IE. Large data + Ajax + IE = IE crash.

Related

how to solve data loading issue in Chrome

All:
I wonder how I can make Chrome handle about 4GB of data loaded into it? My use case is:
The front end starts and tries to download a 3GB JSON file and make some calculations. But Chrome always crashes.
Any solution for this? Thanks
When you work with large data, the typical optimization rule is:
Don't read all the data at once; don't save all the data at once.
If your code allows performing the calculations step by step, split your JSON into small parts (for example, 50MB each), as in the sketch below.
Of course it works more slowly, but this approach keeps memory usage bounded.
This optimization rule is useful not only for JS and the browser, but for various languages and platforms.
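A rough sketch of what step-by-step processing could look like, assuming the dataset has been split server-side into numbered parts (data-part-N.json is invented for the example):

    // Sketch only: assumes the 3GB dataset has been split server-side into
    // numbered ~50MB parts (data-part-0.json, data-part-1.json, ...), since
    // no browser will hold the whole thing in memory at once.
    function processParts(part, total) {
        if (part >= total) { return; } // done
        fetch('data-part-' + part + '.json')
            .then(function (res) { return res.json(); })
            .then(function (chunk) {
                runCalculation(chunk);         // your own per-chunk calculation
                chunk = null;                  // let this part be collected
                processParts(part + 1, total); // only then move to the next one
            });
    }
    processParts(0, 60); // e.g. 60 parts of roughly 50MB each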

Parsing "Streaming" JSON

I have a grid in a browser.
I want to send rows of data to the grid via JSON, but the browser should continuously parse the JSON as it receives it and add rows to the grid as they are parsed. Put another way, the rows shouldn't be added to the grid all at once after the entire JSON object is received -- they should be added as they are received.
Is this possible? Particularly using jQuery, Jackson, and Spring 3 MVC?
Does this idea have a name? I only see bits of this idea sparsely documented online.
You can use Oboe.js which was built exactly for this use case.
Oboe.js is an open source Javascript library for loading JSON using streaming, combining the convenience of DOM with the speed and fluidity of SAX.
It can parse any JSON as a stream, is small enough to be a micro-library, doesn’t have dependencies, and doesn’t care which other libraries you need it to speak to.
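A rough usage sketch; the '/grid-rows' URL and the 'rows.*' path are assumptions about the response shape, not part of Oboe itself:

    // Fire a callback per row while the response is still downloading.
    // Assumes the response looks like {"rows": [{...}, {...}, ...]};
    // adjust '/grid-rows' and the 'rows.*' path to your actual JSON.
    oboe('/grid-rows')
        .node('rows.*', function (row) {
            addRowToGrid(row); // your own grid-append function
        })
        .done(function () {
            // the complete document has now arrived and parsed
        });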
You can't parse incomplete or invalid JSON using the browser's JSON.parse. If you are streaming text, it will invariably try to parse invalid JSON at some point, which will cause it to fail. Streaming JSON parsers do exist, so you might be able to find something to suit your needs.
The easiest way in your case would still be to send a complete JSON document for each row.
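For example, one shape for "a complete JSON document per row" is newline-delimited JSON consumed from XHR progress events. This is only a sketch under that assumption:

    // Sketch: the server writes one JSON object per line and flushes as it
    // goes ('/grid-rows.ndjson' is made up). Requires a browser where
    // responseText is readable while the request is in flight (not old IE).
    var xhr = new XMLHttpRequest();
    var seen = 0; // characters already consumed
    xhr.open('GET', '/grid-rows.ndjson', true);
    xhr.onprogress = function () {
        var text = xhr.responseText;      // grows as data arrives
        var end = text.lastIndexOf('\n'); // last complete line so far
        if (end > seen) {
            text.slice(seen, end).split('\n').forEach(function (line) {
                if (line) { addRowToGrid(JSON.parse(line)); }
            });
            seen = end + 1; // skip past the newline
        }
    };
    xhr.send(); // assumes every row, including the last, ends with '\n'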
Lazy.js is able to parse "streaming" JSON (demo).
Check out SignalR.
http://www.hanselman.com/blog/AsynchronousScalableWebApplicationsWithRealtimePersistentLongrunningConnectionsWithSignalR.aspx
March 2017 update:
WebSockets allow you to maintain an open connection to the server that you can use to stream data to the table. You could encode individual rows as JSON objects and send them, and each time one is received you can append it to the table. This is perhaps the optimal way to do it, but it requires WebSockets, which might not be fully supported by your technology stack. If WebSockets are not an option, you can try requesting the data in the smallest chunks the server will allow, but this is costly, since every request has overhead and you would end up making multiple requests to get the data.
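A minimal sketch of that idea; the endpoint URL and addRowToGrid are placeholders:

    // The server pushes one JSON-encoded row per message
    // ('ws://example.com/rows' is a placeholder endpoint).
    var socket = new WebSocket('ws://example.com/rows');
    socket.onmessage = function (event) {
        var row = JSON.parse(event.data); // one complete row per message
        addRowToGrid(row);                // append it to the table as it lands
    };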
AFAIK there is no way to start parsing an HTTP response before it's finished, and there is no way to parse a JSON string partially. Also, getting the data is orders of magnitude slower than processing it, so it's not really worth doing.
If you need to parse large amounts of data your best bet is to do the streaming approach.

Passing array from javascript to ASP.NET

I have a simple 2D array in JavaScript. I want to pass this array to an ASP.NET page.
I wanted to know which would be the better option: JSON or XML.
The metrics are the speed and size of the data. In some cases, the array may be large.
Thank you.
The metrics are the speed and size of the data.
JSON is faster than XML in terms of speed. It's smaller than XML in terms of size.
XML is bloated to allow you to represent and validate structures.
However, there are various BSON formats around where people take JSON and then aggressively hand-optimise the storage format. (BSON is binary JSON.)
Some BSON spec I picked from Google
Bison, a JavaScript parser for some arbitrary BSON format.
Now if you really have bottlenecks with transferring data (which you probably don't), you may want to use WebSockets to send data over TCP rather than HTTP, thus reducing the amount of traffic and data you send.
Of course, you only care about that if you're making, say, X000 requests per second.
JSON should be your best bet. Sending XML might be a big pain, as sometimes you have to add new configs just to support sending the XML datatype as form data to the server. Generally it is not a recommended practice, due to security concerns. A minimal sketch of the JSON route follows.
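In this sketch, 'Handler.aspx' stands in for whatever ASP.NET page or handler receives the data:

    // Serialize the 2D array and POST it as JSON.
    var grid = [[1, 2, 3], [4, 5, 6]];
    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'Handler.aspx', true);
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify(grid));
    // Server side, deserialize the request body into e.g. int[][]
    // with a JSON library such as JavaScriptSerializer.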

Maximum size of ajax returned data

Fellow coders, I have asked this question before but did not get a conclusive answer to it. The question is: how much data can I safely return from an ajax POST call before I run into some limitation somewhere?
The scenario is basically this: the front end makes an ajax call to a PHP controller/model. The controller returns a bunch of rows from the database, or returns some HTML representing a report, which will be stored in a JS string var to be displayed later.
I see two limitations here: the size of the data returned through the ajax call, and the maximum size the JS var can hold.
Does anyone know what the limits are?
thanks
See this answer: Javascript maximum size for types?
In short, unless the browser specifies otherwise, variable sizes are not subject to a restriction. As for ajax: there's no limit unless one is defined server-side (such as this one).
I don't think either factor you listed would be an issue. What I would look at are:
The amount of time the user is willing to wait for the response. Also, your server-side programming language or web server might impose a limit on the length of any one request.
The amount of RAM the client has. Even if there is no variable size limit, eventually the computer will run out of space.
In these situations, you are almost always better off delivering smaller chunks of the data at a time and letting the user load the data they need (either by granularity [showing summaries and letting them drill down] or by pagination/search). No one wants to wait 10 minutes for the site to load, and HTTP doesn't really handle large requests all that well.

Parsing very large JSON strings in IE causing problems

I'm parsing a 2MB JSON string in IE8. The JSON.parse line takes a little while to return, and IE8 shows a message asking the user if they want to abort the script.
Is there any way I can suppress this message? (Or somehow speed up JSON.parse?)
I know about Microsoft KB175500, however this is not suitable as my target users will not have administrator access to make the registry modifications on their SOE machines.
I had this same question. Apparently there is no way to suppress the message, but there are tricks to make IE think it's still working by using an asynchronous iteration pattern (dead link, view comments below).
This comes from an answer to one of my questions:
loop is too slow for IE7/8
If the browser is unhappy with how long the JSON parser is taking, there are only four choices here that I know of:
1. Get a faster JSON parser that doesn't take so long.
2. Break up your JSON data into smaller pieces so you are only parsing smaller pieces at once.
3. Modify a JSON parser to work in chunks so it can parse part of the data in one chunk, then, on a short timeout, parse the next chunk, etc. This will prevent the browser prompt, but it is probably a lot of work to write or modify a JSON parser that works this way. (A simplified version of this idea is sketched below.)
4. If you can be sure the content is safe, you could see if using eval instead of a JSON parser works around the issue.
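As a simplified stand-in for option 3: this sketch doesn't chunk a real JSON parser, but if the payload can be delivered as an array of per-row JSON strings, you can parse a small batch per tick and yield with setTimeout (names like parseInChunks are invented here).

    // Assumes the payload arrives as an array of per-row JSON strings
    // rather than one giant document.
    function parseInChunks(rowStrings, onDone, parsed) {
        parsed = parsed || [];
        var batch = rowStrings.splice(0, 500); // parse 500 rows per tick
        for (var i = 0; i < batch.length; i++) {
            parsed.push(JSON.parse(batch[i]));
        }
        if (rowStrings.length) {
            // Yielding to the event loop resets IE's long-running-script
            // watchdog, so the abort prompt never appears.
            setTimeout(function () {
                parseInChunks(rowStrings, onDone, parsed);
            }, 0);
        } else {
            onDone(parsed);
        }
    }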
