My ASP.NET 4.5 .asmx web service (part of a Web Forms project, not a separate one) returns an XML string whenever the user double-clicks a table row, so the details for that row can be retrieved from the database.
This, in and of itself, works fine.
However, in situations where the response is somewhat longer (around a 200k-300k character count, from what I can see), the web page throws a JS alert popup (saying "Error") and I see a 500 Internal Server Error in the Chrome console, with no further details. Yet when I go to the web method URL and enter the arguments manually, I get an immediate and complete XML string in response without issues.
I'm using the following (synchronous) JS/AJAX to call the web method:
function GetDetailsFromTableRow(userID, country, rowName) {
    var url = "http://MyServer/MyWebService.asmx/MyWebMethod",
        result = '';
    $.ajax({
        type: "POST",
        async: false,
        url: url,
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        data: JSON.stringify({
            'userID': userID,
            'country': country,
            'rowName': rowName
        }),
        success: function (response) {
            result = response.d;
        },
        error: function (x, t, m) {
            if (t === "timeout") {
                alert('timeout!');
            }
            else {
                alert(t); // <- this gets thrown, apparently
            }
        }
    });
    return result;
}
From this JS you can see that the function is, in fact, NOT hitting a timeout, so I'm fairly certain I can eliminate that.
Furthermore, I would assume that I needn't chunk the response myself for data that is merely a few hundred kilobytes, but still: I might be wrong.
I'm also quite aware that the better way to call web methods is asynchronously, but I'm already succeeding in retrieving other (substantial) data, both async and sync. What I'm looking for is an answer to why this fails for responses over a certain size: I have responses of about 80 kB that work and responses of 200+ kB that don't, so I'm obviously not understanding some part of the whole story.
For completeness: an async: true call to the web service causes the exact same behavior.
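For reference, here's roughly what the async variant I tried looks like (the function name and the onResult callback are just illustrative; the endpoint and parameters are the same as above):

// Async variant: hand the result to a callback instead of returning it.
function GetDetailsFromTableRowAsync(userID, country, rowName, onResult) {
    $.ajax({
        type: "POST",
        url: "http://MyServer/MyWebService.asmx/MyWebMethod",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        data: JSON.stringify({ userID: userID, country: country, rowName: rowName }),
        success: function (response) {
            onResult(response.d); // same payload the sync version returns
        },
        error: function (x, t, m) {
            alert(t); // still fails the same way for large responses
        }
    });
}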
Debugging the JS in IE yielded the following error:
Error during serialization or deserialization using the JSON JavaScriptSerializer. The length of the string exceeds the value set on the maxJsonLength property
As mentioned in this answer (and the corrective comments), adding the following entries to the web.config increased the maxJsonLength value to just under int.MaxValue:
<system.web.extensions>
  <scripting>
    <webServices>
      <jsonSerialization maxJsonLength="2147483644" />
    </webServices>
  </scripting>
</system.web.extensions>
Subsequent tests were all successful. Issue resolved.
Related
In my ASP.NET site, I use jQuery AJAX to load JSON data (in fact a string) from a web service when the search button is clicked:
$.ajax({
    type: "POST",
    url: url,
    data: JSON.stringify(parameterArray),
    contentType: "application/json",
    dataType: "json",
    success: function (response) {
        d = JSON.parse(response.d);
    }
});
When the returned string gets too big, the page stops responding. I have to go to web.config and add this property in order for the website to work:
<jsonSerialization maxJsonLength="2147483644"/>
Here is how the application handles the search result before returning data to the browser:
JavaScriptSerializer serializer = new JavaScriptSerializer();
serializer.MaxJsonLength = Int32.MaxValue; // lift the serializer's default length limit
string strData = dService.searhforData(ipt);
List<Dictionary<string, object>> lRow = processData(strData);
string r = serializer.Serialize(lRow); // serialize the result rows to a JSON string
return r;
When the JSON string gets too long, the page just stops responding and there isn't any error in the console window. Debugging on the .NET side, serializer.Serialize(lRow) goes through smoothly and successfully returns r; after that, the loading icon on the page just keeps spinning. If I press F5 on the page, the search data appears.
My question is: if jQuery's AJAX refers to the web.config for the max JSON string length, why couldn't I find any information regarding this on the internet?
I'm sure there is a limit on the amount of data that JSON.parse() can handle, but that's not related to your problem.
Your web.config file holds settings used server side; the JS on the client is not related to it at all. If you needed to amend that setting, it's because your ASP.NET code was producing a JSON response longer than the default jsonSerialization setting previously allowed.
If you had checked the browser console after making the failed request, you would most likely have seen an error in the response guiding you to the problem.
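For example, a minimal tweak to the question's call (the handler arguments are jQuery's standard error callback parameters) would have surfaced the server's message:

$.ajax({
    type: "POST",
    url: url,
    data: JSON.stringify(parameterArray),
    contentType: "application/json",
    dataType: "json",
    success: function (response) {
        d = JSON.parse(response.d);
    },
    error: function (xhr, textStatus, errorThrown) {
        // The server's error detail (e.g. the maxJsonLength message)
        // usually arrives in the body of the failed response.
        console.log(xhr.status, textStatus, errorThrown);
        console.log(xhr.responseText);
    }
});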
I am using the Google Maps JavaScript API, and when the user changes the directions/route on the map (draggable is set to true) I want to send the new route/DirectionsResult to my web service backend. The issue I am facing is that when I serialize the DirectionsResult using JSON.stringify, I don't seem to be getting the full list of objects correctly converted to strings.
directionsDisplay.addListener('directions_changed', function () {
    sendToBackendService(directionsDisplay.getDirections());
});

function sendToBackendService(result) {
    var jsonToSend = JSON.stringify(result);
    $.ajax({
        type: 'POST',
        url: './api/DirectionsUserModified',
        data: jsonToSend,
        error: processCallbackError,
        success: function (apiJson) {
            alert("post of new directions success");
        }
    });
}
The issue will always be related to the local environment in which your JavaScript code executes, i.e. the version of the web browser you are using.
Maybe the object can't be serialized properly because a size limit is reached. The browser uses local storage as a buffer during the JSON serialization process; if the limit is reached, the string is simply truncated or an error is thrown.
You could have a look at this other post, maybe it'll help.
Is it bad practice and/or inefficient to use multiple $.ajax calls in one JavaScript function? I've been working on one and have a simple testing environment set up on my computer (an Apache server with PHP/MySQL), and I've noticed that the server will crash (and restart) if I have multiple AJAX calls.
I currently have two AJAX calls: one passes 4 pieces of data to the PHP file and returns about 3 lines of code (pulling info from my SQL database), the other simply gets the total number of rows from the table I'm working with and assigns that number to a JavaScript variable.
Is it just that my basic testing setup is too weak, or am I doing something wrong? See below for the two AJAX calls I'm using:
$.ajax({
    type: "GET",
    url: "myURLhere.php",
    cache: false,
    data: { img: imageNumber, cap: imageNumber, width: dWidth, height: dHeight }
})
.done(function (htmlpic) {
    $("#leftOne").html(htmlpic);
});

$.ajax({
    type: "GET",
    url: "myotherURLhere.php",
    cache: false,
    success: function (data) {
        lastImage = data;
    }
});
Short answer: two AJAX requests on a page is absolutely fine.
Longer answer:
You have to find the balance between minimizing the number of AJAX calls to the backend (to reduce traffic and communication overhead) and still maintaining a maintainable architecture. So don't pass dozens of parameters in one call to retrieve everything, unless you do it in a well-designed way that deliberately collects every parameter to send.
Also, most likely there's something wrong with your backend setup; try looking into the web server logs.
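If you do want to keep the two endpoints separate, here's a rough sketch of coordinating them client-side with $.when, reusing the URLs and variables from your question:

// Fire both GETs in parallel and act once both have completed.
$.when(
    $.ajax({
        type: "GET",
        url: "myURLhere.php",
        cache: false,
        data: { img: imageNumber, cap: imageNumber, width: dWidth, height: dHeight }
    }),
    $.ajax({ type: "GET", url: "myotherURLhere.php", cache: false })
).done(function (picResult, countResult) {
    // With multiple requests, each argument is an array: [data, statusText, jqXHR].
    $("#leftOne").html(picResult[0]);
    lastImage = countResult[0];
});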
I have the following code:
var statusCheckUrl = "https://www.mydomain.com/webchat/live?action=avail";

$.ajax({
    crossDomain: true,
    dataType: "script",
    url: statusCheckUrl,
    success: function (result) {
        console.log("result is: " + result);
        eval(result);
    },
    error: function (jqXHR, textStatus, msg) {
        unavailable();
    },
    timeout: 2000,
    cache: false
});
If I access the URL https://www.mydomain.com/webchat/live?action=avail in my browser, the response looks like this: var isAvailable = true;
However, my console.log is printing out undefined, which is obviously not working as expected.
I am running this code from localhost, but I thought that crossDomain: true would overcome any cross-domain issues?
How can I resolve this, and why is it returning undefined in my success function?
EDIT: I have tried what the person below suggested with regards to the eval, but it seems that the result value is always undefined, no matter what. Why am I getting undefined as a result of this AJAX call?
The problem is not in the AJAX call, but in the fact that eval runs the downloaded script in the local scope of your success callback. The var keyword in the downloaded script therefore declares a local variable which quickly goes out of scope. Instead you want to set a global variable (remove the var keyword).
See also: Using eval() to set global variables
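A quick sketch of the difference (non-strict mode; the functions are just stand-ins for the success callback):

function runDownloadedScript() {
    eval("var isAvailable = true;"); // `var`: declares a local in this function's scope
    console.log(isAvailable);        // true here...
}
runDownloadedScript();
console.log(typeof isAvailable);     // "undefined": the local vanished with the call

function runDownloadedScriptGlobal() {
    eval("isAvailable = true;");     // no `var`: creates/sets a global instead
}
runDownloadedScriptGlobal();
console.log(isAvailable);            // true: the global survives the call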
Side comment: don't execute code you don't have to, especially dynamically and cross-domain. If all you want to do is get a value (in this case whether someone is available or not), just return the value. If you're not in control of the script, but it always looks the same, you could parse it as a string instead; you may want to write a script which runs at some interval to check and alert you of any changes in their response format, however.
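For instance, a rough sketch of the parse-as-a-string idea, assuming the response always has the exact shape var isAvailable = true; and that the endpoint (or a proxy through your own server, as the next answer suggests) allows fetching it as plain text:

// Fetch the script as plain text and extract the boolean without executing it.
$.ajax({
    url: statusCheckUrl,
    dataType: "text", // don't let jQuery execute the response
    timeout: 2000,
    cache: false,
    success: function (scriptText) {
        var match = /var\s+isAvailable\s*=\s*(true|false)\s*;/.exec(scriptText);
        if (match) {
            var isAvailable = match[1] === "true";
            console.log("available: " + isAvailable);
        } else {
            unavailable(); // response format changed: treat as unavailable
        }
    },
    error: function () {
        unavailable();
    }
});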
Best practice for cross-domain requests is to make the request in your server-side framework (.NET, PHP), parse the info to get what's needed, then send your own response (JSON, text, whatever) back to the page.
As @Ic. said, you shouldn't be executing the code; that's a decent security risk.
I'm loading some JSON data from an AJAX query:
$.ajax({
    url: url,
    type: params.method,
    data: data,
    timeout: this.settings.timeout,
    success: function (d, a, x) {
        console.log('request Complete', params.endpoint, params.params);
        var json = null;
        try {
            json = JSON.parse(d);
        } catch (e) {
            console.error(e);
        }
        console.log('json');
        // omitted for brevity...
    }
});
I'm seeing occasional "Aw, Snap" crashes in Chrome where the last console.log shown is "request Complete" (neither the error nor the second log ever appears).
I suppose it's important to note that the data may be large (sometimes as big as ~15 MB), which is why I'm not printing out d on every request and looking for malformed JSON (yet... I may resort to that). FWIW, I've also tried $.parseJSON instead of JSON.parse.
Research I've done into the "Aw, Snap" error is vague at best. My best guess atm is that this is an OOM (out-of-memory) crash. Unfortunately, there's not much I can do to decrease the footprint of the result set.
Is there any way I could, at the least, gracefully fail?
What happens when you tell jQuery the response data is JSON via the dataType property? Doing so should cause jQuery to pre-parse it and just give you the data. If you're right about what is going on, this might cause a crash too, although jQuery does some sanity checks before parsing.
$.ajax({
    url: url,
    type: params.method,
    data: data,
    dataType: 'json',
    timeout: this.settings.timeout,
    success: function (d, a, x) {
        // `d` should already be parsed into an object or array, and ready to use
    }
});
If that doesn't help, please post your actual JSON response for us to take a look at.
"#the system" nailed it. 15MB of info coming back is too much for your browser to handle. I tripped upon this trying to see why my 64 Byte encoded image string is crashing the chrome tab, and it's only 200-500KB.
To 'gracefully fail', it seems you will need to have server side logic in place to prevent this from happening; perhaps limit the size of the JSON string. Only have X characters of length in the initial JSON string and add a property for 'isFinishedSending: false' or 'leftOff: [index/some marker]' so you can timeout for a bit and hit your server again with the the place you left off.
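A rough client-side sketch of that idea (the endpoint behavior, the payload property, and the isFinishedSending/leftOff markers are hypothetical, matching the suggestion above rather than any existing API):

// Pull the result set in pieces until the server says it's done.
function fetchInChunks(url, onComplete) {
    var pieces = [];

    function fetchFrom(marker) {
        $.ajax({
            url: url,
            type: 'GET',
            dataType: 'json',
            data: { leftOff: marker },      // tell the server where to resume
            success: function (chunk) {
                pieces.push(chunk.payload); // hypothetical slice-of-the-result property
                if (chunk.isFinishedSending) {
                    onComplete(pieces);
                } else {
                    // Brief pause, then request the next slice.
                    setTimeout(function () { fetchFrom(chunk.leftOff); }, 100);
                }
            },
            error: function (xhr, status) {
                console.error('chunk fetch failed:', status);
            }
        });
    }

    fetchFrom(0);
}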