I'm using Backbone.js to build a web site. It runs fine on our development server, but on the client's server it doesn't parse the data in IE7-9 (Firefox and Safari are fine).
I found that Backbone calls fetch() to request the data, but parse() never runs when the data comes back, and I can't see a success or error callback firing either. Here is the relevant part of the code:
EMR.CategoriesCollection = Backbone.Collection.extend({
  url: 'contents/json/categories.txt',

  initialize: function () {
    console.log('Get data from:', this.url);
    this.fetch();
  },

  model: EMR.ItemModel,

  parse: function (data) {
    console.log("CategoriesCollection parse data:", [data]);
    return data;
  }
});
Please compare the two links below in IE7-9; they run the same set of code, only the server differs:
It works:
1) http://pms.dq.hk/clients/amex_eMR/client/cn/emr/test_data.html
It fails:
2) http://qwww.americanexpress.com/hk/cn/emr/test_data.html
If Backbone sends the fetch() request but parse() is never triggered, how do I trace the issue?
Any possible ways I can try?
Thanks for your suggestions.
Regards,
Michael
The only differences between the two that I can see are in the response headers.
your server:
Connection Keep-Alive
Date Thu, 31 May 2012 03:41:16 GMT
Etag "ae6691-123ac-59de3b00"
Keep-Alive timeout=15, max=98
Server Apache/2.0.63 (Unix) PHP/5.3.2 DAV/2
client server:
Connection keep-alive
Content-Type text/plain; charset=big-5
Date Thu, 31 May 2012 03:41:25 GMT
Last-Modified Tue, 22 May 2012 06:41:10 GMT
Vary Accept-Encoding
The Content-Type in the client server's response caught my eye, but I'm no IE expert. You might want to mention which version of IE you're having issues with and test some other versions to isolate it.
Any possible ways I can try?
Hello. Did you try adding an error callback to the fetch() method?
Also, the "failing" server sends the content gzipped. Maybe IE can't handle gzipped JSON correctly? Or you could try changing the Content-Type header from "text/plain" to "application/json".
The problem is here:
http://forum.jquery.com/topic/the-problem-with-ie8-and-encoding-error-c00ce56e
Our client's server returns the data with
Content-Type text/plain; charset=big-5
When I added an error handler to fetch(), I could see the error "Could not complete the operation due to error c00ce56e". That is why parse() never runs.
I can't change the client's server settings, so I put the data file in another directory that returns Content-Type: text/plain; without the big-5 charset, which fixed the problem.
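For anyone tracing a similar silent failure, here is a minimal sketch of how an error callback can be wired into fetch() so the c00ce56e error surfaces instead of nothing happening (callback signatures follow Backbone's fetch(options)):

EMR.CategoriesCollection = Backbone.Collection.extend({
  url: 'contents/json/categories.txt',
  model: EMR.ItemModel,

  initialize: function () {
    this.fetch({
      success: function (collection, response) {
        console.log('fetch succeeded:', response);
      },
      error: function (collection, response) {
        // In IE this is where "Could not complete the operation
        // due to error c00ce56e" showed up.
        console.log('fetch failed:', response);
      }
    });
  }
});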
Related
I have a site deployed to Heroku with which I use Cloudflare as a proxy.
For some front-end behaviour I'm using Vue with Vue-resource to perform some ajax requests. The responses are in the form of JSON with a Content-Type:application/json header.
Strangely, my scripts are erroring because the responses are not being parsed into a JavaScript array/object by vue-resource, even though that is the expected behaviour. The problem only occurs when the site is accessed via the Cloudflare URL. If I access it via the Heroku URL (e.g. foo-bar-1234.herokuapp.com) all works fine. It also works fine on my local dev environment (Vagrant, Nginx).
The problem also only occurs in Chrome. It works fine in Safari and Firefox regardless of whether it's Cloudflare or not.
I believe the problem must lie with something that CloudFlare is doing to the response from the Heroku server that is stopping Vue Resource in Chrome from properly parsing the response.
For reference here is the response from the CloudFlare-served request:
cache-control:no-cache
cf-ray:2d2f1980573d3476-LHR
content-encoding:gzip
content-type:application/json
date:Mon, 15 Aug 2016 19:37:10 GMT
server:cloudflare-nginx
set-cookie:sam_session=[redacted]; expires=Mon, 15-Aug-2016 21:37:10 GMT; Max-Age=7200; path=/; HttpOnly
status:200
via:1.1 vegur
x-ratelimit-limit:60
x-ratelimit-remaining:59
vs the heroku-served one:
Cache-Control:no-cache
Connection:keep-alive
Content-Type:application/json
Date:Mon, 15 Aug 2016 19:40:10 GMT
Server:Apache
Set-Cookie:sam_session=[redacted]; expires=Mon, 15-Aug-2016 21:40:10 GMT; Max-Age=7200; path=/; HttpOnly
Transfer-Encoding:chunked
Via:1.1 vegur
X-Ratelimit-Limit:60
X-Ratelimit-Remaining:58
The headers seem pretty similar so I can't think how the responses differ in a way that would cause this...
Any debug suggestions appreciated.
Update: The compiled JavaScript can be seen here: https://github.com/samarkanddesign/samarkand-web/blob/master/public/js/admin.js
It's a large file, so it may be easier to look at the source in the repo: https://github.com/samarkanddesign/samarkand-web/tree/master/resources/assets/js
Update 2: example of a vue-resource ajax request executed in both situations:
CloudFlare: [screenshot of the request]
Direct from Heroku: [screenshot of the request]
We can see in the Heroku example that the response data is resolved as an array of objects, whilst in the CloudFlare one it's a string.
I believe the problem lies with Vue-Resource and the fact that CloudFlare (serving over SPDY) returns lower-case response header names, whilst Heroku's Apache returns them capitalised.
Vue-Resource incorrectly ignores lowercase headers in its interceptors, so it never parses the response body.
This is referenced in this bug issue: https://github.com/vuejs/vue-resource/issues/317
And a quick way to patch it is via a custom interceptor:
Vue.http.interceptors.unshift(function (request, next) {
  next(function (response) {
    // Copy the lowercase header onto the capitalised key that
    // vue-resource's response handling looks for.
    if (typeof response.headers['content-type'] !== 'undefined') {
      response.headers['Content-Type'] = response.headers['content-type'];
    }
  });
});
I am somewhat new to working with http promise objects.
I am using AngularJS to fetch JSON from an API: http://www.asterank.com/api.
I have an angular controller making the call like so:
$http.jsonp('http://www.asterank.com/api/asterank?query={%22e%22:{%22$lt%22:0.1},%22i%22:{%22$lt%22:4},%22a%22:{%22$lt%22:1.5}}&limit=1&callback=JSON_CALLBACK')
  .success(function (data) {
    console.log('great success');
  })
  .error(function (r, t, et) {
    console.log(r);
    console.log(t);
    console.log(et);
  });
When I check out Chrome's network monitor I see the response:
HTTP/1.1 200 OK
Date: Sun, 06 Oct 2013 19:06:26 GMT
Content-Type: application/json
Transfer-Encoding: chunked
Connection: keep-alive
Vary: Accept-Encoding
Server: gunicorn/18.0
Set-Cookie: session=eyJkaXNjb3Zlcl9maXJzdF90aW1lIjpmYWxzZX0.BTNGMg.eF89vHEeIpLH8sZiJOwCAJEjPhA; HttpOnly; Path=/
Content-Encoding: gzip
But I am seeing the error method fire, never the success :(
Is this simply because the server does not support JSONP? How do you access the data of these APIs if they don't support JSONP but they support JSON?
Found a nice solution:
Just in case anyone comes across this and, like me, is using Express: you can create a simple little API on your own server using this package:
https://npmjs.org/package/request
That way I don't need to spin up a whole separate proxy server; the JSON data is requested through my own server instead.
The only problem with the original approach is that the site doesn't even declare JSONP support.
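For illustration, here is a minimal sketch of such a proxy endpoint, assuming an Express app and the request package (the route path and port are made up):

var express = require('express');
var request = require('request');   // https://npmjs.org/package/request
var qs = require('querystring');

var app = express();

// Hypothetical same-origin route that the Angular code can call instead of
// hitting the Asterank API directly.
app.get('/api/asterank', function (req, res) {
  var url = 'http://www.asterank.com/api/asterank?' + qs.stringify(req.query);
  // Stream the upstream JSON straight back to the browser; since the browser
  // only talks to our own origin, no JSONP is needed.
  request(url).pipe(res);
});

app.listen(3000);

On the client the call then becomes a plain same-origin GET, e.g. $http.get('/api/asterank', { params: { limit: 1 } }).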
How do you access the data of these APIs if they don't support JSONP
but they support JSON?
Write your own proxy layer on the backend.
Try like this:
var url = 'http://www.asterank.com/api/asterank?query={%22e%22:{%22$lt%22:0.1},%22i%22:{%22$lt%22:4},%22a%22:{%22$lt%22:1.5}}&limit=1&callback=json_callback';

$http({ method: 'GET', url: url })
  .success(function (data, status, headers, config) {
    console.log(data);
  })
  .error(function (data, status, headers, config) {
    console.log('API CALL ERROR: ' + status);
  });
I have a web application that lets the browser cache AJAX request results for a long time. I have found out how to make a request that bypasses the cache entirely when probable modifications are detected, but I would also like to let the user trigger a data refresh manually.
In this scenario, I'd like the browser to check with the server whether its cache is stale, but still use it if it is not (that is, if the server responds with a 304). The goal is to save loading time, because the data is huge.
The server includes the following headers in all responses:
Cache-Control: private, max-age=604800
Last-Modified: ... # actual last modification date
I managed to bust the cached object entirely in Chrome (not tested in other browsers yet) by using the following HTTP headers in the request:
Cache-Control: max-age=0
If-Modified-Since: Tue, 01 Jan 1970 01:00:00 +0100
The If-Modified-Since line is the one that really has an effect; Chrome seems to ignore the Cache-Control header in the request.
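For context, this is roughly how those request headers can be attached to a single jQuery request (the URL is made up, and as noted above Chrome may ignore the Cache-Control part):

$.ajax({
  url: '/api/huge-dataset',   // hypothetical endpoint
  headers: {
    'Cache-Control': 'max-age=0',
    'If-Modified-Since': 'Tue, 01 Jan 1970 01:00:00 +0100'
  },
  success: function (data) {
    console.log('received', data);
  }
});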
I have also found that using Cache-Control: must-revalidate in the server response forces the browser to validate its cache with the server for each request.
But is there any way to revalidate for just one precise request, decided on the client-side?
Note that I'm not especially attached to doing this with HTTP headers, so any other method I might not be aware of is welcome!
You can add a URL parameter whose value is based on the current time, to bypass the cache for just one precise request:
$.ajax({
  url: "/questions?nocache=" + Date.now(),
  success: function (data) {
    console.log(data);
  }
});
I am working on an AJAX long-polling type application, and I would like to minimize the amount of bandwidth I am using. One of the big costs right now is the client-side HTTP headers. Once I have a connection established and a session id stored on the client, I don't really want to squander any more bandwidth transferring redundant HTTP information (such as browser type, accepted encodings, etc.). Over the course of many connections, this quickly adds up to a lot of data!
I would really like to just take my XMLHttpRequest and nuke all of the headers so that only the absolute minimum gets transmitted to the server. Is it possible to do this?
You have very little control over request headers, but you can still do a few things -
Reduce the size of the cookie. In general, you only want the session id; everything else can be eliminated and stored server-side.
Minimize the Referer header by keeping a short URL. The longer your page URL, the more data will be sent in the Referer header. One trick is to store data in the fragment identifier (the portion of the URL after the #). The fragment identifier is never sent to the server, so you save a few bytes there.
Some request headers are only sent if you previously set the corresponding response headers. For example, the If-None-Match and If-Modified-Since request headers only appear if you sent ETag or Last-Modified in an earlier response, so you can control them indirectly.
You may want to consider Web Sockets. Support is pretty good (IE10+).
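A rough sketch of the WebSocket approach (the endpoint and payload are made up); after the initial HTTP handshake, each message carries only a few bytes of framing instead of a full set of request headers:

var socket = new WebSocket('wss://example.com/updates');   // hypothetical endpoint

socket.onopen = function () {
  // Identify the session once, instead of re-sending cookies on every poll.
  socket.send(JSON.stringify({ sessionId: 'abc123' }));
};

socket.onmessage = function (event) {
  console.log('server push:', event.data);
};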
You may be able to override some of the standard headers using setRequestHeader() before sending the request, but the browser may not allow overriding some of them, and there seems to be no way to get a list of the headers it will send (besides asking the server to echo them back to you) to know which ones to try to override.
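For example, something like this replaces the default Accept header with a shorter value (whether a given browser honours overrides of other headers varies, and forbidden headers such as Connection or Cookie cannot be touched at all):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/poll?sid=abc123', true);   // hypothetical endpoint
// Replace the (often long) default Accept header with a minimal one.
xhr.setRequestHeader('Accept', '*/*');
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    console.log(xhr.responseText);
  }
};
xhr.send();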
I think it's possible to remove all headers, at least in some browsers.
Take a look at the communication between the Gmail/Calendar apps and Google's backend in Chrome (it's not the same in Firefox); it's possible Google has some hidden API for the XMLHttpRequest object.
You'll see something like the output below (notice there is no request headers section):
Request URL:https://mail.google.com/mail/u/0/channel/bind?XXXXXXXXXXXXXX
Request Method:POST
Status Code:200 OK
Query String Parameters
OSID:XXXXXXXXXXXXX
OAID:XXXXXXXXX
VER:8
at:XXXXXXXXXXXXXX
it:30
SID:XXXXXXXXXXXX
RID:XXXXXXXXX
AID:XXXXXXXXXX
zx:XXXXXXXXXXXX
t:1
Request Payload
count=1&ofs=211&req0_type=cf&req0_focused=1&req0__sc=c
Response Headers
cache-control:no-cache, no-store, max-age=0, must-revalidate
content-encoding:gzip
content-type:text/plain; charset=utf-8
date:Tue, 09 Oct 2012 08:52:46 GMT
expires:Fri, 01 Jan 1990 00:00:00 GMT
pragma:no-cache
server:GSE
status:200 OK
version:HTTP/1.1
x-content-type-options:nosniff
x-xss-protection:1; mode=block
Hi, my problem is a bit weird:
My $.ajax success handler looks like this:
function (data) {
  alert(data);
}
Pretty simple, yeah?
The problem is that data IS ALWAYS 3 characters long, no matter what was sent by the server! The characters are 31, 65535 and 8 (using charCodeAt()).
In other browsers (even in IE8) everything works fine.
I've looked at xhr.responseText in the complete handler and get just the same result.
UPDATE
Full http response
HTTP/1.1 200 OK
Date: Sun, 07 Feb 2010 13:35:39 GMT
Server: Apache/2.2.12 (Ubuntu)
X-Powered-By: PHP/5.2.10-2ubuntu6.4
Set-Cookie: 1111111111111111=UjVXb1Q3WTdUIQ8jXmALbA88VzpRcVcgBzMDcldyUmtWawAyAFpQP1IwASEAbFh%2FDjoLZ1RiBWlWdwBnUGMHZlU2UGBTZFA5B2UMMlJgC29SbVdjVDRZOVRsDzReaQtuDzpXZVFjV2UHYwM1VzNSNlYzAG4AMVAwUjUBIQBsWH8OOgtlVGAFaVZ3AD5QIgdcVWVQNlNhUHIHMAwjUiQLL1JvVyZUOVk8VGkPal54C2wPNVcyUX1XYgdgAzlXL1IzVioAbQA3UG9SdgE4ACRYNg4xC2RUagVxViAAJFA3B3FVW1AzU2JQZQc7DCRSdQs2UidXb1Q2WT1UYA9yXhcLMg92V2lRP1c%2FBzcDLlc1UixWNAB8AC1QNVI7AW4AJ1htDnQLPVQyBT9WMABsUHIHTlVXUBtTQFAgB20MflJnCzZSdFcCVGpZY1Q%2BDz9eLQsuDyxXTlEHV3MHYAMvVzBSOVYmAGcAdlBsUmUBMABtWC4ObAs1VCMFJ1YKADZQMQd3VW1QJFNsUHQHLAxyUmwLflJuV2RUM1k3VHgPYV5oC2sPOVc1UWdXagdgAzBXOVIgVj8AIQ%3D%3D; expires=Tue, 07-Feb-2012 13:35:39 GMT; path=/
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 21
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html
1
UPDATE!
Well, the problem is temporarily(?) solved by disabling gzip in .htaccess.
The characters you get look weird, but could it be a BOM? It's none of the ones listed in the article, but maybe charCodeAt() is getting something wrong.
In that case, you would have to check the script you call to request the data. Try saving it explicitly without a BOM.
This is most of the time a problem with proper encoding of the requested content / displayed content.
Please verify that both use the same format.
Although AJAX is always done in UTF-8, some implementations are weird and don't allow passing non-ASCII characters; you have to change them to entities.
If you pass only one character and still get problems, it might be the BOM mentioned by Pekka. You would have to save your PHP file without any whitespace before the opening <?php tag.
The second thing is that you might be outputting something at the end of the PHP file.