API Data Returning Unicode Characters in Console - javascript

I have been facing a rather confusing problem for the last two days. I am working on a document management system that uses an API that pulls in data from SOLR. The data is to the tune of around ~15 MB, and pulls records for more than 4000 documents. The API response is in this format:
{
  "documents": [
    {
      "id": 123,
      "some_field": "abcd",
      "some_other_field": "abcdef"
    },
    {
      "id": 124,
      "some_field": "abcd1",
      "some_other_field": "abcdef1"
    }
  ]
}
Everything works fine in the browser. If I hit the endpoint in Chrome or Firefox, it gives me the correct output and I am able to see the JSON.
However, if I try hitting the same API endpoint from Java or JS code, the response code is 200, but the output in the console (Terminal or Eclipse) shows unicode characters like \u0089 \u0078 U+0080 - all the output comes out this way, and since there are more than 4000 records being fetched by the API, the console fills up with these unicode characters.
The only difference that I see between the requests made from the browser and from the code is that in the browser I can see Content-Encoding: gzip, while I cannot find this header when the request is made from my code. For example, in the JS code, through the Chakram framework, I can check
expect(response).to.be.encoded.with.gzip
mentioned here. However, this returns a failure stating expected undefined to match gzip
What am I missing here? Is this something related to encoding/decoding or something entirely different?
Edit 1: The response headers as seen in the Network tab of Chrome:
cache-control: max-age=0, private, must-revalidate, max-age=315360000
content-encoding: gzip
content-type: application/json; charset=utf-8
date: Tue, 22 May 2018 06:07:26 GMT
etag: "a07eb7c1eef4ab97699afc8d61fb9c5d"
expires: Fri, 19 May 2028 06:07:26 GMT
p3p: CP="NON CUR OTPi OUR NOR UNI"
server: Apache
Set-Cookie : some_cookie
status: 200 OK
strict-transport-security:
transfer-encoding: chunked
vary: Accept-Encoding
x-content-type-options: nosniff
x-frame-options: SAMEORIGIN
x-request-id: abceefr4-1234-acds-100b-d2bef2413r47
x-runtime: 3.213943
x-ua-compatible: chrome=1
x-xss-protection: 1; mode=block
The request headers as seen in the Network tab of Chrome:
Accept: application/json, text/plain, */*
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Cookie: some_cookie
Host: abcd.bcd.com
IV_USER: demouser123
IV_USER_L: demouser123
MAIL: demouser#f.com
PERSON_ID: 123
Referer: http://abcd.bcd.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36
X-CSRF-TOKEN: some_csrf_token
Edit 2: The tests that I am using
describe('Hits required API', () => {
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            console.log(JSON.stringify(t_resp));
            expect(t_resp).to.have.header('Content-Encoding', 'gzip');
        });
    });
});

This has nothing to do with encoding. Web servers generally compress responses with gzip to save bandwidth, since it is wasteful to transfer the whole 15 MB file as is (see this article for more about gzip and how it works: https://betterexplained.com/articles/how-to-optimize-your-site-with-gzip-compression/). So where does it go wrong, and why does it work in Chrome? That is pretty simple: Chrome has a built-in Unicode parser (even an HTML parser) in its DevTools, which shows you the parsed content rather than the weird text (the raw form can be seen in the Response tab, next to the Preview tab). The reason you see weird text is that you are stringifying the response, which escapes any special characters: console.log(JSON.stringify(t_resp)). You also cannot just do something like console.log("response", t_resp) without stringifying, because the terminal has no JSON or Unicode parser; it just prints text. Try removing that console.log, since stringifying a 15 MB payload is a costly operation.
Edit 1:
If you still want to output the response in the console, here is what to do.
Since Node cannot decode gzip directly by default (and neither can Chakram, which is just an API-testing framework), you can use zlib to do it. Please find an example snippet below:
const zlib = require('zlib');

describe('Hits required API', () => {
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            // t_resp.body must still hold the raw gzipped bytes here
            // (e.g. the request made with encoding: null), otherwise there is nothing to gunzip
            zlib.gunzip(t_resp.body, function (err, dezipped) {
                console.log(dezipped.toString());
            });
        });
    });
});
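Alternatively, a sketch (assuming Chakram forwards its second argument to the underlying request library, as its documentation describes) that lets the HTTP layer handle the decompression for you by adding gzip: true to the request options:

describe('Hits required API', () => {
    before(() => {
        // gzip: true asks the request library to send Accept-Encoding: gzip
        // and to transparently decompress the response body
        return chakram.wait(api_response = chakram.get(url, Object.assign({}, options, { gzip: true })));
    });
    it('has a decoded JSON body', () => {
        return api_response.then((t_resp) => {
            expect(t_resp).to.have.status(200);
            // t_resp.body should now be the decoded JSON rather than gzip bytes
        });
    });
});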

Try with console.dir to display your values
describe('Hits required API', () => {
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            console.dir(t_resp, { depth: null });
        });
    });
});
Console.dir

Related

Content-length header doesnt match request body on chunked upload

Hello,
I have a problem with chunked uploading to google-cloud-storage (gcs) using dropzone.js.
What I'm Trying To Do:
I want to upload a (bigger) file to google-cloud-storage via dropzone. Since it's quite a large file, I'm using dropzone's internal function to chunk (and upload) it. I'm trying to upload via a signed url that I'm creating, initializing and afterwards passing to dropzone so that it knows where to send the file. I'm also adding a Content-Range header to the request so that gcs keeps track of the current file-upload status.
Current Status:
When the upload starts, it seems to work. But after the first chunk finishes, dropzone tries to re-upload the first chunk, because the xhr response status is 400.
The Problem:
When I investigated the xhr request and response, it showed that the Content-Length and Content-Range headers do NOT describe the same size. That is also what the response text told me the error is.
My code in dropzone's .on("sending") handler:
let procChunks = file.upload.chunks; // Get all chunks processed until now
latestChunk = procChunks[procChunks.length-1]; // Select latest chunk
const chunkFirstByte = dz.options.chunkSize * latestChunk.index; // Calc first byte
const chunkLastByte = chunkFirstByte + (latestChunk.dataBlock.data.size-1); // Calc last byte
let header = "bytes "+chunkFirstByte+"-"+chunkLastByte +"/"+ (file.size-1); // Create header value
xhr.setRequestHeader("Content-Range", header); // Set header to xhr
Request Header:
PUT /upload/storage/v1/b/(myBucket)/o?uploadType=resumable&name=test2.fna&upload_id=(myUpLoadID) HTTP/2
Host: storage.googleapis.com
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0
Accept: application/json
Accept-Language: de,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate, br
Cache-Control: no-cache
X-Requested-With: XMLHttpRequest
Content-Range: bytes 0-8388607/13088352
Content-Type: multipart/form-data; boundary=---------------------------428770732237854445893641225227
Content-Length: 8388843
Origin: http://127.0.0.1:5000
Connection: keep-alive
Referer: http://127.0.0.1:5000/upload
TE: Trailers
The request body:
-----------------------------428770732237854445893641225227
Content-Disposition: form-data; name="file"; filename="test2.fna"
Content-Type: application/octet-stream
>NC_000913.3 Escherichia coli str. K-12 substr. MG1655, complete genome
AGCTTTTCATTCTGACTGCAACGGGCAATATGTCTCTGTGTGGATTAAAAAAAGAGTGTCTGATAGCAGCTTCTGAACTG
GTTACCTGCCGTGAGTAAATTAAAATTTTATTGACTTAGGTCACTAAATACTTTAACCAATATAGGCATAGCGCACAGAC
AGATAAAAATTACAGAGTACACAACATCCATGAAACGCATTAGCACCACCATTACCACCACCATCACCATTACCACAGGT
AACGGTGCGGGCTGACGCGTACAGGAAACACAGAAAAAAGCCCGCACCTGACAGTGCGGGCTTTTTTTTTCGACCAAAGG
TAACGAGGTAACAACCATGCGAGTGTTGAAGTTCGGCGGTACATCAGTGGCAAATGCAGAACGTTTTCTGCGTGTTGCCG
ATATTCTGGAAAGCAATGCCAGGCAGGGGCAGGTGGCCACCGTCCTCTCTGCCCCCGCCAAAATCACCAACCACCTGGTG
GCGATGATTGAAAAAACCATTAGCGGCCAGGATGCTTTACCCAATATCAGCGATGCCGAACGTATTTTTGCCGAACTTTT
GACGGGACTCGCCGCCGCCCAGCCGGGGTTCCCGCTGGCGCAATTGAAAACTTTCGTCGATCAGGAATTTGCCCAAATAA
AACATGTCCTGCATGGCATTAGTTTGTTGGGGCAGTGCCCGGATAGCATCAACGCTGCGCTGATTTGCCGTGGCGAGAAA
ATGTCGATCGCCATTATGGCCGGCGTATTAGAAGCGCGCGGTCACAACGTTACTGTTATCGATCCGGTCGAAAAACTGCT
GGCAGTGGGGCATTACCTCGAATCTACCGTCGATATTGCTGAGTCCACCCGCCGTATTGCGGCAAGCCGCATTCCGGCTG
ATCACATGGTGCTGATGGCAGGTTTCACCGCCGGTAATGAAAAAGGCGAACTGGTGGTGCTTGGACGCAACGGTTCCGAC
TACTCTGCTGCGGTGCTGGCTGCCTGTTTACGCGCCGATTGTTGCGAGATTTGGACGGACGTTGACGGGGTCTATACCTG
CGACCCGCGTCAGGTGCCCGATGCGAGGTTGTTGAAGTCGATGTCCTACCAGGAAGCGATGGAGCTTTCCTACTTCGGCG
CTAAAGTTCTTCACCCCCGCACCATTACCCCCATCGCCCAGTTCCAGATCCCTTGCCTGATTAAAAATACCGGAAATCCT
CAAGCACCAGGTACGCTCATTGGTGCCAGCCGTGATGAAGACGAATTACCGGTCAAGGGCATTTCCAATCTGAATAACAT
GGCAATGTTCAGCGTTTCTGGTCCGGGGATGAAAGGGATGGTCGGCATGGCGGCGCGCGTCTTTGCAGCGATGTCACGCG
CCCGTATTTCCGTGGTGCTGATTACGCAATCATCT
Response Text:
Invalid request. There were 8388843 byte(s) (or more) in the request body. There should have been 8388608 byte(s) (starting at offset 0 and ending at offset 8388607) according to the Content-Range header.
As you can maybe see, the first 4 lines are NOT from the file.
Additional Information:
The file is pure text data (UTF-8 encoded, I guess).
I'm chunking in 8 MB chunks (recommended by Google).
Somewhere in Google's API guide I read that when uploading via PUT there should/must be no other data except the file data.
The Content-Length header gets added automatically by dropzone.
My Questions:
Where do those 4 lines come from?
Can I (re)set the xhr request body?
Is my problem maybe caused by some formatting problem of the file data (like bytes vs. strings)?
In case that's not the problem - do you have any other idea what could be the problem?
THANK YOU VERY MUCH!
Any help appreciated! If you need more information, please ask!
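For what it's worth, the difference reported above (8388843 − 8388608 = 235 bytes) is roughly the size of the multipart boundary line and part headers visible in the request body, which points at the multipart/form-data wrapper rather than the file itself. Below is a minimal, hypothetical sketch (plain XMLHttpRequest, not dropzone-specific) of PUTting one raw chunk so that the body length and the Content-Range agree, assuming sessionUrl is the resumable-upload URL created server-side:

// Sketch only: send the chunk bytes as the raw PUT body, with no FormData wrapper,
// so Content-Length equals exactly the byte count named in Content-Range.
function putChunk(sessionUrl, file, chunkSize, index) {
    var first = chunkSize * index;                          // first byte of this chunk
    var last = Math.min(first + chunkSize, file.size) - 1;  // last byte of this chunk
    var blob = file.slice(first, last + 1);                 // the raw bytes

    var xhr = new XMLHttpRequest();
    xhr.open("PUT", sessionUrl);
    // the total after the slash is the full file size in bytes
    xhr.setRequestHeader("Content-Range", "bytes " + first + "-" + last + "/" + file.size);
    xhr.send(blob);                                         // body size === last - first + 1
    return xhr;
}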

Firefox is parsing empty response payload

We're making an XHR request with the following headers (I've simplified a bit):
POST http://localhost:9001/login
Host: localhost:9001
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:51.0) Gecko/20100101 Firefox/51.0
Accept: application/json, text/plain, */*
Content-Type: application/json;charset=utf-8
Content-Length: 67
Then our server responds like this (again simplified):
Status code: 200 OK
Cache-Control: no-cache, no-store
Connection: close
Content-Length: 0
Date: Mon, 27 Feb 2017 17:19:53 GMT
Server: WildFly/9
Set-Cookie: JSESSIONID=123; path=/
There's no payload in the response. Note the Content-Length: 0. But Firefox still tries to parse it as XML. And outputs the following error to the console:
XML Parsing Error: no root element found
Location: http://localhost:9001/login
Line Number 1, Column 1
Note that the server doesn't send a content-type header, and according to RFC 7231 it only has to send a content-type header when there is actual content.
So is this a bug in Firefox or is my research faulty?
Reproducing it yourself
I've written a small server and client to reproduce the problem.
server.js (start with node ./server.js):
const fs = require('fs'), http = require('http');
const server = http.createServer(function (request, response) {
if (request.method === 'POST') {
// send empty response
response.end();
return;
}
// send file content
fs.readFile('.' + request.url, function (error, content) {
response.writeHead(200, { 'Content-Type': request.url === '/index.html' ? 'text/html' : 'text/javascript' });
response.end(content);
});
}).listen(8080);
index.html
<script src="client.js"></script>
client.js
var xhr = new XMLHttpRequest();
xhr.open('POST', 'http://localhost:8080/login');
xhr.send();
When opening the URL http://localhost:8080/index.html in Firefox, there will be an error in the JavaScript console.
Firefox Version 51.0.1
There's no payload in the response. Note the Content-Length: 0
That means there is a payload, and that payload is 0 bytes in size. Consider the difference between null and "" as strings. What you have here is the equivalent of "" when you want the equivalent of null.
Set the status code to 204 rather than 200 to indicate you aren't sending an entity (and remove the content-type, since there's no content-type with no entity).
(For a long time Firefox would still log an error for this case, but thankfully this is finally fixed. Even when it did log the error it would still continue to run any scripts correctly).
Judging from the POST request it seems that you are using XHR/JS to send the request.
So the problem is likely in the code that is processing the result.
(Also, the Content-Type in the request is incorrect, there's no charset parameter on application/json)
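A minimal tweak to the reproduction server above (just a sketch) that applies this advice, answering the POST with 204 No Content and no body:

const fs = require('fs'), http = require('http');
const server = http.createServer(function (request, response) {
    if (request.method === 'POST') {
        // 204 tells the browser there is no entity at all, so nothing gets parsed
        response.writeHead(204);
        response.end();
        return;
    }
    // send file content
    fs.readFile('.' + request.url, function (error, content) {
        response.writeHead(200, { 'Content-Type': request.url === '/index.html' ? 'text/html' : 'text/javascript' });
        response.end(content);
    });
}).listen(8080);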

Express JS 4.0, serve binary data, request Accept header changes output

Thanks in advance.
Short:
Express JS 4.0 alters the output data, due to the Accept headers in the request.
Is there a way for me to override this behaviour and just write the same data regardless of the request headers?
When Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8 is present, the output is changed.
Is there a way I can ignore, remove, override these headers.
Long (probably tl;dr):
I am trying to serve binary data from a Node/ExpressJS app.
I am storing a compressed log file (plain text) that has been gzipped, base64 encoded and sent to my server app, where it is stored in a mongo database using mongoose. I know this is probably not optimal, but it is currently a necessary evil. This part is working fine.
$(gzip --stdout /var/log/cloud-init-script.log | base64 --wrap=0)
Is being used to compress and base64 the data, before it is sent with other data as part of a json post.
The problem occurs when I attempt to retrieve and decode the base64 encoded string and send it to the browser as a binary gzip file.
// node, referring to the machine the log came from
var log = new Buffer(node.log, 'base64');
res.setHeader('Content-Disposition', 'attachment; filename=' + node.name + "-log.gz");
res.setHeader('Content-Type', 'application/x-gzip');
res.setHeader('Content-Length', log.length);
console.log(log.toString('hex'));
// res.end(log, 'binary'); I tried this hoping I could bypass some content negotiation
res.send(log);
I had this working when using ExpressJS 3.0 using res.send.
But when I updated to ExpressJS 4.0, the downloaded data ceased to extract properly; the data being pulled down was seemingly corrupted somehow.
I started to try and fix this by comparing the downloaded file and the source file in hexadecimal output using xxd or od, and found that the downloaded file was different to the source. I also dumped the hex of the NodeJS Buffer to the console just before it is sent to the client, and this matches the source.
I have been banging my head against this issue for nearly a day now, and have suspected that NodeJS might be doing something funky with character encoding (UTF-8 vs. Buffer vs. UTF-16 strings) or OS endianness.
Eventually, finding none of this to be the problem, I assumed NodeJS had always been outputting the wrong data to the browser, which was correct, except that it wasn't "always" outputting the wrong data.
I had a breakthrough when I did a curl request to the endpoint and the data came through as expected (matching the source). I then added the request headers that were sent with my browser requests, and got back the mangled data.
Actual log file:
I'm a log file
Good Request:
> User-Agent: curl/7.37.1
> Host: 127.0.0.1:9000
> Accept: */*
>
< HTTP/1.1 200 OK
< X-Powered-By: Express
< Last-Modified: Tue, 26 May 2015 11:47:46 GMT
< Content-Description: File Transfer
< Content-Disposition: attachment; filename=test-log.gz
< Content-Type: application/x-gzip
< Content-Transfer-Encoding: binary
< Content-Length: 57
< Date: Tue, 26 May 2015 11:47:46 GMT
< Connection: keep-alive
0000000: 1f8b 0808 0256 6455 0003 636c 6f75 642d .....VdU..cloud-
0000010: 696e 6974 2d73 6372 6970 742e 6c6f 6700 init-script.log.
0000020: f354 cf55 4854 c8c9 4f57 48cb cc49 e502 .T.UHT..OWH..I..
0000030: 003b 5ff5 5f0f 0000 00 .;_._....
Bad Request:
> Host: localhost:9000
> Connection: keep-alive
> Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.65 Safari/537.36
> Referer: http://localhost:9000/nodes?query=environment%3D5549b6cbdc023b5e26fe6bd4%20type%3Dnat
> Accept-Language: en-US,en;q=0.8
>
< HTTP/1.1 200 OK
< X-Powered-By: Express
< Last-Modified: Tue, 26 May 2015 11:47:00 GMT
< Content-Description: File Transfer
< Content-Disposition: attachment; filename=test-log.gz
< Content-Type: application/x-gzip
< Content-Transfer-Encoding: binary
< content-length: 57
< Date: Tue, 26 May 2015 11:47:00 GMT
< Connection: keep-alive
0000000: 1ffd 0808 0256 6455 0003 636c 6f75 642d .....VdU..cloud-
0000010: 696e 6974 2d73 6372 6970 742e 6c6f 6700 init-script.log.
0000020: fd54 fd55 4854 fdfd 4f57 48fd fd49 fd02 .T.UHT..OWH..I..
0000030: 003b 5ffd 5f0f 0000 00 .;_._....
The fix was to use
res.end(node.log, 'base64');
instead of
res.send(log);
where node.log is the raw base64 encoded string and log was a Buffer that had decoded that string.
Bearing in mind I am using Node v0.10.38.
I ended up following the function call chain.
// I call
res.send(log);
// ExpressJS calls on http.ServerResponse
this.end(chunk, encoding); // chunk = Buffer, encoding = undefined
// NodeJS http.ServerResponse calls
res.inject(string);
At this point NodeJS appears to be treating the data as a string, which is where the buffer contents were being mangled.
This behaviour was different when the 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8' header was not present; a different end(chunk, encoding) function was called in that case, not using res.inject and not mangling the Buffer data.
I am not entirely sure where the content negotiation is happening and what is swapping in the different res.end functions, whether this is NodeJS or ExpressJS, but it would be nice to be able to control this content negotiation in some simple way.
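For completeness, a sketch of the original handler with that fix applied (same headers as before, but ending the response directly with the base64 string and its encoding so the bytes are not run through res.send's negotiation):

// node refers to the machine the log came from, as in the question
var log = new Buffer(node.log, 'base64');   // decoded only to know the byte length
res.setHeader('Content-Disposition', 'attachment; filename=' + node.name + '-log.gz');
res.setHeader('Content-Type', 'application/x-gzip');
res.setHeader('Content-Length', log.length);
// res.end() writes the data directly; res.send() ended up treating the Buffer
// as a string (via res.inject) and mangling it
res.end(node.log, 'base64');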

Ajax GET request of URL fails, but hurl.it GET request of same URL works. What gives?

I'm trying to access the contents of a web page as part of the functionality of a javascript Chrome extension I'm creating and have encountered a bit of a puzzle.
When accessing the resource as part of an Ajax GET request in my extension's script, the resource does not correctly load. When I attempt the same GET request through hurl.it (with follow-redirects on and no header modifications), the data loads. It also works if I just enter the URL in my browser.
I can't figure out what hurl.it is doing differently in its request and how to fix my own.
The URL I'm trying to query is in the form of (with 'Hello' as an example search):
http://overdrive.dclibrary.org/BANGSearch.dll?Type=FullText&PerPage=24&URL=SearchResults.htm&Sort=SortBy%3DRelevancy&FullTextField=All&FullTextCriteria=Hello&x=0&y=0%22
This would appear in my script like:
var get_url = "http://overdrive.dclibrary.org/BANGSearch.dll?Type=FullText&PerPage=24&URL=SearchResults.htm&Sort=SortBy%3DRelevancy&FullTextField=All&FullTextCriteria=Hello&x=0&y=0"
$.get(get_url,function(data){
console.log(data);
});
To be clear, the above code will output HTML to the console. It just doesn't include any of the results from the search query.
Thanks in advance!
Edit: My extension has been given permission to access anything in the dclibrary.org domain.
There's a redirect in the response, which can break a browser, although HTTP/1.1 302 Moved Temporarily works in Chrome. Why is it returning a 302 response? With ASP.NET/IIS, unauthorized (not logged in) users are sometimes redirected with a 302 response. This is bad behavior; if login is needed, it should issue a 403, but obviously that is not the intent here.
Use Fiddler (or Chrome DevTools) to debug the full headers.
HTTP/1.1 302 Moved Temporarily
Cache-Control: private
Content-Type: text/html; charset=utf-8
Location: http://overdrive.dclibrary.org/4F2C56FC-9659-4640-9E00-9433B3CC9A7F/10/50/en/SearchResults.htm?searchid=38432201s&sortby=Relevancy
Server: Microsoft-IIS/7.0
X-AspNetMvc-Version: 4.0
X-AspNet-Version: 4.0.30319
Set-Cookie: Session=4F2C56FC-9659-4640-9E00-9433B3CC9A7F|0|0|50|en; expires=Tue, 01-Jan-2030 00:00:00 GMT; path=/; HttpOnly
X-Powered-By: ASP.NET
Date: Fri, 06 Mar 2015 20:53:31 GMT
Content-Length: 12484
Connection: keep-alive
Keep-Alive: 60
Via: HTTP/1.1 proxy2444
<html><head><script type="text/javascript">/* ...New Relic browser-agent bootstrap script omitted for brevity... */</script><title>Object moved</title></head><body>
<h2>Object moved to here.</h2>
</body></html>
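If you want to see the redirect from the extension itself rather than from Fiddler, a small sketch using fetch (which reports whether a redirect was followed and the final URL):

// Sketch: check whether the search URL redirects before the HTML comes back
fetch(get_url, { credentials: 'include' })
    .then(function (response) {
        console.log('redirected:', response.redirected);  // true if a 302 was followed
        console.log('final URL:', response.url);          // where the request ended up
        return response.text();
    })
    .then(function (html) {
        console.log(html.length, 'characters of HTML');
    });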

Ajax response is gzip compressed - Prototype, Firefox can't handle it

I'm trying to query a web service (with JavaScript, prototype). The server responds with XML, but compresses it; headers are set appropriately.
Under Safari 4, everything is fine. The response is decompressed and JavaScript can deal with the data.
Under Firefox 3.5.8, no data is returned to JavaScript!
Code:
var req = new Ajax.Request(this.url, {
asynchronous: false,
contentType: 'text/xml',
method: 'post',
postBody: xmlString,
onSuccess: function(t) {
// debug, place response into textarea to show
$('responseText').value = t.responseText;
}
});
This is the response, as I trace it on the network:
HTTP/1.1 200 OK.
Date: Fri, 05 Mar 2010 14:10:51 GMT.
Server: Apache/2.2.9 (Debian) PHP/5.2.6-1+lenny6 with Suhosin-Patch.
X-Powered-By: PHP/5.2.6-1+lenny6.
Vary: Accept-Encoding.
Content-Encoding: gzip.
Content-Length: 2104.
Keep-Alive: timeout=15, max=100.
Connection: Keep-Alive.
Content-Type: text/xml.
.
............]s......W`.3...H&A.$.Q.^[.:....... (and so on ...)
Any idea why this is happening? What can I do about it?
I tried setting the 'Accept-Encoding' header in the request, but can't get it working properly. Besides, the response can be rather large, meaning it's good that it is compressed by the server.
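One way to narrow this down, a sketch using Prototype's standard callbacks, is to add onFailure and onException handlers and log the transport status, to see whether Firefox is routing the compressed response to an error path instead of onSuccess:

var req = new Ajax.Request(this.url, {
    asynchronous: false,
    contentType: 'text/xml',
    method: 'post',
    postBody: xmlString,
    onSuccess: function(t) {
        console.log('success, status', t.status, 'length', (t.responseText || '').length);
        $('responseText').value = t.responseText;
    },
    onFailure: function(t) {
        // fires for non-2xx responses
        console.log('failure, status', t.status);
    },
    onException: function(request, e) {
        // fires if Prototype throws while handling the response
        console.log('exception', e);
    }
});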
