json default charset API and browser display - javascript

My API returns the following JSON:
HTTP/1.1 200 OK
Date: Fri, 30 Aug 2013 14:14:32 GMT
Server: Apache/2.2.22 (Ubuntu)
Content-Length: 431
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/json
{
    "member": [
        {
            "city": "Li\u00e8ge",
            "country": "Belgi\u00eb"
        }
    ]
}
Following the spec (JSON specs):
JSON text SHALL be encoded in Unicode. The default encoding is UTF-8.
So the Content-Type is not specifying the charset.
But Chrome and IE10 are not displaying the è (\u00e8) and ë (\u00eb) correctly.
Is this because Chrome or IE10 have a different default charset? I need to give the API to a customer, and he will also see the literal strings "Belgi\u00eb" and "Li\u00e8ge". Am I violating the spec somehow, or am I doing it right and the browser is being stupid?

You can specify the charset if you want:
contentType: 'application/json; charset=utf-8'
And also use utf8_decode (or similar) at the other end, just to make sure.
Works here.
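For illustration, a minimal jQuery sketch of what that option looks like on the client side; the endpoint URL is a placeholder, and only the member field matches the JSON shown in the question:

// Hypothetical endpoint; substitute your own API URL.
$.ajax({
    url: '/api/members',
    type: 'GET',
    dataType: 'json',                               // tell jQuery to parse the body as JSON
    contentType: 'application/json; charset=utf-8', // charset stated explicitly on the request
    success: function (data) {
        // Once parsed as JSON, \u00e8 and \u00eb become real characters.
        console.log(data.member[0].city);           // "Liège"
    }
});

Note that contentType here only describes the request body; the charset the browser uses to decode the response still comes from the server's own Content-Type header, which is why adding charset=utf-8 on the server side helps as well.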

Related

API Data Returning Unicode Characters in Console

I am facing a rather confusing problem for the last two days. I am working on a document management system that uses an API that pulls in data from SOLR. The data is to the tune of around 15 MB, and pulls records for more than 4000 documents. The API response is in this format:
{
    "documents": [
        {
            "id": 123,
            "some_field": "abcd",
            "some_other_field": "abcdef"
        },
        {
            "id": 124,
            "some_field": "abcd1",
            "some_other_field": "abcdef1"
        }
    ]
}
Everything works fine in the browser. If I hit the endpoint in Chrome or Firefox, it gives me the correct output and I am able to see the JSON.
However, if I try hitting the same API endpoint from Java or JS code, the response code is 200, but the output in the console (Terminal or Eclipse) shows unicode characters like \u0089 \u0078 U+0080. All the output comes out this way, and since there are around 4000+ records being fetched by the API, the console basically fills up with these characters.
The only difference I see between the request made from the browser and the one made from code is that in the browser I can see Content-Encoding: gzip, while I cannot find this header for the request made from my code. For example, in JS code, through the Chakram framework, I can check
expect(response).to.be.encoded.with.gzip
as mentioned here. However, this returns a failure stating expected undefined to match gzip.
What am I missing here? Is this related to encoding/decoding, or something else entirely?
Edit 1: The response headers as seen in the Network tab of Chrome:
cache-control: max-age=0, private, must-revalidate, max-age=315360000
content-encoding: gzip
content-type: application/json; charset=utf-8
date: Tue, 22 May 2018 06:07:26 GMT
etag: "a07eb7c1eef4ab97699afc8d61fb9c5d"
expires: Fri, 19 May 2028 06:07:26 GMT
p3p: CP="NON CUR OTPi OUR NOR UNI"
server: Apache
Set-Cookie: some_cookie
status: 200 OK
strict-transport-security:
transfer-encoding: chunked
vary: Accept-Encoding
x-content-type-options: nosniff
x-frame-options: SAMEORIGIN
x-request-id: abceefr4-1234-acds-100b-d2bef2413r47
x-runtime: 3.213943
x-ua-compatible: chrome=1
x-xss-protection: 1; mode=block
The request headers as seen in the Network tab of Chrome:
Accept: application/json, text/plain, */*
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Cookie: some_cookie
Host: abcd.bcd.com
IV_USER: demouser123
IV_USER_L: demouser123
MAIL: demouser#f.com
PERSON_ID: 123
Referer: http://abcd.bcd.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36
X-CSRF-TOKEN: some_csrf_token
Edit 2: The tests that I am using:
describe('Hits required API', () => {
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            console.log(JSON.stringify(t_resp));
            expect(t_resp).to.have.header('Content-Encoding', 'gzip');
        });
    });
});
This has nothing to do with encoding. The web server generally compresses responses with gzip to save bandwidth, since it is wasteful to transfer the whole 15 MB file as-is (see this article for more about gzip and how it works: https://betterexplained.com/articles/how-to-optimize-your-site-with-gzip-compression/).

So where did it go wrong, and why did it work in Chrome? That's pretty simple: Chrome's DevTools have a built-in unicode parser (even an HTML parser) that shows you the parsed content rather than the weird text (the same can be seen in the Response tab next to the Preview tab). The reason you see weird text is that you are stringifying the response, which escapes special characters if there are any: console.log(JSON.stringify(t_resp)). You cannot use something like console.log("response", t_resp) without stringifying in a terminal either, since the terminal has no JSON or unicode parser; it just prints text. Try removing that console call, since stringifying a 15 MB file is a costly operation.
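To make the distinction concrete, here is a tiny self-contained Node sketch (purely illustrative, using synchronous zlib calls) of what gzipped bytes look like when printed raw versus after decompression:

const zlib = require('zlib');

// Compress a small JSON string, the way the web server would before sending it.
const compressed = zlib.gzipSync('{"documents":[{"id":123}]}');

console.log(compressed);
// <Buffer 1f 8b 08 00 ...>, which is just noise when printed in a terminal

console.log(zlib.gunzipSync(compressed).toString());
// {"documents":[{"id":123}]}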
Edit 1:
If you still want to output it in the console, here's what needs to be done.
Since Node cannot decode gzip directly by default (and not with Chakram either, it's just an API testing platform), you can use zlib to do this. Please find the example snippet:
const zlib = require('zlib');

describe('Hits required API', () => {
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            // gunzip expects the raw (still compressed) body as a Buffer here
            zlib.gunzip(t_resp, function (err, dezipped) {
                console.log(dezipped.toString()); // dezipped is a Buffer; toString() makes it readable
            });
        });
    });
});
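Alternatively, Chakram is built on the request library, which can negotiate and decompress gzip itself. A minimal sketch, assuming the options object is passed through to request unchanged and that the decoded payload ends up on t_resp.body (worth verifying against your Chakram version):

// Sketch only: with gzip: true, request sends Accept-Encoding: gzip
// and transparently decompresses the response body for you.
const options = { gzip: true };

describe('Hits required API', () => {
    let api_response;
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            console.log(t_resp.body); // assumed to hold the decoded JSON payload
        });
    });
});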
Try console.dir to display your values:
describe('Hits required API', () => {
    before(() => {
        return chakram.wait(api_response = chakram.get(url, options));
    });
    it('displays response', () => {
        return api_response.then((t_resp) => {
            // depth: null prints nested objects fully instead of collapsing them
            console.dir(t_resp, { depth: null });
        });
    });
});
Console.dir

Get data with $resource from Flask API in AngularJS

I'm using a Flask REST API with a method that I call from AngularJS via a $resource factory.
For now I'm using test data to try it out; this is the simplified code on the server:
return jsonify({'teacher': 'Tom'})
And in AngularJS, in a controller where I want to get data:
var teacher = $resource('http://localhost:8001/entities/teacher/:id');
teacher.get({id: 1}).$promise.then(function(teacher) {
// success
$scope.teacher = teacher;
}, function(errResponse) {
// fail
console.log('fail')
console.log(errResponse)
});
And this is the log in web browser console:
fail
Object { data: null, status: -1, headers: fd/<(), config: Object, statusText: "" }
However, I can see the request in the network console, with data in the response and a 200 status code. Because of this, I think I just don't know how to read this data from the $resource response. I've searched for info about this, but I don't understand why, if I can see the data, the resource gives me a fail error.
Any idea?
I know that I should have this code in a service in another file; it's only a sample to find the solution.
Thanks!
Update:
I think the problem is in the headers that I receive from the server. If I use $http with a URL that I found as an example, it works perfectly, but when I try the same with my own server it fails, even though I can see the response (and it's fine in the network console and fine with a curl test). I think the problem is in the headers; maybe $http or $resource needs specific headers. I've updated the original question, and I hope someone can help me. I'm sure the response is application/json.
This is the test url:
http://jsonplaceholder.typicode.com/users
and the headers that are returned are:
Cache-Control: public, max-age=14400
Content-Encoding: gzip
Content-Type: application/json; charset=utf-8
Date: Mon, 03 Oct 2016 07:43:11 GMT
Etag: W/"160d-MxiAGkI3ZBrjm0xiEDfwqw"
Expires: Mon, 03 Oct 2016 11:43:11 GMT
Pragma: no-cache
Server: cloudflare-nginx
Vary: Accept-Encoding
Via: 1.1 vegur
X-Firefox-Spdy: h2
X-Powered-By: Express
access-control-allow-credentials: true
cf-cache-status: HIT
cf-ray: 2ebec303f9f72f7d-MAD
x-content-type-options: nosniff
And my headers are:
Cache-Control: no-cache
Content-Length: 35
Content-Type: application/json
Date: Mon, 03 Oct 2016 08:26:00 GMT
Expires: Fri, 01 Jan 1990 00:00:00 GMT
Link:
Server: Development/2.0
I'm using the Google App Engine development server (I don't know if this is relevant).
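For reference, a minimal sketch of making the same call with $http and logging the whole error object (the URL matches the $resource example above; the handlers are only illustrative), which can help narrow down whether the problem is in $resource itself or in how the browser treats the response:

// Sketch only: same endpoint as the $resource example, called via $http.
$http.get('http://localhost:8001/entities/teacher/1')
    .then(function (response) {
        console.log('status:', response.status);     // expect 200
        console.log('headers:', response.headers()); // compare with the header lists above
        $scope.teacher = response.data;               // should be {teacher: 'Tom'}
    }, function (errResponse) {
        // In AngularJS, status -1 with data: null generally means the request
        // never completed from the browser's point of view, e.g. it was blocked
        // before a readable response arrived.
        console.log('failed with status', errResponse.status, errResponse.statusText);
    });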

JSON response in Firefox

I'm making a request in my JS and getting back a JSON response from the server just fine in Chrome and IE, but for some reason the response I get in Firefox isn't working. It looks like it isn't encoded as UTF-8, but I know that it is on the server, and all of my files are as well. The response I get in FF looks like:
incoming Text ��[�{�"�m�e�t�a�d...
(I cut off the vast majority of it, but there is a question mark between every character.) The console then says: JSON.parse: unexpected character at line 1 column 1 of the JSON data
My .ajax call is as follows:
var locationURL = "https://myData.com/Metadata.json";
$.support.cors = true;
var jsonData = $.ajax({
    url: locationURL,
    method: "GET",
    dataType: "json",
    cache: false
});
The response headers I get back (from Firebug) are:
Accept-Ranges: bytes
Access-Control-Allow-Orig...: *
Connection: Keep-Alive
Content-Length: 81950
Content-Type: application/json; charset=utf-8
Date: Mon, 10 Aug 2015 14:03:51 GMT
Etag: "c80b2-1401e-51c80111c1d00"
HSID: Q01-21C D=601
Keep-Alive: timeout=5, max=100
Last-Modified: Tue, 04 Aug 2015 18:00:52 GMT
Server: Apache/2.4.10 (Unix) OpenSSL/1.0.1j
Again, in Chrome and IE the response I'm getting is perfectly valid JSON, without all those extra characters. I see in the console that Chrome makes the request to the URL exactly as I typed it, but in the Firefox console it adds ?_=1439215432794 to the end of the URL; might this have something to do with it? How do I get it to not add that? I put a # at the end of the URL when saving it inside locationURL, but that didn't make a difference. I also tried jQuery's getJSON instead of .ajax, but that didn't help either. Any help or direction would be appreciated! Thanks.
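As a side note on the ?_=1439215432794 part: that parameter is jQuery's cache-busting token, appended to GET requests when cache: false is set, so removing that option removes the parameter. A minimal sketch (the overrideMimeType call is an extra experiment to force UTF-8 interpretation of the bytes, not something from the original post):

var locationURL = "https://myData.com/Metadata.json";

$.ajax({
    url: locationURL,
    method: "GET",
    dataType: "json",
    // cache defaults to true for plain GET requests, so no "_=" timestamp is added
    beforeSend: function (jqXHR) {
        // Ask the browser to treat the body as UTF-8 JSON regardless of
        // what it guesses from the bytes on the wire.
        jqXHR.overrideMimeType("application/json; charset=utf-8");
    }
}).done(function (data) {
    console.log(data);
}).fail(function (jqXHR, textStatus, errorThrown) {
    console.log(textStatus, errorThrown);
});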

Express JS 4.0, serve binary data, request Accept header changes output

Thanks in advance.
Short:
Express JS 4.0 alters the output data due to the Accept headers in the request.
Is there a way for me to override this behaviour and just write the same data regardless of the request headers?
When Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8 is present, the output is changed.
Is there a way I can ignore, remove, or override these headers?
Long (probably tl;dr):
I am trying to serve binary data from a Node/ExpressJS app.
I am storing a compressed log file (plain text) that has been gzipped, base64 encoded, and sent to my server app, where it is stored in a Mongo database using Mongoose. I know this is probably not optimal, but it is currently a necessary evil. This part works fine.
$(gzip --stdout /var/log/cloud-init-script.log | base64 --wrap=0)
is being used to compress and base64-encode the data before it is sent, along with other data, as part of a JSON POST.
The problem occurs when I attempt to retrieve the log, decode the base64-encoded string, and send it to the browser as a binary gzip file.
// "node" refers to the machine the log came from
var log = new Buffer(node.log, 'base64');
res.setHeader('Content-Disposition', 'attachment; filename=' + node.name + "-log.gz");
res.setHeader('Content-Type', 'application/x-gzip');
res.setHeader('Content-Length', log.length);
console.log(log.toString('hex'));
// res.end(log, 'binary'); // tried this hoping I could bypass some content negotiation
res.send(log);
I had this working in ExpressJS 3.0 using res.send.
But when I updated to ExpressJS 4.0, the downloaded data ceased to extract properly; the data being pulled down was seemingly corrupted somehow.
I started trying to fix this by comparing the downloaded file and the source file as hexadecimal output using xxd or od, and found that the downloaded file differed from the source. I also dumped the hex of the NodeJS Buffer to the console just before it is sent to the client, and this matches the source.
I have been banging my head against this issue for nearly a day now, suspecting that NodeJS might be doing something funky with character encoding (UTF-8 vs. Buffer vs. UTF-16 strings) or OS endianness.
Eventually, finding none of this to be the problem, I assumed NodeJS had been outputting the wrong data to the browser all along; it was indeed outputting the wrong data, but not always.
I had a breakthrough when I made a curl request to the endpoint and the data came through as expected (matching the source). I then added the request headers that were sent with my browser requests, and got back the mangled data.
Actual log file:
I'm a log file
Good Request:
> User-Agent: curl/7.37.1
> Host: 127.0.0.1:9000
> Accept: */*
>
< HTTP/1.1 200 OK
< X-Powered-By: Express
< Last-Modified: Tue, 26 May 2015 11:47:46 GMT
< Content-Description: File Transfer
< Content-Disposition: attachment; filename=test-log.gz
< Content-Type: application/x-gzip
< Content-Transfer-Encoding: binary
< Content-Length: 57
< Date: Tue, 26 May 2015 11:47:46 GMT
< Connection: keep-alive
0000000: 1f8b 0808 0256 6455 0003 636c 6f75 642d .....VdU..cloud-
0000010: 696e 6974 2d73 6372 6970 742e 6c6f 6700 init-script.log.
0000020: f354 cf55 4854 c8c9 4f57 48cb cc49 e502 .T.UHT..OWH..I..
0000030: 003b 5ff5 5f0f 0000 00 .;_._....
Bad Request:
> Host: localhost:9000
> Connection: keep-alive
> Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.65 Safari/537.36
> Referer: http://localhost:9000/nodes?query=environment%3D5549b6cbdc023b5e26fe6bd4%20type%3Dnat
> Accept-Language: en-US,en;q=0.8
>
< HTTP/1.1 200 OK
< X-Powered-By: Express
< Last-Modified: Tue, 26 May 2015 11:47:00 GMT
< Content-Description: File Transfer
< Content-Disposition: attachment; filename=test-log.gz
< Content-Type: application/x-gzip
< Content-Transfer-Encoding: binary
< content-length: 57
< Date: Tue, 26 May 2015 11:47:00 GMT
< Connection: keep-alive
0000000: 1ffd 0808 0256 6455 0003 636c 6f75 642d .....VdU..cloud-
0000010: 696e 6974 2d73 6372 6970 742e 6c6f 6700 init-script.log.
0000020: fd54 fd55 4854 fdfd 4f57 48fd fd49 fd02 .T.UHT..OWH..I..
0000030: 003b 5ffd 5f0f 0000 00 .;_._....
The fix turned out to be using
res.end(node.log, 'base64');
instead of
res.send(log);
where node.log is the raw base64-encoded string and log was a Buffer that had decoded that string.
Bearing in mind I am using Node v0.10.38.
I ended up following the function call chain.
// I call
res.send(log);
// ExpressJS calls on http.ServerResponse
this.end(chunk, encoding); // chunk = Buffer, encoding = undefined
// NodeJS http.ServerResponse calls
res.inject(string);
At this point NodeJS appears to be treating the data as a string, which is where the buffer contents were being mangled.
This behaviour was different when the Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8 header was not present; a different end(chunk, encoding) function was called in that case, not using res.inject and not mangling the Buffer data.
I am not entirely sure where the content negotiation is happening and what is swapping in the different res.end functions, whether this is NodeJS or ExpressJS, but it would be nice to be able to control this content negotiation in some simple way.
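Putting the pieces together, a minimal sketch of what the working route could look like under these assumptions (Express 4; node.log holds the base64-encoded gzip data; the route path and the record itself are placeholders, not from the original code):

var express = require('express');
var app = express();

// Placeholder route: in the real app, `node` would come from Mongo via Mongoose.
app.get('/nodes/:id/log', function (req, res) {
    var node = { name: 'test', log: 'H4sIAAAAAAAA' }; // hypothetical base64 payload

    res.setHeader('Content-Disposition', 'attachment; filename=' + node.name + '-log.gz');
    res.setHeader('Content-Type', 'application/x-gzip');

    // res.end writes straight to the underlying http.ServerResponse, skipping
    // the res.send path whose content negotiation mangled the Buffer; the
    // 'base64' encoding makes Node decode the string to raw bytes on the wire.
    res.end(node.log, 'base64');
});

app.listen(9000);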

Ajax response is gzip compressed - Prototype, Firefox can't handle it

I'm trying to query a web service (with JavaScript, prototype). The server responds with XML, but compresses it; headers are set appropriately.
Under Safari 4, everything is fine. The response is decompressed and JavaScript can deal with the data.
Under Firefox 3.5.8, no data is returned to JavaScript!
Code:
var req = new Ajax.Request(this.url, {
    asynchronous: false,
    contentType: 'text/xml',
    method: 'post',
    postBody: xmlString,
    onSuccess: function(t) {
        // debug: place response into textarea to show
        $('responseText').value = t.responseText;
    }
});
This is the response, as I trace it on the network:
HTTP/1.1 200 OK.
Date: Fri, 05 Mar 2010 14:10:51 GMT.
Server: Apache/2.2.9 (Debian) PHP/5.2.6-1+lenny6 with Suhosin-Patch.
X-Powered-By: PHP/5.2.6-1+lenny6.
Vary: Accept-Encoding.
Content-Encoding: gzip.
Content-Length: 2104.
Keep-Alive: timeout=15, max=100.
Connection: Keep-Alive.
Content-Type: text/xml.
.
............]s......W`.3...H&A.$.Q.^[.:....... (and so on ...)
Any idea why this is happening? What can I do about it?
I tried setting the 'Accept-Encoding' header in the request, but I can't get it working properly. Besides, the response can be rather large, meaning it's good that the server compresses it.
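For reference, a sketch of what setting that header looks like with Prototype's requestHeaders option; note that Accept-Encoding is commonly a browser-controlled header for XHR, so the browser may ignore it, which could be why the attempt didn't take effect:

// Sketch only: ask the server for an uncompressed (identity) response.
var req = new Ajax.Request(this.url, {
    asynchronous: false,
    contentType: 'text/xml',
    method: 'post',
    postBody: xmlString,
    requestHeaders: { 'Accept-Encoding': 'identity' },
    onSuccess: function (t) {
        $('responseText').value = t.responseText;
    }
});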
