JavaScript Fetch: characters with encoding issues

I'm trying to use Fetch to bring some data onto the screen, but some of the characters are showing a weird � sign, which I believe has something to do with the conversion of special characters.
When debugging on the server side, or if I call the servlet directly in my browser, the problem doesn't happen, so I believe the issue is in my JavaScript. See the code below:
var myHeaders = new Headers();
myHeaders.append('Content-Type', 'text/plain; charset=UTF-8');

fetch('getrastreiojadlog?cod=10082551688295', { headers: myHeaders })
  .then(function (response) {
    return response.text();
  })
  .then(function (resp) {
    console.log(resp);
  });
I think it is probably some small detail, but I haven't managed to find out what is happening, so any tips are welcome.
Thanks!

The response's text() method always decodes the payload as UTF-8.
If you want the text in another charset, you can use TextDecoder to convert the response buffer (NOT the text) into text decoded with the charset of your choice.
Using your example, it should be:
var myHeaders = new Headers();
myHeaders.append('Content-Type', 'text/plain; charset=UTF-8');

fetch('getrastreiojadlog?cod=10082551688295', { headers: myHeaders })
  .then(function (response) {
    return response.arrayBuffer();
  })
  .then(function (buffer) {
    const decoder = new TextDecoder('iso-8859-1');
    const text = decoder.decode(buffer);
    console.log(text);
  });
Notice that I'm using iso-8859-1 as the decoder charset.
Credits: Schneide Blog
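Rather than hard-coding the charset, you could also read it from the response's own Content-Type header. A sketch along those lines (the helper names and the utf-8 fallback are my own, not from the original answer):

```javascript
// Extract the charset label from a Content-Type header value,
// falling back to utf-8 when none is declared.
function charsetFromContentType(contentType) {
  // e.g. "text/plain; charset=ISO-8859-1" -> "iso-8859-1"
  const match = /charset=([^;]+)/i.exec(contentType || '');
  return match ? match[1].trim().toLowerCase() : 'utf-8';
}

// Fetch a URL and decode the body with whatever charset the server declared.
function fetchDecoded(url) {
  return fetch(url).then(function (response) {
    const charset = charsetFromContentType(response.headers.get('Content-Type'));
    return response.arrayBuffer().then(function (buffer) {
      return new TextDecoder(charset).decode(buffer);
    });
  });
}
```

This way the client stays correct even if the server later switches to a different encoding.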

Maybe your server isn't returning a UTF-8 encoded response. Try to find out which charset it uses and then set it in the call headers.
Maybe ISO-8859-1:
myHeaders.append('Content-Type','text/plain; charset=ISO-8859-1');

As it turns out, the problem was in how the servlet was serving the data, without explicitly declaring the encoding type on the response.
By adding the following line in the Java servlet:
response.setContentType("text/html;charset=UTF-8");
it was possible to get the characters in the right format.

Related

Reactjs | How to make boundary static when content type is multipart/form-data

We are facing an issue while making a file-upload POST call using React. In dev-tools, under form data, we see a browser-generated boundary:
------WebKitFormBoundarypSTl3xdAHAJgTN8A
Because of this random boundary, we get an error when calling a third-party API.
Is there any way to give this boundary a fixed value? Something like this:
----somefixedvalue
Here is the JS code:
function doupload() {
  let data = document.getElementById("file").files[0];
  console.log('doupload', data);

  let formData = new FormData();
  formData.append("file", data);

  fetch('http://localhost:8081/upload/multipart', {
    method: 'POST',
    body: formData,
    headers: {
      'Content-Type': 'multipart/form-data; boundary=----somefixedboundary'
    }
  }).then(res => {
    for (const header of res.headers) {
      console.log(`resHeaderName: ${header[0]}, Value: ${header[1]}`);
    }
  });

  alert('your file has been uploaded');
  location.reload();
}
Can someone help me solve this? I'm quite confused, since I have set the content type along with a static boundary, but no luck.
You can convert the file to a binary string and prepare the body yourself instead of using FormData, as shown in this answer.
But keep in mind that if this ----somefixedvalue substring appears in the file content, it will be treated as a boundary, causing a body parsing error on the receiver's side. With FormData, the browser takes care of this and prevents it.
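A minimal sketch of preparing such a body by hand, so the boundary is a fixed value; the field name, file name, and boundary string below are just examples, and a real implementation would also have to guard against the boundary occurring in the file bytes:

```javascript
// Build a multipart/form-data body with a fixed boundary.
// Each part: --boundary, headers, blank line, content; final line: --boundary--.
function buildMultipartBody(boundary, fieldName, fileName, fileContent) {
  return (
    '--' + boundary + '\r\n' +
    'Content-Disposition: form-data; name="' + fieldName + '"; filename="' + fileName + '"\r\n' +
    'Content-Type: application/octet-stream\r\n' +
    '\r\n' +
    fileContent + '\r\n' +
    '--' + boundary + '--\r\n'
  );
}

const boundary = '----somefixedboundary';
const body = buildMultipartBody(boundary, 'file', 'report.txt', 'hello world');

// The boundary in the header must match the one used in the body exactly:
// fetch('http://localhost:8081/upload/multipart', {
//   method: 'POST',
//   headers: { 'Content-Type': 'multipart/form-data; boundary=' + boundary },
//   body: body
// });
```

The trade-off is exactly the one described above: you take over the escaping duties the browser normally handles for you.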

Sending and receiving plain text instead of an object with Node http

I'm incredibly new to anything relating to web servers and the like, and I've gotten stuck trying to send data from a script that runs on a website to a server that runs locally. I've been able to get them to communicate, but all the data I get on the server is always just "[object Object]" instead of the string that I send from the browser.
This is how the server looks currently, very barebones:
var http = require("http");

http.createServer(function (request, response) {
  response.writeHead(200, { "Content-Type": "text/plain" });
  console.log("request received: " + response + request);
}).listen(8001);
And this is what I send from the browser:
var url = "http://localhost:8001";
$.ajax({
  url: url,
  type: "POST",
  data: "Hello friend",
  contentType: "text/plain",
});
I've also tried something like this, which results in the same problem:
var http = new XMLHttpRequest();
var sendData = "HELLO";

http.open("POST", "http://localhost:8001", true);
http.setRequestHeader("Content-Type", "text/plain");
http.send(sendData);
I've also tried to use JSON.stringify on the server-side to try to get the string, but this returns an error stating that it cannot be used on a 'circular object'.
I'm sorry if this question is really stupid, as I said, I'm a complete beginner and this is the first time I've tried to do something remotely similar. I've tried researching what to do differently, but after trying countless options I felt that I needed to ask here instead.
Thanks in advance! And if any more information is needed I'd be happy to try to expand on the issue!
When you use the concatenation (+) operator, JavaScript converts the object to a string. The default string representation of an object is [object Object]; that's why it prints [object Object].
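This default conversion can be seen in isolation (the req object here is just an illustration, not Node's actual request object):

```javascript
// String concatenation triggers the default object-to-string conversion.
const req = { method: 'POST' };

console.log('request received: ' + req); // request received: [object Object]
console.log(String({}));                 // [object Object]
```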
http.createServer(function (request, response) {
  response.writeHead(200, { "Content-Type": "text/plain" });
  console.log("request received:", request, response);
}).listen(8001);
And if you want to get the body, you have to listen to the data event and consume the chunks.
http
  .createServer(function (request, response) {
    let body = "";
    request.on("data", chunk => {
      body += chunk.toString(); // convert Buffer to string
    });
    request.on("end", () => {
      console.log(body);
      response.end('ok');
    });
  })
  .listen(8001);
I would recommend using the express package to avoid all this boilerplate code.
You can access the body with request.body (e.g. when using express with a body-parsing middleware). You can't JSON.stringify the whole request because, as you saw, it holds circular references, but you can do so with request.body.

Is there a difference between text/plain and string?

I am trying to send a PUT request to an older Java back end. The path on the back end is
@PUT
@Path("/foo/bar")
@Consumes("text/plain")
public String someFunction(String ExpectedArgument) {
    // Unrelated logic
}
I'm trying to send a string from the front end using JavaScript and Axios.
let someString = 'Example String'
axios.put('/foo/bar', someString).then(resp => console.log(resp))
Unfortunately, when I try this, I receive an HTTP 415 error for a bad content type. Reviewing other, successful PUT requests I've made, the only difference I've found is that this one has the @Consumes("text/plain") annotation. I can only conclude that there's some difference between what the Java expects as text/plain and what I'm providing with a JavaScript string.
I would like to know what about my string is causing it to be rejected, and how I can edit my code so that the back end will accept it.
const headers = {
  'Content-Type': 'text/plain',
};

let someString = 'Example String';
axios.put('/foo/bar', someString, {
  headers: headers
}).then(resp => console.log(resp));

How can I compress JSON and uncompress on PHP?

BTW, this is not a duplicate: I'm not trying to compress in PHP, I'm trying to compress client-side and decompress in PHP.
I'm trying to compress a JSON array containing 5 base64 images and some text and send it to my PHP API.
I have tried lz-string and pako, and it appears to send the compressed payload as expected, but I'm having issues decompressing on the PHP back end.
TypeScript (gzip):
var payload = pako.gzip(JSON.stringify(data), { to: 'string' });

let headers = new Headers({ "Content-Encoding": "gzip" });
headers.append('Content-Type', 'application/json; charset=x-user-defined-binary');
let requestOptions = new RequestOptions({ headers: headers });

var url = this.baseUrl + "/app/notice/issue/compressed";
return this.http.post(url, payload, requestOptions).pipe(
  timeout(100000),
  map(res => res.json()),
  catchError(err => throwError(err.message))
);
PHP
$input = file_get_contents("php://input");
$test = gzdecode($input);
echo $test;
Output
"
<div style="border:1px solid #990000;padding-left:20px;margin:0 0 10px 0;">
<h4>A PHP Error was encountered</h4>
<p>Severity: Warning</p>
<p>Message: gzdecode(): data error</p>
</div>"
If anyone can help in reducing the size of my JSON, that would be really helpful.
Thanks.
Your PHP example looks fine and should work.
I'm not sure about the TypeScript part: the binary string may be additionally encoded (as UTF-8, maybe) before being sent by the browser or the http.post() method.
Check the Content-Length sent with the request against the actual length of the binary string after compression. They obviously must be the same.
I suggest removing the { to: 'string' } option in the pako.gzip() call to get a Uint8Array instead of a string, and using XMLHttpRequest instead of http.post().
See also:
Angular Post Binary Data
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Sending_and_Receiving_Binary_Data

Request returning unicode replacement character

Using the request module to load a webpage, I notice that for the UK pound symbol £ I sometimes get back the Unicode replacement character \uFFFD.
An example URL that I'm parsing is this Amazon UK page: http://www.amazon.co.uk/gp/product/B00R3P1NSI/ref=s9_newr_gw_d38_g351_i2?pf_rd_m=A3P5ROKL5A1OLE&pf_rd_s=center-2&pf_rd_r=0Q529EEEZWKPCVQBRHT9&pf_rd_t=101&pf_rd_p=455333147&pf_rd_i=468294
I'm also using the iconv-lite module to decode using the charset returned in the response header:
request(urlEntry.url, function (err, response, html) {
  const contType = response.headers['content-type'];
  const charset = contType.substring(contType.indexOf('charset=') + 8, contType.length);
  const encBody = iconv.decode(html, charset);
  ...
But this doesn't seem to be helping. I've also tried decoding the response HTML as UTF-8.
How can I avoid this Unicode replacement char?
Firstly, the Amazon webpage is encoded in ISO-8859-1, not UTF-8; this is what causes the Unicode replacement character. You can check this in the response headers (I used curl -i).
Secondly, the README for request says:
encoding - Encoding to be used on setEncoding of response data. If
null, the body is returned as a Buffer. Anything else (including the
default value of undefined) will be passed as the encoding parameter
to toString() (meaning this is effectively utf8 by default).
It is UTF-8 by default... and (after a little experimentation) we find that, sadly, it doesn't support ISO-8859-1. However, if we set the encoding to null, we can then decode the resulting Buffer using iconv-lite.
Here is a sample program.
var request = require('request');
var iconvlite = require('iconv-lite');

var url = "http://www.amazon.co.uk/gp/product/B00R3P1NSI/ref=s9_newr_gw_d38_g351_i2?pf_rd_m=A3P5ROKL5A1OLE&pf_rd_s=center-2&pf_rd_r=0Q529EEEZWKPCVQBRHT9&pf_rd_t=101&pf_rd_p=455333147&pf_rd_i=468294";

request({ url: url, encoding: null }, function (error, response, body) {
  if (!error && response.statusCode == 200) {
    var encoding = 'ISO-8859-1';
    var content = iconvlite.decode(body, encoding);
    console.log(content);
  }
});
This question is somewhat related, and I used it whilst figuring this out:
http.get and ISO-8859-1 encoded responses
