I'm getting a JSON parsing error when I try to fetch data from a server endpoint.
This is the first time Axios has failed to decode a JSON response automatically for me.
While debugging my code, I saw that Axios receives some unexpected characters in the server response that make the JSON invalid:
7F5
{
"message": "OK"
...cut
}
0
Error:
(node:14940) UnhandledPromiseRejectionWarning: SyntaxError: Unexpected token F in JSON at position 1
I suspect it could be a charset encoding problem.
Axios client configuration:
const pclClient = axios.create({
  baseURL: "http://server/endpoint",
  responseType: "json",
  responseEncoding: "utf8",
  headers: {
    Accept: "application/json",
    "Content-Type": "application/json",
    charset: "utf-8"
  }
});
Using tools like Postman or the Advanced Request Client Chrome extension, the problem is not present.
Can someone help me?
The problem comes from the transfer-encoding: chunked response header.
RFC 7230 states that "A recipient MUST be able to parse and decode the chunked transfer coding."
At the moment, Axios doesn't handle chunked responses (transfer-encoding: chunked is not handled for application/json).
To work around the issue, I wrote a chunk parser that uses regular expressions to strip the chunk metadata.
const pclClient = axios.create({
  baseURL: "http://server/",
  responseType: "json",
  headers: {
    Accept: "application/json"
  }
});

const chunksParser = body => {
  return body
    .replace(/^(\w{1,3})\r\n/, "")      // strip the leading chunk-size line
    .replace(/\r\n(\w{1,3})\r\n/g, "")  // strip in-body chunk-size lines (global, in case of several chunks)
    .replace(/(\r\n0\r\n\r\n)$/, "");   // strip the terminating zero-length chunk
};
const getData = async () => {
  const response = await pclClient.get("data.json");
  const { data } = response;
  const body = chunksParser(data);
  const json = JSON.parse(body);
  return json;
};
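For completeness, a minimal usage sketch of the getData helper above (the logging is just illustrative):

getData()
  .then(json => console.log("parsed response:", json))
  .catch(err => console.error("request or JSON parsing failed:", err));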
I was looking for a built-in way to handle this inside Axios; I hope one will be available in the future.
Thanks to the commenters who helped me understand what the problem was.
I am trying to write a low-dependency JavaScript page that shows the temperature of a Raspberry Pi. The server sends JSON as the response to a GET request and the client shows a web page.
The server is working as intended; I have checked it in the browser and with Postman.
const { spawn } = require("child_process");
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/') {
    res.writeHead(200, { 'Content-Type': 'application/json; charset=utf-8' });
    const temp = spawn('cat', ['/sys/class/thermal/thermal_zone0/temp']);
    temp.stdout.on('data', function (data) {
      data = data / 1000;
      console.log('Temperature: ' + data + '°C');
      res.end(JSON.stringify({ "temp": data }));
    });
    temp.stderr.on('data', function (data) {
      res.end(JSON.stringify({ "temp": "Unavailable" }));
    });
  }
  else {
    res.writeHead(404, { 'Content-Type': 'application/json; charset=utf-8' });
    res.end(JSON.stringify({ "temp": "Unavailable" }));
  }
}).listen((process.argv[2] || 64567), () => {
  console.log('Server listening on http://localhost:' + (process.argv[2] || 64567));
});
This is the client-side code:
<body>
  <script defer>
    await fetch('http://localhost:64567', {mode: 'no-cors'}).then(res => JSON.parse(res)).then(temp => document.getElementById("value").innerHTML = temp.temp);
    // SyntaxError: JSON.parse: unexpected character at line 1 column 2 of the JSON data
    /*
    await fetch('http://localhost:64567', {mode: 'no-cors'}).then(res => res.json()).then(temp => document.getElementById("value").innerHTML = temp.temp)
    // SyntaxError: JSON.parse: unexpected end of data at line 1 column 1 of the JSON data
    await fetch('https://api.npoint.io/d5a7160bab77dd869b82').then(res => res.json()).then(temp => document.getElementById("value").innerHTML = temp.temp)
    // This one works
    */
  </script>
  <h3>Temperature:<span id="value"> </span> °C</h3>
</body>
But when I use the Fetch API on the client side, JSON.parse gives two different errors for the two different approaches, while a publicly hosted JSON bin works fine.
Expectation: the JSON being fetched and parsed correctly.
Tried:
Two different approaches to parsing the JSON
Using a different host for the JSON
res is a Response, not the text. Use res.json() to parse the body text as JSON.
fetch('http://localhost:64567').then(res => res.json())
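For example, a minimal sketch with a status check before parsing (the URL and element id are taken from the question):

fetch('http://localhost:64567')
  .then(res => {
    if (!res.ok) throw new Error('HTTP ' + res.status); // bail out on non-2xx responses
    return res.json();                                  // parse the response body as JSON
  })
  .then(temp => { document.getElementById('value').textContent = temp.temp; })
  .catch(err => console.error(err));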
So I figured it out, thanks to @Unmitigated: all I needed to do was set the CORS headers on the server side and remove mode: 'no-cors' on the client side.
Adding headers like these on the server was the fix.
const headers = {
  'Access-Control-Allow-Origin': '*', /* #dev First, read about security */
  'Access-Control-Allow-Methods': 'OPTIONS, GET',
  'Access-Control-Max-Age': 2592000, // 30 days
  'Content-Type': 'application/json; charset=utf-8'
  /** add other headers as per requirement */
};
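For reference, this is roughly how that headers object can be wired into the createServer handler from the question; the OPTIONS branch is a hedged sketch for preflight requests and may not be needed for a plain GET:

// Assumes the `headers` object above and `const http = require('http');`
http.createServer((req, res) => {
  if (req.method === 'OPTIONS') {
    // answer a CORS preflight request, if the browser sends one
    res.writeHead(204, headers);
    res.end();
    return;
  }
  if (req.url === '/') {
    res.writeHead(200, headers);
    // placeholder payload; the real handler reads the sensor as shown above
    res.end(JSON.stringify({ temp: 42.0 }));
  } else {
    res.writeHead(404, headers);
    res.end(JSON.stringify({ temp: 'Unavailable' }));
  }
}).listen(process.argv[2] || 64567);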
I'm trying to stream price data via HTTP (I don't know why they don't use websockets...), and I use axios to make normal REST API requests, but I don't know how to handle 'Transfer-Encoding: chunked' responses.
This code just hangs and doesn't produce any error, so I assume it's working but unable to process the response:
const { data } = await axios.get(`https://stream.example.com`, {
  headers: {
    Authorization: `Bearer ${token}`,
    'Content-Type': 'application/octet-stream'
  }
});
console.log(data); // execution hangs before reaching here
Appreciate your help.
WORKING SOLUTION:
As pointed out in the answer below, we need to add responseType: 'stream' as an axios option and also add an event listener on the response.
Working code:
const response = await axios.get(`https://stream.example.com`, {
  headers: { Authorization: `Bearer ${token}` },
  responseType: 'stream'
});

const stream = response.data;
stream.on('data', data => {
  data = data.toString();
  console.log(data);
});
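If the provider sends newline-delimited JSON (an assumption; the actual wire format depends on the API), a minimal sketch for buffering chunks and parsing complete lines on top of the stream above could look like this:

let buffer = '';
stream.on('data', chunk => {
  buffer += chunk.toString();
  const lines = buffer.split('\n');
  buffer = lines.pop(); // keep the last, possibly incomplete, line for the next chunk
  for (const line of lines) {
    if (!line.trim()) continue;
    try {
      console.log(JSON.parse(line)); // one price update per line (assumed format)
    } catch (err) {
      console.error('could not parse line:', line);
    }
  }
});
stream.on('error', err => console.error('stream error:', err));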
FYI, sending a Content-Type header with a GET request is meaningless. The Content-Type header applies to the BODY of the HTTP request, and there is no body in a GET request.
With the axios() library, if you want direct access to the response stream, use the responseType option to tell Axios that you want the raw stream:
const response = await axios.get('https://stream.example.com', {
  headers: { Authorization: `Bearer ${token}` },
  responseType: 'stream'
});

const stream = response.data;
stream.on('data', data => {
  console.log(data);
});
stream.on('end', () => {
  console.log("stream done");
});
Axios documentation reference here.
I am trying to upload a file from a React front end to a C# backend. I am using dropzone to get the file and then I call an API helper to post the file, but I am getting different errors when I try different things. I am unsure exactly what the headers should be and exactly what I should send, and I get two distinct errors. If I do not set the Content-Type, I get a 415 (Unsupported Media Type) error. If I do specify the Content-Type as multipart/form-data, I get a 500 Internal Server Error; I get the same error when the Content-Type is application/json. The URL is being passed in and I am certain it is correct. I am unsure if the file should be appended as file[0][0] as I have done or as file[0] since it is an array, but I believe it should be the former. Any suggestions welcome :) Here is my API post helper code:
export const uploadAdminFile = (file, path, method = 'POST', resource = config.defaultResource) => {
  const url = createUrl(resource, path);
  const data = new FormData();
  data.append('file', file[0][0]);
  data.append('filename', file[0][0].name);
  const request = accessToken =>
    fetch(url, {
      method,
      mode: 'cors',
      withCredentials: true,
      processData: false,
      headers: {
        Accept: 'application/json',
        'Content-Type': 'application/json', // 'multipart/form-data',
        Authorization: `Bearer ${accessToken}`,
      },
      body: data,
    })
      .then(res => res.json())
      .then(success => console.log('API HELPER: file upload success: ', success))
      .catch(err => console.log('API HELPER: error during file upload: ', err));
  return sendRequest(request, resource);
};
Thanks for the help and suggestions; it turned out to be a backend issue, but even so I learned a lot in the process. I will post my working code here in case anyone comes across this and finds it useful.
export const uploadAdminFile = (file, path, resource = config.defaultResource) => {
  const url = createUrl(resource, path);
  const formData = new FormData();
  formData.append('file', file[0][0]);
  formData.append('filename', file[0][0].name);
  const request = accessToken =>
    fetch(url, {
      method: 'POST',
      headers: {
        Accept: 'application/json',
        Authorization: `Bearer ${accessToken}`,
      },
      body: formData,
    });
  return sendRequest(request, resource);
};
As mentioned, the file name does not need to be sent separately and could be omitted. I am indexing the file this way because I get it from dropzone as an array and I only want a single file (the first one in the array). I hope this helps someone else out; here is a link to the MDN fetch docs (good information) and a good article on using fetch and FormData.
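As a small follow-up, since dropzone hands you an array of accepted files, a hedged sketch of guarding against an empty selection before calling the helper (handleDrop and the upload path are hypothetical names, not part of the original code):

// Hypothetical dropzone callback; only the first accepted file is uploaded
const handleDrop = acceptedFiles => {
  if (!acceptedFiles || acceptedFiles.length === 0) {
    console.warn('no file selected');
    return;
  }
  // wrap the dropzone array so the helper's file[0][0] indexing still works
  uploadAdminFile([acceptedFiles], '/admin/upload'); // path is a placeholder
};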
I'm invoking an authentication service via JavaScript fetch to get an access token. The service is a simple RESTful call. I can see the call is successful using Fiddler (with a 200 response and JSON data). However, the fetch response handlers never seem to get invoked. Below is a snippet:
const AUTHBODY = `grant_type=password&username=${username}&password=${password}&scope=roles offline_access profile`
const AUTHHEADER = new Headers({'Content-Type': 'application/x-www-form-urlencoded'})
const CONFIG = {
  method: 'POST',
  headers: AUTHHEADER,
  body: AUTHBODY
}

fetch('http://localhost:23461/connect/token', CONFIG).then(function(response) {
  console.log('response = ' + response)
  return response.json()
}).then(function(json) {
  console.log('json data = ' + json)
  return json
}).catch(function(error) {
  console.log('error = ' + error)
})
When executing the fetch above, none of the console.logs gets executed... it seems to just hang. But Fiddler tells otherwise. Any ideas?
You have probably run into the CORS origin policy problem. To tackle this you need access to the server side of your API. In particular, you need to add a header in the PHP (or other) server endpoint:
<?php
header('Access-Control-Allow-Origin: *');
// or
header('Access-Control-Allow-Origin: http://example.com');

// Reading JSON POST using PHP
$json = file_get_contents('php://input');
$jsonObj = json_decode($json);

// Use $jsonObj
print_r($jsonObj->message);
...
// End php
?>
Also, make sure NOT to have this in the headers of your server endpoint (it cannot be combined with Access-Control-Allow-Origin: *):
header('Access-Control-Allow-Credentials: true');
A model of working fetch code with a POST request is:
const data = {
  message: 'We send a message to the backend with fetch()'
};

const endpoint = 'http://example.com/php/phpGetPost.php';

fetch(endpoint, {
  method: 'POST',
  body: JSON.stringify(data)
})
  .then((resp) => resp.json())
  .then(function(response) {
    console.info('fetch()', response);
    return response;
  });
I'm attempting to make a batch HTTP request (on the server, with Meteor/HTTP) to Gmail with the following:
batchGetMessages = (accessToken, ids) => {
  let userId = 'me';
  let url = `https://www.googleapis.com/batch`;
  let boundary = `batch_message_request`;
  let body = ``;

  _.each(ids, (id) => {
    body = `${body}
--${boundary}
Content-Type: application/http
GET /gmail/v1/users/${userId}/messages/${id.id}?format=metadata&fields=id%2Cpayload
`
  });

  body = `${body}
--${boundary}--`

  let options = {
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      'Content-Type': `multipart/mixed; boundary="${boundary}"`,
    },
    content: body,
  };

  let data = HTTP.post(url, options);
  console.log('data: ', data);
}
The body string winds up looking like this:
--batch_message_request
Content-Type: application/http
GET /gmail/v1/users/me/messages/15375be281102d3d?format=metadata&fields=id%2Cpayload
--batch_message_request
Content-Type: application/http
GET /gmail/v1/users/me/messages/15366f87db6bdfeb?format=metadata&fields=id%2Cpayload
--batch_message_request
Content-Type: application/http
GET /gmail/v1/users/me/messages/15365d62f152dea2?format=metadata&fields=id%2Cpayload
--batch_message_request--
My request always returns a 400 Bad Request error. I've checked similar questions but haven't been able to get this working yet:
Generating HTTP multipart body for file upload in JavaScript
Gmail REST api batch support for getting messages
Batch request - 400 bad request response
Any help or suggestions will be greatly appreciated. Thanks!
Well, the 400 Bad Request error seems to be caused by the newline whitespace in the string template. Removing the whitespace got everything working:
...
/*
  the lack of whitespace is important in the following string template:
*/
_.each(ids, (id) => {
  body = `${body}
--${boundary}
Content-Type: application/http
GET /gmail/v1/users/${userId}/messages/${id.id}?format=metadata&fields=id%2Cpayload`
});
body = `${body}
--${boundary}--`;
...
There's probably a fancier way to maintain your code indentation and eliminate the whitespace, but I haven't looked for it yet.
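One such way is to build each part in an array and join the pieces explicitly, so the source indentation never leaks into the request body. A sketch based on the variables above (the blank line between the part headers and the embedded request follows the usual multipart convention):

// Build the multipart/mixed body without relying on template-literal line breaks
const parts = ids.map(id => [
  `--${boundary}`,
  'Content-Type: application/http',
  '',
  `GET /gmail/v1/users/${userId}/messages/${id.id}?format=metadata&fields=id%2Cpayload`
].join('\r\n'));

body = parts.join('\r\n') + `\r\n--${boundary}--`;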