xhr caching values from getResponseHeader? - javascript

I'm running up against a very frustrating bug. I'm not exactly sure what is happening, but I think XHR is doing some kind of caching of the response headers.
My app is using devise_token_auth for the backend authentication service. We're using it with rotating access-tokens, and so I have written a function that runs after every request.
function storeAndGetResponseHeaders(xhr) {
  const headersObj = {};
  headerKeys.filter((key) => xhr.getResponseHeader(key))
    .forEach((key) => {
      headersObj[key] = xhr.getResponseHeader(key);
      window.sessionStorage.setItem(key, xhr.getResponseHeader(key));
    });
  return headersObj;
}
where headerKeys is ['access-token', 'client', 'expiry', 'uid', 'token-type']. So for any response that has these headers, the function should save them into sessionStorage and return them in an object, which gets stored within the AJAX service that I wrote and added to every request. We're using rxjs, and this service is just a thin wrapper around it. This is what RxAjax.ajax looks like:
ajax(urlOrRequest) {
  const request = typeof urlOrRequest === 'string' ? { url: urlOrRequest } : urlOrRequest;
  request.headers = Object.assign({}, this.headers, urlOrRequest.headers);
  request.url = `${this.baseUrl}${request.url}`;
  return Observable.ajax(request).map(this.afterRequest, this);
}
where this.headers is the stored headers from the last request (or the headers loaded from sessionStorage). this.afterRequest is what sets the headers from the response xhr.
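For reference, afterRequest presumably looks something like the following (a reconstructed sketch based on the description above, not the actual implementation):

afterRequest(ajaxResponse) {
  // ajaxResponse.xhr is the underlying XMLHttpRequest from Observable.ajax.
  // Merge any rotated auth headers into the stored headers for the next request.
  this.headers = Object.assign({}, this.headers, storeAndGetResponseHeaders(ajaxResponse.xhr));
  return ajaxResponse;
}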
My problem is that I'm getting bad values into my headers object (specifically, old access tokens). What I've noticed is that when I add a logging statement for headersObj after assignment, it sometimes contains old response headers from a past request. However, when I look at the request itself in the dev console's Network tab, it doesn't show any of the auth headers in the response headers ('access-token', 'client', etc.). This gets fixed for a little while if I do a hard refresh in the browser, but then comes back, seemingly inexplicably.
Note we're using rxjs to make our requests, which might be relevant (but I don't think it is the cause of this problem, as I'm trying to read the headers from the original xmlhttprequest object). Thanks!

As Barmar suggested in the comments, it was a caching issue. There may be a bug in the Chrome console where it doesn't show the cached headers that were on the cached response; so even though it looked like there were no auth headers, there really were.
It looks like if you're using jQuery you can add the option cache: false to the request to prevent caching. Because I'm not, the first thing I did was try appending ?cache=${new Date().toJSON()} to each request, which successfully busted the cache and fixed my problem (appending a unique query parameter is what cache: false does in jQuery).
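In the RxAjax wrapper above, the cache-buster can be appended centrally; a minimal sketch (the separator check guards URLs that already carry a query string):

ajax(urlOrRequest) {
  const request = typeof urlOrRequest === 'string' ? { url: urlOrRequest } : urlOrRequest;
  request.headers = Object.assign({}, this.headers, urlOrRequest.headers);
  // Append a unique query parameter so the browser can't serve a cached response.
  const separator = request.url.includes('?') ? '&' : '?';
  request.url = `${this.baseUrl}${request.url}${separator}cache=${new Date().toJSON()}`;
  return Observable.ajax(request).map(this.afterRequest, this);
}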
Our backend is in Rails, so I ended up adding
before_action :set_cache_headers

...

private

def set_cache_headers
  response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
end
to my application controller. Now no requests are cached by the browser. I'm not sure if this will be our long-term solution or not.

Related

Cloudflare Worker TypeError: One-time-use body

I'm trying to use a Cloudflare Worker to proxy a POST request to another server.
It is throwing a JS exception; by wrapping it in a try/catch block I've established that the error is:
TypeError: A request with a one-time-use body (it was initialized from a stream, not a buffer) encountered a redirect requiring the body to be retransmitted. To avoid this error in the future, construct this request from a buffer-like body initializer.
I would have thought this could be solved by simply copying the Response so that it's unused, like so:
return new Response(response.body, { headers: response.headers })
That's not working. What am I missing about streaming vs buffering here?
addEventListener('fetch', event => {
  var url = new URL(event.request.url);
  var reqType = event.request.method;
  if (url.pathname.startsWith('/blog') || url.pathname === '/blog') {
    if (reqType === 'POST') {
      event.respondWith(handleBlogPost(event, url));
    } else {
      handleBlog(event, url);
    }
  } else {
    event.respondWith(fetch(event.request));
  }
})
async function handleBlog(event, url) {
  var newBlog = "https://foo.com";
  var originUrl = url.toString().replace(
    'https://www.bar.com/blog', newBlog);
  event.respondWith(fetch(originUrl));
}
async function handleBlogPost(event, url) {
  try {
    var newBlog = "https://foo.com";
    var srcUrl = "https://www.bar.com/blog";
    const init = {
      method: 'POST',
      headers: event.request.headers,
      body: event.request.body
    };
    var originUrl = url.toString().replace(srcUrl, newBlog);
    const response = await fetch(originUrl, init)
    return new Response(response.body, { headers: response.headers })
  } catch (err) {
    // Display the error stack.
    return new Response(err.stack || err)
  }
}
A few issues here.
First, the error message is about the request body, not the response body.
By default, Request and Response objects received from the network have streaming bodies -- request.body and response.body both have type ReadableStream. When you forward them on, the body streams through -- chunks are received from the sender and forwarded to the eventual recipient without keeping a copy locally. Because no copies are kept, the stream can only be sent once.
The problem in your case, though, is that after streaming the request body to the origin server, the origin responded with a 301, 302, 307, or 308 redirect. These redirects require that the client re-transmit the exact same request to the new URL (unlike a 303 redirect, which directs the client to send a GET request to the new URL). But, Cloudflare Workers didn't keep a copy of the request body, so it can't send it again!
You'll notice this problem doesn't happen when you do fetch(event.request), even if the request is a POST. The reason is that event.request's redirect property is set to "manual", meaning that fetch() will not attempt to follow redirects automatically. Instead, fetch() in this case returns the 3xx redirect response itself and lets the application deal with it. If you return that response on to the client browser, the browser will take care of actually following the redirect.
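For illustration, here is a minimal sketch of forwarding the POST with manual redirect handling, so any 3xx response is passed back to the browser to follow (reusing originUrl from the question):

// Sketch: forward the request but return 3xx responses as-is
// instead of trying to follow them inside the worker.
const response = await fetch(originUrl, {
  method: 'POST',
  headers: event.request.headers,
  body: event.request.body,
  redirect: 'manual'
});
return response; // the browser handles any redirect itself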
However, in your worker, it appears fetch() is trying to follow the redirect automatically, and producing an error. The reason is that you didn't set the redirect property when you constructed your Request object:
const init = {
  method: 'POST',
  headers: event.request.headers,
  body: event.request.body
};
// ...
await fetch(originUrl, init)
Since init.redirect wasn't set, fetch() uses the default behavior, which is the same as redirect: "follow", i.e. fetch() tries to follow redirects automatically. If you want fetch() to use manual redirect behavior, you could add redirect: "manual" to init. However, it looks like what you're really trying to do here is copy the whole request. In that case, you should just pass event.request in place of the init structure:
// Copy all properties from event.request *except* URL.
await fetch(originUrl, event.request);
This works because a Request has all of the fields that fetch()'s second parameter wants.
What if you want automatic redirects?
If you really do want fetch() to follow the redirect automatically, then you need to make sure that the request body is buffered rather than streamed, so that it can be sent twice. To do this, you will need to read the whole body into a string or ArrayBuffer, then use that, like:
const init = {
  method: 'POST',
  headers: event.request.headers,
  // Buffer whole body so that it can be redirected later.
  body: await event.request.arrayBuffer()
};
// ...
await fetch(originUrl, init)
A note on responses
I would have thought this could be solved by simply copying the Response so that it's unused, like so:
return new Response(response.body, { headers: response.headers })
As described above, the error you're seeing is not related to this code, but I wanted to comment on two issues here anyway to help out.
First, this line of code does not copy all properties of the response. For example, you're missing status and statusText. There are also some more-obscure properties that show up in certain situations (e.g. webSocket, a Cloudflare-specific extension to the spec).
Rather than try to list every property, I again recommend simply passing the old Response object itself as the options structure:
new Response(response.body, response)
The second issue is with your comment about copying. This code copies the Response's metadata, but does not copy the body, because response.body is a ReadableStream. This code initializes the new Response object to contain a reference to the same ReadableStream. Once anything reads from that stream, it is consumed for both Response objects.
Usually, this is fine, because usually, you only need one copy of the response. Typically you are just going to send it to the client. However, there are a few unusual cases where you might want to send the response to two different places. One example is when using the Cache API to cache a copy of the response. You could accomplish this by reading the whole Response into memory, like we did with requests above. However, for responses of non-trivial size, that could waste memory and add latency (you would have to wait for the entire response before any of it gets sent to the client).
Instead, what you really want to do in these unusual cases is "tee" the stream so that each chunk that comes in from the network is actually written to two different outputs (like the Unix tee command, which comes from the idea of a T junction in a pipe).
// ONLY use this when there are TWO destinations for the
// response body! tee() returns an array of two streams,
// one for each destination.
let [body1, body2] = response.body.tee();
let responseForFirstUse = new Response(body1, response);
let responseForSecondUse = new Response(body2, response);
Or, as a shortcut (when you don't need to modify any headers), you can write:
// ONLY use this when there are TWO destinations for the
// response body!
response.clone()
Confusingly, response.clone() does something completely different from new Response(response.body, response). response.clone() tees the response body, but keeps the Headers immutable (if they were immutable on the original). new Response(response.body, response) shares a reference to the same body stream, but clones the headers and makes them mutable. I personally find this pretty confusing, but it's what the Fetch API standard specifies.
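To make the two-destination case concrete, here is a sketch that streams a response to the client while caching a copy, using the Cloudflare-specific caches.default (it assumes event and a cacheable request are in scope):

// Sketch: tee the body so one stream goes to the client and
// one goes into the cache, without buffering the whole response.
const [bodyForClient, bodyForCache] = response.body.tee();
event.waitUntil(
  caches.default.put(event.request, new Response(bodyForCache, response))
);
return new Response(bodyForClient, response);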

How to send client side cookies (javascript) to server side (node.js) using Microsoft Bot Framework Directline API? [duplicate]

I am working on an internal web application at work. In IE10 the requests work fine, but in Chrome all the AJAX requests (of which there are many) are sent using OPTIONS instead of whatever defined method I give it. Technically my requests are "cross domain": the site is served on localhost:6120 and the service I'm making AJAX requests to is on 57124. This closed jQuery bug describes the issue, but not a real fix.
What can I do to use the proper http method in ajax requests?
Edit:
This is in the document load of every page:
jQuery.support.cors = true;
And every AJAX is built similarly:
var url = 'http://localhost:57124/My/Rest/Call';
$.ajax({
  url: url,
  dataType: "json",
  data: json,
  async: true,
  cache: false,
  timeout: 30000,
  headers: { "x-li-format": "json", "X-UserName": userName },
  success: function (data) {
    // my success stuff
  },
  error: function (request, status, error) {
    // my error stuff
  },
  type: "POST"
});
Chrome is preflighting the request to look for CORS headers. If the request is acceptable, it will then send the real request. If you're doing this cross-domain, you will simply have to deal with it or else find a way to make the request non-cross-domain. This is why the jQuery bug was closed as won't-fix. This is by design.
Unlike simple requests (discussed above), "preflighted" requests first send an HTTP request by the OPTIONS method to the resource on the other domain, in order to determine whether the actual request is safe to send. Cross-site requests are preflighted like this since they may have implications for user data. In particular, a request is preflighted if:
- It uses methods other than GET, HEAD or POST. Also, if POST is used to send request data with a Content-Type other than application/x-www-form-urlencoded, multipart/form-data, or text/plain, e.g. if the POST request sends an XML payload to the server using application/xml or text/xml, then the request is preflighted.
- It sets custom headers in the request (e.g. the request uses a header such as X-PINGOTHER).
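If you control the service on port 57124, the preflight has to be answered there before Chrome will send the real request. A minimal sketch with Node's built-in http module (the header names are taken from the question; adjust the origin and headers for your stack):

const http = require('http');

http.createServer(function (req, res) {
  // Answer the CORS preflight that Chrome sends before the real request.
  if (req.method === 'OPTIONS') {
    res.writeHead(204, {
      'Access-Control-Allow-Origin': 'http://localhost:6120',
      'Access-Control-Allow-Methods': 'GET, POST',
      'Access-Control-Allow-Headers': 'Content-Type, x-li-format, X-UserName'
    });
    return res.end();
  }
  // ... handle the real request, echoing Access-Control-Allow-Origin ...
}).listen(57124);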
Because the page is served from localhost:6120 and the request targets localhost:57124, the ports differ, so this Ajax call is automatically considered a cross-origin (CORS) request; in other words, the browser automatically issues an OPTIONS request that checks for CORS headers on the server's/servlet's side.
This happens even if you set
crossDomain: false
or omit the option entirely.
The reason is simply that localhost != localhost:57124. Try sending the request to localhost without the port: it will fail because the target won't be reachable, but notice that if the origins are identical, the request is sent without an OPTIONS request preceding the POST.
I agree with Kevin B; the bug report says it all. It sounds like you are trying to make cross-domain Ajax calls. If you're not familiar with the same-origin policy, you can start here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Same_origin_policy_for_JavaScript.
If this is not intended to be a cross-domain Ajax call, try making your target URL relative and see if the problem goes away. If you're really desperate, look into JSONP, but beware: mayhem lurks. There really isn't much more we can do to help you.
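For instance, if the service could be hosted on the same origin as the page, a relative URL would avoid CORS (and the preflight) entirely; a sketch:

// Same-origin variant: no scheme, host, or port, so no CORS preflight.
var url = '/My/Rest/Call';
$.ajax({ url: url, type: "POST", dataType: "json", data: json });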
If possible, pass the params through regular GET/POST with a different name and let your server-side code handle it.
I had a similar issue with my own proxy to bypass CORS, and I got the same POST → OPTIONS error in Chrome. In my case it was the Authorization header ("x-li-format" and "X-UserName" here in your case). I ended up passing it as a dummy GET parameter (e.g. AuthorizationJack), and I changed my proxy code to turn that parameter back into a header when making the call to the destination. Here it is in PHP:
if (isset($_GET['AuthorizationJack'])) {
  $request_headers[] = "Authorization: Basic ".$_GET['AuthorizationJack'];
}
In my case I'm calling an API hosted by AWS (API Gateway). The error happened when I tried to call the API from a domain other than the API's own domain. Since I'm the API owner, I enabled CORS for the test environment, as described in the Amazon documentation.
In production this error will not happen, since the request and the API will be on the same domain.
I hope it helps!
As answered by #Dark Falcon, I simply dealt with it.
In my case, I am using a node.js server and creating a session if one does not exist. Since the OPTIONS request does not carry the session details, every POST ended up creating a new session.
So in my create-session-if-not-exists routine, I just added a check: if the method is OPTIONS, skip the session-creation part:
app.use(function(req, res, next) {
  if (req.method !== "OPTIONS") {
    if (req.session && req.session.id) {
      // Session exists
      next();
    } else {
      // Create session
      next();
    }
  } else {
    // If the request method is OPTIONS, skip session creation and move on.
    next();
  }
});
"preflighted" requests first send an HTTP request by the OPTIONS method to the resource on the other domain, in order to determine whether the actual request is safe to send. Cross-site requests
https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS
Consider using axios:
axios.get(url, { headers: { "Content-Type": "application/json" } })
  .then(res => {
    if (res.data.error) {
      doAnythingError(res.data.error); // handle an application-level error
    } else {
      doAnything(res.data);
    }
  })
  .catch(function (error) {
    doAnythingError(error);
  });
I had this issue using fetch and axios worked perfectly.
I've encountered a very similar issue. I spent almost half a day trying to understand why everything worked correctly in Firefox but failed in Chrome. In my case it was because of duplicated (or maybe mistyped) fields in my request header.
Use fetch instead of XHR; then the request will not be preflighted, even if it's cross-domain.
$.ajax({
  url: '###',
  contentType: 'text/plain; charset=utf-8',
  async: false,
  crossDomain: true, // a top-level jQuery option, not an xhrField
  xhrFields: {
    withCredentials: true
  },
  headers: {
    // Authorization is a request header, so it belongs here.
    Authorization: "Bearer ...."
  },
  method: 'POST',
  data: JSON.stringify(request),
  success: function (data) {
    console.log(data);
  }
});
The contentType: 'text/plain; charset=utf-8' (or just contentType: 'text/plain') works for me!
Regards!

Can’t access cross-origin response header from frontend JavaScript

I am building a simple web app using ReactJS and create-react-app.
I have a backend API set up on Heroku where I can make POST requests. Everything works fine, except:
When I make a POST request using the fetch API, the response is 100% correct but it only gives me 2 standard headers. I want to get my custom header. I have added an expose header in my response, and here's the plot twist: when I view the headers from the Chrome inspection tool or Postman (an API tool), it shows all the headers, including my custom one. Here is the fetch code I'm using:
fetch(theUrl, {
  method: 'POST',
  body: JSON.stringify({
    "placeholder": "placeholder"
  })
})
.then(function(res) {
  console.log(res.headers.get('CUSTOM_HEADER_NAME'));
})
If it makes any difference, this fetch method is called from a function outside the main body of the ReactJS component.
The name of the custom header is Image-Identification-Path, and my response includes an Access-Control-Expose-Headers header for Image-Identification-Path.
Summary: How do I get my custom header using fetch?
You must configure the server to which the request is sent, such that its response has an Access-Control-Expose-Headers header that has the name of your custom response header.
Otherwise, if your browser doesn’t see the name of your custom header in that Access-Control-Expose-Headers header, it won’t let you access the value of your custom header.
In such a case it’s expected that you’d still be able to see the custom header if you look at the response in Postman or even in your browser devtools.
But just because the browser gets the custom header in the response doesn’t mean the browser will expose it to your frontend JavaScript code.
For cross-origin requests, browsers will only expose that custom response header to your frontend code if that header name is in the Access-Control-Expose-Headers value.
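For example, if the Heroku backend happened to be an Express app (an assumption; the equivalent header can be set in any framework), the configuration might look like this:

// Sketch for an Express backend: allow frontend JavaScript to
// read the custom response header on cross-origin responses.
app.use(function (req, res, next) {
  res.set('Access-Control-Expose-Headers', 'Image-Identification-Path');
  next();
});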
I know this question is old but I ran into this problem yesterday, and the given answer didn't work for me.
The solution I found was given in this article. Basically:
You can’t directly access the headers on the response to a fetch call - you have to iterate through after using the entries() method on the headers.
So, in your particular case you should be able to achieve the goal by using this code:
fetch(theUrl, {
  method: 'POST',
  body: JSON.stringify({
    "placeholder": "placeholder"
  })
})
.then(response => {
  for (var pair of response.headers.entries()) { // accessing the entries
    if (pair[0] === 'custom-header-name') { // the key you're looking for; entries() yields lowercase names, so in your case image-identification-path
      let imagePath = pair[1]; // saving that value
    }
  }
  // ...
})

Multiple 'Cookie' headers in a node.js request

I've seen how to make a request with a single cookie, and I've seen how to write a response with multiple cookies, but does anyone know how to write a request in node.js using the http module (if possible) with multiple 'Cookie' headers?
So far the only ways I've seen to make a request in node.js involve passing an object as the parameter to a function, which would require having two identical keys.
headers = {
  Cookie: firstCookie,
  Cookie: secondCookie
}
so this wouldn't work.
This is a node.js question, but I'm not extremely confident with HTTP, so I'm not sure whether there is a way to set two distinct cookies in the header. Is it possible to concatenate the two into a single header? Would a request with two separately defined cookies differ from one with a single header containing both?
The 'Cookie' property you added is a direct header in your HTTP request.
You should use only one 'Cookie' header and encode your cookies properly into one valid cookie header string, like this:
var headers = {
  Cookie: 'key1=value1; key2=value2'
}
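For example, if firstCookie and secondCookie are strings of the form 'name=value', they can be folded into one header like this:

// Combine two cookie pairs into a single valid Cookie header.
var headers = {
  Cookie: [firstCookie, secondCookie].join('; ')
};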
Also, instead of using node.js's native HTTP client, which usually makes you write lots of boilerplate code, I would recommend a much simpler library like Requestify.
This is how you can make an HTTP request with cookies using requestify:
var requestify = require('requestify');
requestify.get('http://example.com/api/resource', {
  cookies: {
    'key1': 'val1',
    'key2': 'val2',
  }
})
.then(function(response) {
  // Get the response body (JSON parsed or jQuery object for XMLs)
  response.getBody();
});
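For completeness, the same single-header approach works with node's built-in http module (a sketch; the host and path are placeholders):

var http = require('http');

var req = http.request({
  host: 'example.com',
  path: '/api/resource',
  headers: {
    // Two cookies folded into one valid Cookie header.
    Cookie: 'key1=value1; key2=value2'
  }
}, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () { console.log(res.statusCode, body); });
});

req.end();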

Adding HTTP Basic Authentication Header to Backbone.js Sync Function Prevents Model from Being Updated on Save()

I'm working on a web application that is powered by a restful API written with Python's CherryPy framework. I started out writing the user interface with a combination of jQuery and server side templates, but eventually switched to Backbone.js because the jQuery was getting out of hand.
Unfortunately, I'm having some problems getting my models to sync with the server. Here's a quick example from my code:
$(function() {
  var User = Backbone.Model.extend({
    defaults: {
      id: null,
      username: null,
      token: null,
      token_expires: null,
      created: null
    },
    url: function() {
      return '/api/users';
    },
    parse: function(response, options) {
      console.log(response.id);
      console.log(response.username);
      console.log(response.token);
      console.log(response.created);
      return response;
    }
  });
  var u = new User();
  u.save({'username':'asdf', 'token':'asdf'}, {
    wait: true,
    success: function(model, response) {
      console.log(model.get('id'));
      console.log(model.get('username'));
      console.log(model.get('token'));
      console.log(model.get('created'));
    }
  });
});
As you can probably tell, the idea here is to register a new user with the service. When I call u.save();, Backbone does indeed send a POST request to the server. Here are the relevant bits:
Request:
Request URL: http://localhost:8080/api/users
Request Method: POST
Request Body: {"username":"asdf","token":"asdf","id":null,"token_expires":null,"created":null}
Response:
Status Code: HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 109
Response Body: {"username": "asdf", "created": "2013-02-07T13:11:09.811507", "token": null, "id": 14, "token_expires": null}
As you can see, the server successfully processes the request and sends back an id and a value for created. But for some reason, when my code calls console.log(u.id);, I get null, and when my code calls console.log(u.created);, I get undefined.
tl;dr: Why isn't Backbone.js persisting changes to my objects after a call to save()?
Edit:
I've modified the above code so that the model properties are accessed using the get function in a success callback. This should solve any concurrency problems with the original code.
I've also added some console logging in the model's parse function. Oddly enough, each of these is undefined... Does that mean that Backbone.js is failing to parse my response JSON?
Edit 2:
A few days ago, I found out that issue was actually a custom header that I was adding to every request to enable HTTP Basic Authentication. See this answer for details.
This code:
u.save();
console.log(u.id);
console.log(u.username);
console.log(u.token);
console.log(u.created);
Runs immediately; after it finishes there is nothing left to run, and the queued Ajax request begins. The response then comes a bit later, and only at that point do the values change.
It also seems that those properties are not directly on the object, but the asynchronous processing of the save still holds: you wouldn't get the expected results even if you corrected that code to use console.log(u.get("id")) etc.
I figured out the issue, although I'm still at a loss to explain why it's an issue at all. The web app that I'm building has an authentication process that requires an HTTP basic Authentication header to be passed with all requests.
In order to make this work with Backbone, I overrode the Backbone.sync function and changed line 1398 to add the header.
Original Code:
var params = {type: type, dataType: 'json'};
Modified Code:
var params = {
  type: type,
  dataType: 'json',
  headers: {
    'Authorization': getAuthHash()
  }
};
The function getAuthHash() just returns a Base64 string that represents the appropriate authentication information.
For some reason, the addition of this header makes the sync/save functions fail. If I take it out, everything works as you might expect it to.
The bounty is still open, and I'll happily reward it to anybody who can explain why this is, as well as provide a solution to the problem.
Edit:
It looks like the problem was the way I was adding the header to the request. There's a nice little JavaScript library available on GitHub that solves this problem by correctly adding the HTTP Basic Auth header.
I have tested your code; it works fine for me.
See the demo here: jsfiddle.net/yxkUD/
Try adding a custom beforeSend method to the Ajax request to add the custom header.
For example:
https://github.com/documentcloud/backbone/blob/master/backbone.js#L1424
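A sketch of that approach, reusing getAuthHash() from the question (assuming it returns the full credential string that was previously passed in the headers map):

// Inside the overridden Backbone.sync: set the header on the XHR
// itself via beforeSend instead of passing a headers option to $.ajax.
var params = {
  type: type,
  dataType: 'json',
  beforeSend: function (xhr) {
    xhr.setRequestHeader('Authorization', getAuthHash());
  }
};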
