I'm trying to write a piece of Zap code with Run JavaScript to test the HTTP response headers of a GET to a URL. Specifically, I'm interested in the return status and the Location header (basically, if it's a 302, I want to know what the redirect location is).
fetch('https://www.example.com/', { method: 'GET', redirect: 'manual' })
  .then(function(res) {
    return res.json();
  })
  .then(function(json) {
    var output = {status: json.status, location: json.headers.get('location')};
    callback(null, output);
  })
  .catch(callback);
I've tried the above, but (a) the test always returns rawHTML (which suggests it's following the redirect), and (b) the output variables in the Send Outbound Zap step don't pick up anything useful (again, "Raw HTML", "ID", "Runtime Meta Logs", etc., but nothing about my headers).
You may not be able to access the Location header due to the same-origin policy in most browsers: https://developer.mozilla.org/en-US/docs/Web/Security/Same-origin_policy
Furthermore, you can't stop the AJAX call from following a redirect, so that may cause you issues: How to prevent jQuery ajax from following a redirect after a post?
It looks like you are using the new built-in fetch function: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch
If so, in the provided example, I don't think you need the .json() call. I got the code below to run, but there is no redirect at example.com, so I'm not sure exactly how it will behave in your situation. Also, keep in mind the same-origin policy, which will likely prevent you from accessing the Location header.
var callback = function(a, b) {
  console.log(a, b);
};

fetch('https://www.example.com/', { method: 'GET', redirect: 'manual' })
  .then(function(res) {
    console.log(res);
    var output = {status: res.status, location: res.headers.get('location')};
    callback(null, output);
  })
  .catch(callback);
If you control the server resource, you could possibly do something on the server, like adding another header that won't be blocked; many sites do that by adding an X-Location header that browsers don't block.
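For example, something along these lines on the server (a minimal sketch assuming a Node.js/Express back end, which the post doesn't specify; the route and target URL are made up):

// Hypothetical Express handler: send the 302 but duplicate the target
// into a custom X-Location header the client can read.
const express = require('express');
const app = express();

app.get('/old-path', function (req, res) {
  const target = 'https://www.example.com/new-path';
  // Browsers may also need this to expose the custom header cross-origin.
  res.set('Access-Control-Expose-Headers', 'X-Location');
  res.set('X-Location', target);  // duplicate of the redirect target
  res.redirect(302, target);      // sets Location and the 302 status
});

app.listen(3000);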
Zapier uses node-fetch under the hood (https://github.com/bitinn/node-fetch#options), which defaults to the redirect: 'follow' option. The docs even offer exactly what you want: "set to manual to extract redirect headers".
You might try experimenting in a local Node.js REPL to figure it out. If you see it working locally but not in Zapier, report a bug to contact@zapier.com.
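For example, something like this in a local Node script (a sketch assuming node-fetch v2 is installed locally, since that's the library Zapier wraps):

// npm install node-fetch@2
const fetch = require('node-fetch');

fetch('https://www.example.com/', { method: 'GET', redirect: 'manual' })
  .then(function (res) {
    // With redirect: 'manual', a 301/302 response is returned as-is,
    // so the Location header should be readable here.
    console.log(res.status, res.headers.get('location'));
  })
  .catch(console.error);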
I managed to get this code working:
fetch('https://www.example.com/', { method: 'GET', redirect: 'manual', follow: 0 })
  .then(function(res) {
    var output = {status: res.status, location: res.headers._headers.location};
    callback(null, output);
  })
  .catch(callback);
The underlying issue appears to be (as evidenced by the output variables "id" and "rawHTML") that the fields were somehow "stuck". When I (1) deleted the Run JavaScript step and (2) reinserted a new one with the above code, the correct output fields were returned and subsequently became available to the Send Outbound Email step.
Related
Every time I make a fetch request from my front end using the following JS code, I receive a 400 Bad Request status. The body of the response has an error object saying: "A non-empty request body is required".
When I inspect the request section in my dev tools Network tab, it says "no payload for this request". So it looks to me like it's not sending the body section of my fetch.
It does reach the .then() method afterwards.
This is the TypeScript code:
fetch(`api/person/${person.id}`, {
  method: "PUT",
  headers: {
    "Content-Type": "application/json"
  },
  body: JSON.stringify(person)
})
  .then(() => this.router.navigate(["/"]));
This is the C# backend:
[HttpPut("{id}")]
public IActionResult Put(int id, Person person)
{
    person.Id = id;
    try
    {
        var updatedPerson = _personRepository.Update(person);
        if (updatedPerson != null)
        {
            return Ok(updatedPerson);
        }
        return NotFound();
    }
    catch (ArgumentNullException)
    {
        return BadRequest();
    }
}
Note that the request doesn't even reach this controller. No breakpoints will be hit if I place any here.
This is a single page application I run from Visual Studio 2019.
It works with Postman, however, returning the 200 OK status code and an object, and reaching backend breakpoints. The request URL contains the int id, and the body contains a JSON object.
Okay, this is not a question that has a clear answer.
Here are some steps you can take to find out:
Make sure the person argument in your fetch request actually exists; just do console.log(person) before the fetch.
Make sure the server accepts 'application/json' content. Though in this case the response code should have been different.
Check the response headers, and check whether the origins of the back end and front end are the same. What you need to see are the following:
Access-Control-Allow-Headers: *
Access-Control-Allow-Methods: PUT
Access-Control-Allow-Origin: *
Postman runs as a native app on your PC and sends requests differently from your browser; sometimes this causes unclear results.
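One more note: fetch() resolves even on a 400 (that's why the .then() is still reached), so it's worth checking res.ok and logging the error body. A sketch, reusing the question's own code:

fetch(`api/person/${person.id}`, {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(person)
})
  .then(res => {
    if (!res.ok) {
      // Surface the server's error body instead of navigating away.
      return res.text().then(text => console.error(res.status, text));
    }
    this.router.navigate(["/"]);
  });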
Here's a way to test what is happening.
Go to your main web site (http://localhost:5001 or whatever) in your browser
Open up the browser dev tools (F12 in most browsers)
Copy/paste the following fetch (code below) into your dev console and press Enter
What happens? Do you get data back (printed in the console)?
This will probably get you to the next step you need to take.
fetch("api/person/1")
.then(response => response.json())
.then(data => console.log(data));
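If the GET above works, the same trick with a PUT against the failing endpoint narrows it down further (the body here is a hypothetical Person shape; adjust it to your model):

fetch("api/person/1", {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ id: 1, name: "Test" })  // hypothetical Person fields
})
  .then(response => response.text()
    .then(text => console.log(response.status, text)));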
I would like to make an Axios get request to a private server running Flask.
In case there is an internal error in the back end, it returns a response object and an error code:
response_object = {
    "Success": False,
    'error': err.message
}
return response_object, 400
The served response_object should be accessible to the front end (React.js).
axios
  .get(`http://127.0.0.1:5000/data`, {
    data: null,
    headers: { "Content-Type": "application/json" }
  })
  .then(response => {
    console.log(response);
  })
  .catch(function(error) {
    console.log(error.toJSON());
  });
I would expect the error to include the response object. If the URL is accessed manually in the browser, the error data is visible. If there is no error in the back end, the GET request works properly.
After googling for some time, I found some issues that might relate to the mentioned problem (that is why empty data is passed in the GET request):
https://github.com/axios/axios/issues/86
Please note that I am self-taught, so I might be missing something obvious here. Thank you all and all the best.
I'll copy/paste my comment here so others can easily see the answer to the question.
If it's a 400 status code being thrown (you can confirm by using the Network tab in your browser), it will fall into the catch block. The only concern is the toJSON() call... just do a simple console.log(error.message) to check whether you ever get there.
I'll leave you a simple example so you can see the catch in action:
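Something along these lines (a sketch against the question's endpoint, not necessarily the original snippet):

axios
  .get("http://127.0.0.1:5000/data")
  .then(response => {
    console.log("success:", response.data);
  })
  .catch(error => {
    if (error.response) {
      // The server answered with a non-2xx status (e.g. the Flask 400);
      // the response_object is already parsed into error.response.data.
      console.log(error.response.status, error.response.data);
    } else {
      // No response at all (network failure, CORS block, ...).
      console.log(error.message);
    }
  });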
more information:
in Axios, the response text is in response.data
if you are receiving a JSON, the response.data will be automatically parsed
you do not need to pass data: null as that's the default behavior, and it's not used in a GET call (you won't pass a body in a GET call)
New to VueJS. I have the following code that submits data to the Controller using axios:
SubmitForm: function () {
  axios({
    method: 'post',
    url: '/Home/SubmitedForm',
    data: { "Fields": this.$data }
  }).then(res => {
    alert('Successfully submitted the form ');
    window.close();
  }).catch(err => {
    if (err.response.status == 409) {
      alert(`Already exists. See details: ${err}`)
    }
    else {
      alert(`There was an error submitting your form. See details: ${err}`)
    }
  });
}
When the Controller method SubmittedForm returns 409, I want to show a specific alert, else just show a generic alert. Based on this page: https://gist.github.com/fgilio/230ccd514e9381fafa51608fcf137253 I wrote the above code. However, even though the HTTP status returned is 409, it still shows the generic alert.
I'm pretty sure I'm missing some understanding here. Can someone please help me figure out what I'm doing wrong?
It works as expected on localhost, but after publishing to azurewebsites it again displays the generic error.
Maybe your API endpoint breaks the CORS policy; you can't read the status of such an error then (despite the fact that it is visible in the Network tab in dev tools).
You can install a browser extension like "CORS Everywhere" to test whether it works then, but any call to an API blocked by CORS will show a warning/error in the browser's console by default.
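When that happens, the axios error has no response attached at all, so reading err.response.status itself throws. A defensive catch (a sketch based on the question's handler; formData stands in for this.$data) makes that case visible:

axios.post('/Home/SubmitedForm', { Fields: formData })
  .then(() => alert('Successfully submitted the form'))
  .catch(err => {
    if (!err.response) {
      // No HTTP response at all: network failure or a CORS-blocked request.
      alert(`Request failed before a response arrived: ${err.message}`);
    } else if (err.response.status === 409) {
      alert(`Already exists. See details: ${err}`);
    } else {
      alert(`There was an error submitting your form. See details: ${err}`);
    }
  });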
Probably because err.response.status is a string and you are comparing it to a number.
I'm trying to use a Cloudflare Worker to proxy a POST request to another server.
It is throwing a JS exception; by wrapping the code in a try/catch block I've established that the error is:
TypeError: A request with a one-time-use body (it was initialized from a stream, not a buffer) encountered a redirect requiring the body to be retransmitted. To avoid this error in the future, construct this request from a buffer-like body initializer.
I would have thought this could be solved by simply copying the Response so that it's unused, like so:
return new Response(response.body, { headers: response.headers })
That's not working. What am I missing about streaming vs buffering here?
addEventListener('fetch', event => {
  var url = new URL(event.request.url);
  if (url.pathname.startsWith('/blog') || url.pathname === '/blog') {
    if (reqType === 'POST') {
      event.respondWith(handleBlogPost(event, url));
    } else {
      handleBlog(event, url);
    }
  } else {
    event.respondWith(fetch(event.request));
  }
})

async function handleBlog(event, url) {
  var newBlog = "https://foo.com";
  var originUrl = url.toString().replace(
    'https://www.bar.com/blog', newBlog);
  event.respondWith(fetch(originUrl));
}

async function handleBlogPost(event, url) {
  try {
    var newBlog = "https://foo.com";
    var srcUrl = "https://www.bar.com/blog";
    const init = {
      method: 'POST',
      headers: event.request.headers,
      body: event.request.body
    };
    var originUrl = url.toString().replace(srcUrl, newBlog);
    const response = await fetch(originUrl, init)
    return new Response(response.body, { headers: response.headers })
  } catch (err) {
    // Display the error stack.
    return new Response(err.stack || err)
  }
}
A few issues here.
First, the error message is about the request body, not the response body.
By default, Request and Response objects received from the network have streaming bodies -- request.body and response.body both have type ReadableStream. When you forward them on, the body streams through -- chunks are received from the sender and forwarded to the eventual recipient without keeping a copy locally. Because no copies are kept, the stream can only be sent once.
The problem in your case, though, is that after streaming the request body to the origin server, the origin responded with a 301, 302, 307, or 308 redirect. These redirects require that the client re-transmit the exact same request to the new URL (unlike a 303 redirect, which directs the client to send a GET request to the new URL). But, Cloudflare Workers didn't keep a copy of the request body, so it can't send it again!
You'll notice this problem doesn't happen when you do fetch(event.request), even if the request is a POST. The reason is that event.request's redirect property is set to "manual", meaning that fetch() will not attempt to follow redirects automatically. Instead, fetch() in this case returns the 3xx redirect response itself and lets the application deal with it. If you return that response on to the client browser, the browser will take care of actually following the redirect.
However, in your worker, it appears fetch() is trying to follow the redirect automatically, and producing an error. The reason is that you didn't set the redirect property when you constructed your Request object:
const init = {
  method: 'POST',
  headers: event.request.headers,
  body: event.request.body
};
// ...
await fetch(originUrl, init)
Since init.redirect wasn't set, fetch() uses the default behavior, which is the same as redirect: "follow", i.e. fetch() tries to follow redirects. If you want fetch() to use manual redirect behavior, you could add redirect: "manual" to init. However, it looks like what you're really trying to do here is copy the whole request. In that case, you should just pass event.request in place of the init structure:
// Copy all properties from event.request *except* URL.
await fetch(originUrl, event.request);
This works because a Request has all of the fields that fetch()'s second parameter wants.
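Putting that together with the question's handler, a minimal sketch of handleBlogPost could look like this (URLs copied from the question; the response-copying line is explained in the note on responses below):

async function handleBlogPost(event, url) {
  const newBlog = "https://foo.com";
  const srcUrl = "https://www.bar.com/blog";
  const originUrl = url.toString().replace(srcUrl, newBlog);

  // event.request supplies the method, headers, body, and its
  // redirect: "manual" setting, so any 3xx is returned to the browser.
  const response = await fetch(originUrl, event.request);

  // Pass the old Response as the options object so status, statusText,
  // headers, etc. all carry over.
  return new Response(response.body, response);
}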
What if you want automatic redirects?
If you really do want fetch() to follow the redirect automatically, then you need to make sure that the request body is buffered rather than streamed, so that it can be sent twice. To do this, you will need to read the whole body into a string or ArrayBuffer, then use that, like:
const init = {
  method: 'POST',
  headers: event.request.headers,
  // Buffer whole body so that it can be redirected later.
  body: await event.request.arrayBuffer()
};
// ...
await fetch(originUrl, init)
A note on responses
I would have thought this could be solved by simply copying the Response so that it's unused, like so:
return new Response(response.body, { headers: response.headers })
As described above, the error you're seeing is not related to this code, but I wanted to comment on two issues here anyway to help out.
First, this line of code does not copy all properties of the response. For example, you're missing status and statusText. There are also some more-obscure properties that show up in certain situations (e.g. webSocket, a Cloudflare-specific extension to the spec).
Rather than try to list every property, I again recommend simply passing the old Response object itself as the options structure:
new Response(response.body, response)
The second issue is with your comment about copying. This code copies the Response's metadata, but does not copy the body. That is because response.body is a ReadableStream. This code initializes the new Response object to contain a reference to the same ReadableStream. Once anything reads from that stream, the stream is consumed for both Response objects.
Usually, this is fine, because usually, you only need one copy of the response. Typically you are just going to send it to the client. However, there are a few unusual cases where you might want to send the response to two different places. One example is when using the Cache API to cache a copy of the response. You could accomplish this by reading the whole Response into memory, like we did with requests above. However, for responses of non-trivial size, that could waste memory and add latency (you would have to wait for the entire response before any of it gets sent to the client).
Instead, what you really want to do in these unusual cases is "tee" the stream so that each chunk that comes in from the network is actually written to two different outputs (like the Unix tee command, which comes from the idea of a T junction in a pipe).
// ONLY use this when there are TWO destinations for the
// response body!
new Response(response.body.tee(), response)
Or, as a shortcut (when you don't need to modify any headers), you can write:
// ONLY use this when there are TWO destinations for the
// response body!
response.clone()
Confusingly, response.clone() does something completely different from new Response(response.body, response). response.clone() tees the response body, but keeps the Headers immutable (if they were immutable on the original). new Response(response.body, response) shares a reference to the same body stream, but clones the headers and makes them mutable. I personally find this pretty confusing, but it's what the Fetch API standard specifies.
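A tiny illustration of that difference (assuming response came back from a fetch() call, so its headers started out immutable):

// Shares the same body stream, but the new headers are mutable copies:
let rewritten = new Response(response.body, response);
rewritten.headers.set("X-Debug", "1");  // fine

// Tees the body (both copies stay readable), but the headers remain
// immutable because the original's were:
let copy = response.clone();
// copy.headers.set("X-Debug", "1");    // would throw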
I recently restructured my API on AWS Gateway to make all my Lambda functions use proxy integration - before that, every single parameter was passed in as a path parameter (awful, I know.)
I never had any issues with CORS then, and I've tried several things over the past few hours to fix the issue discussed in the topic line.
First, I used a proxy resource and used an "ANY" method, but when that gave me CORS issues, I enabled CORS on the API method and tried again - still nothing. So, I tried changing it so that it was a "POST" request instead and enabling CORS - still nothing. And I made sure to deploy after every setting change. Then, I got rid of the proxy and instead just made a "POST" method with CORS enabled, and still nothing.
I'm using Angular's http post method.
Edit:
I'm using Angular 1.6.4, and this is the code I'm using to call the API:
this.checkRegistered = function(email) {
  var data = { Email: email };
  var toSend = JSON.stringify(data);
  return $http.post('link', toSend);
};
That's in my service for the angular module, and it's being called from this function in the controller:
function CheckIsRegistered(email)
{
  return userService.checkRegistered(email).then(function(res) {
    if (res.data.statusCode === 200) {
      return res.data.body;
    }
  });
}
I've configured all the parameters so that "Email" is what it should be expecting, and I did replace the word 'link' with the actual link.
When I enable CORS through the console, I assign the headers as follows:
Access-Control-Allow-Headers: 'Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token'
Access-Control-Allow-Origin: '*'
This is especially infuriating because I've actually worked with this exact issue before and solved it fairly easily, but now that I'm using Lambda's proxy integration I've run into this issue again and I can't quite seem to figure it out.
Any help is appreciated.
I figured out the problem - I had to add the headers for CORS to the Lambda response. Here's the code snippet for anyone else having similar problems:
connection.query('arbitrary query', [params], function (error, results, field) {
  if (!error)
  {
    connection.end();
    var responseBody = results;
    var response = {
      statusCode: 200,
      headers: {
        "Access-Control-Allow-Headers": "Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token",
        "Access-Control-Allow-Origin": "*"
      },
      body: JSON.stringify(responseBody)
    };
    callback(null, response);
  }
});
I put the querying line in there just for context of what "response" is.
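For context, that snippet sits inside a standard Node.js Lambda proxy-integration handler; a rough sketch of the surrounding shape (names are assumed, not taken from the original):

// Hypothetical handler skeleton around the query shown above.
exports.handler = function (event, context, callback) {
  // With proxy integration, event.body is the raw JSON string sent by
  // the Angular $http.post call.
  var payload = JSON.parse(event.body || "{}");

  // ...open the database connection, run connection.query(...) as above,
  // and inside its callback build `response` (with the CORS headers)
  // and finish with callback(null, response).
};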