I'm trying to handle the response of a CGI script (located on a remote machine) in a PHP-generated HTML page on an Apache server that I'm working on.
A little background first: upon accessing the web page, the user is asked to log in with a username and password (htaccess). Upon login, the username and the user's site code in the organization are determined. For example, let:
username: user
sitecode: IN88
I'm handling the CGI call in an HTML form as follows:
<form method="POST"
      action="http://path/scriptName.cgi?userid=user&siteid=IN88&type=desktop">
    <input type="submit" value="Create Account">
    <div id="result"></div>
</form>
And handling the response as follows (the script returns simple text output: SUCCESS if the account is created, or SKIPPED if the user's account already exists):
<script>
$(function() {
    $('form').submit(function(event) {
        event.preventDefault();
        $.ajax({
            url: 'http://path/script.cgi?userid=user&siteid=IN88&type=desktop',
            type: 'POST',
            dataType: 'text',
            beforeSend: function() {
                $('#result').html("Sending Request to Server ...");
            },
            success: function(response) {
                console.log('Success response: ' + response);
                var text = "Account Registration Status: " + response;
                $('#result').html(text);
            },
            error: function(response) {
                console.log('Error response: ' + response);
                $('#result').html("Account Registration Status: " + response);
            }
        });
    });
});
</script>
The error that I'm getting is:
XMLHttpRequest cannot load http://path/script.cgi?userid=user&siteid=IN88&type=desktop. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'myMachineName' is therefore not allowed access.
Now from the error it is clear to me that there is some access issue: the server machine where the CGI script is located is not granting access to HTTP requests from the originating machine. I tried adding the following header to the AJAX request:
'Access-Control-Allow-Origin': '*'
But this didn't work either. Any help would be greatly appreciated.
Thanks.
Your URL is incorrect. http://path/script is going to try to hit a server named path. Since your AJAX code was loaded from your own server (e.g. example.com), you're effectively making a cross-origin request, and unless the other server opts in with CORS headers, the browser will block your script from reading the response.
For same-origin AJAX requests, your url can literally be just url: '/path/script?....', and the browser will fill in the http://example.com automatically, exactly as if you had <img src="kittens.jpg"> in your HTML.
That or you add in the name of your server, so it's url: 'http://example.com/path/etc...'
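To see concretely how the browser resolves a relative url against the page's origin, here's a small sketch using the standard URL constructor (https://example.com/app/page.html is a stand-in for whatever page loaded your script):

```javascript
// Sketch: how a relative url resolves against the page's origin.
// 'https://example.com/app/page.html' is a stand-in for the page that loaded the script.
const pageUrl = 'https://example.com/app/page.html';
const resolved = new URL('/path/script.cgi?userid=user&siteid=IN88&type=desktop', pageUrl);
console.log(resolved.href);
// -> https://example.com/path/script.cgi?userid=user&siteid=IN88&type=desktop
```

A leading slash makes the path absolute on the same host, which is exactly what keeps the request same-origin.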
Related
I have just started working with React and have run into an issue on some client machines. When posting data from React (with axios or AJAX) to a PHP file, session_id() resets and regenerates on every request/call, and the session variables reset too. I have tried some solutions from Stack Overflow but nothing worked. I've seen this from my localhost and some other systems, while it works fine on others: it works on roughly half the machines and fails on the other half.
But if I do the same thing without React, from a plain HTML file with AJAX to the PHP file, the session id does not reset or regenerate.
PHP FILE
<?php
session_start();
$session_val = session_id();
header("Access-Control-Allow-Origin: *");
echo $session_val;
?>
AJAX CALL FROM REACT JS
$.ajax({
    url: "http://localhost/test.php",
    type: "POST",
    data: {
        action: action,
    },
    success: function (data) {
    },
    error: function (jqXHR, text, errorThrown) {
        console.log(jqXHR + " " + text + " " + errorThrown);
    },
});
AXIOS CALL
axios
    .post('http://localhost/test.php', this.state)
    .then(response => {
        console.log(response.data);
    })
    .catch(error => {
        console.log(error);
    });
Looks like your browser isn't sending the Cookie header with the AJAX request (browsers do this by default, but I can't figure out why it isn't being sent here).
When your PHP server sees that there's no Cookie header, it assumes that the SESSIONID is not set, creates a new one, and instructs your client to store the new SESSIONID in its cookies. Since this new SESSIONID is not sent with the next request, your server again assumes the same thing and again creates a new SESSIONID. This process happens every time.
To learn more about how cookies work, you can refer to Using HTTP Cookies.
You can manually get your cookies and add them as a header to your AJAX request in jQuery like so:
$.ajax({
    url: "http://localhost/test.php",
    type: "POST",
    headers: {"Cookie": document.cookie},
    data: {
        action: action,
    },
    success: function (data) {
    },
    error: function (jqXHR, text, errorThrown) {
        console.log(jqXHR + " " + text + " " + errorThrown);
    },
});
The headers property of the AJAX settings object adds extra headers to the HTTP request. Here we get the cookies the browser has stored via document.cookie and add them to the Cookie request header.
Let me know if this solves your issue.
As per Shreyas Sreenivas's comment, I used xhrFields: { withCredentials: true } for the AJAX call and
axios.post('https://example.com/file.php', this.state, {withCredentials: true});
for axios, and it worked. However, a Cross-Origin Request Blocked error now occurs, which still needs to be fixed.
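One common cause of that remaining CORS error is that credentialed requests cannot be answered with Access-Control-Allow-Origin: *; the server has to echo back the specific origin and also send Access-Control-Allow-Credentials: true. A sketch of that server-side decision (the function name and allow-list are made up for illustration):

```javascript
// Sketch (hypothetical helper): the headers a server must send so that a
// withCredentials request is accepted. With credentials, browsers reject
// the wildcard '*'; the specific origin must be echoed back.
function corsHeadersFor(requestOrigin, allowedOrigins) {
  if (!allowedOrigins.includes(requestOrigin)) {
    return null; // origin not on the allow-list: send no CORS headers
  }
  return {
    'Access-Control-Allow-Origin': requestOrigin,
    'Access-Control-Allow-Credentials': 'true'
  };
}

console.log(corsHeadersFor('http://localhost:3000', ['http://localhost:3000']));
```

In the PHP file above, that would mean replacing the `*` header with the specific origin plus the credentials header.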
My objective is to check from the client side whether a URL is valid or not. I tried the following things:
1. An AJAX request with dataType set to JSON - got the Cross-Origin Request Blocked error.
2. JSONP as the dataType - worked fine for some websites like google.com, but it complained for others like facebook.com, with an error like:
"Refused to execute script from 'FaceBook callback=jQuery32107833494968122849_1505110738710&_=1505110738711' because its MIME type ('text/html') is not executable, and strict MIME type checking is enabled."
Is there any workaround for this? I just want to make sure that the URL is valid, irrespective of the content in the response.
Following is the code I wrote:
<html>
<body>
<script
src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js">
</script>
<script>
function CallPageMethod() {
$.ajax({
type: "GET",
url: "https://www.google.com/",
dataType: "jsonp",
success: function (data, textStatus, xhr) {
alert("Success");
},
error: function (data, textStatus, xhr) {
if (data.status === 200) {
alert("Finally I am done")
} else {
alert("Error");
}
},
});
}
</script>
<button onclick="CallPageMethod()">Test URL</button>
</body>
</html>
Any Suggestions or any alternative approach that I should follow to resolve this issue?
Not directly, but most sites have a favicon.ico, either served by the site itself or provided by the site's hosting company (as a 404 image).
<img src="https://www.google.com/favicon.ico"
onload="alert('icon loaded')">
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"
onload="alert('ajax loaded')"></script>
Although iframe and object do have onload events, invalid pages also trigger the event.
This would be the fastest site test I can think of ...
var img = new Image();
img.onload = function () {
    alert("image width is " + img.naturalWidth + " not zero so site is valid");
};
img.onerror = function () {
    alert("favicon failed to load, so the site is probably invalid");
};
img.src = "https://www.google.com/favicon.ico";
As for Facebook, each page uses resources from another URL, and iframes are blocked as well as scripts. You would need to make the request from a server to test whether a page exists.
You're best off writing a proxy on your server so:
Client hits your server with the URL you want to check
Your server makes the request to that URL and gets a response (or not)
Server returns status code to the client
This way will avoid the CORS issues you're having to navigate and will allow you to set any HTTP headers you need to.
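A sketch of the proxy's core check, written so the HTTP call can be stubbed out in tests (the function name and the /check?url=... endpoint it would sit behind are assumptions; in real use you'd pass in Node 18+'s global fetch):

```javascript
// Sketch of the server-side check behind a hypothetical /check?url=... endpoint.
// A fetch implementation is passed in so the network call can be stubbed.
async function checkUrl(fetchImpl, url) {
  try {
    const res = await fetchImpl(url, { method: 'HEAD' });
    // Treat 2xx/3xx as "valid"; tune to taste (e.g. count 401/403 as "exists").
    return { valid: res.status >= 200 && res.status < 400, status: res.status };
  } catch (e) {
    return { valid: false, status: 0 }; // DNS failure, refused connection, etc.
  }
}

// Usage with a stubbed fetch:
checkUrl(async () => ({ status: 200 }), 'https://example.com')
  .then(result => console.log(result)); // { valid: true, status: 200 }
```

The client then only ever talks to your own origin, so no CORS headers are needed at all.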
$.ajax({
    type: "POST",
    url: "/someurl",
    data: $("#send_application_number").serialize(),
    success: function(response, textStatus, request) {
        // responseType = request.getResponseHeader('Location')
        $("#captchapopup").foundation("reveal", "open");
        $("#viewstate").val(response['viewstate']);
        $("#eventvalidate").val(response['eventvalidate']);
        $("#appl_num").val(response['appl_num']);
        // loc = getcaptchaimage(response['viewstate'], response['eventvalidate'])
        loc = '/anotherurl?viewstate=' + response['viewstate'] + '&eventvalidate=' + response['eventvalidate'];
        $("#captchaimage").attr('src', loc);
    },
    error: function() {
        alert("Could not get response to session login request");
    }
});
As you can see, this is a script for opening a modal pop-up that displays a captcha. But I may also want to redirect to another URL instead of displaying the modal, which is why I sent a 302 redirect (with a URL, of course) from the back end, assuming it would redirect automatically. It did, but the rest of the script executed as well, which brought it back to the modal.
How do I achieve an if-else kind of redirect, so that if 'Location' is present, it redirects and does not execute the rest of the script?
AJAX is a request in the background, which means you cannot redirect the "foreground" (the browser UI) from there via an HTTP redirect; the XHR follows the 302 itself and hands your script the final response.
Use location.href = "…" to redirect from within your success handler. (Have your server send it a value that it can base that decision on.)
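The branch itself can be kept simple by separating the decision from the jQuery plumbing. All names below are illustrative: X-Redirect-To stands in for whatever header or response field your backend chooses to send in a normal 200 response, since a real 302 is followed by the XHR transparently.

```javascript
// Illustrative sketch: decide between redirecting and showing the modal,
// based on a value the server sends in a 200 response.
function handleResponse(redirectTarget, redirect, showModal) {
  if (redirectTarget) {
    redirect(redirectTarget); // e.g. function (url) { location.href = url; }
    return 'redirected';      // nothing else runs
  }
  showModal(); // e.g. open the captcha modal as in the question
  return 'modal';
}

// Wired into the success handler it would look roughly like:
// var target = request.getResponseHeader('X-Redirect-To');
// handleResponse(target,
//     function (url) { location.href = url; },
//     function () { $("#captchapopup").foundation("reveal", "open"); });
```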
I am trying to create a user login/signup on my remote app that accesses user data on Salesforce. I am doing this from javascript via Salesforce REST API. This was not working using requests straight from javascript due to CORS restrictions, so I found this example. My code is as follows:
var result = sforce.connection.login('example@provider.com', 'pass' + 'securityToken');
sforce.connection.init(result.sessionId, 'https://login.salesforce.com/services/oauth2/token');
sforce.connection.remoteFunction({
    url: 'https://login.salesforce.com/services/oauth2/token',
    requestHeaders: {
        "Authorization": "Bearer " + __sfdcSessionId,
        "Content-Type": "application/json",
        "Connection": "Keep-Alive"
    },
    method: "GET",
    onSuccess: function(response) {
        console.log("Success " + response);
    },
    onFailure: function(response) {
        console.log("Failed " + response);
    }
});
When I run this code I get the following errors:
1) Refused to set unsafe header "User-Agent"
2) POST http:///services/Soap/u/31.0 404 (Not Found)
3) Remote invocation failed, due to:
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /services/Soap/u/31.0 was not found on this server.</p>
</body></html>
status code:
Using a code editor I can see that the errors occur in the sforce.connection.login() call, and execution never reaches the sforce.connection.init() call in my code.
How do I resolve this issue so that I may log a user in from my remote web app and gain access to the user information within salesforce?
It seems your issue is like the one in this post.
XMLHttpRequest isn't allowed to set these headers, they are being set automatically by the browser. The reason is that by manipulating these headers you might be able to trick the server into accepting a second request through the same connection, one that wouldn't go through the usual security checks - that would be a security vulnerability in the browser.
How can I insert parsed HTML content into my web page if I only have a link to the other web page (i.e. I need to get the HTML content from that page)? I am using an AJAX call and getting an error; the code is below. The browser is not the issue.
I want it as on Facebook, but not in PHP.
<script>
jQuery.support.cors = true;
$.ajax({
    type: "GET",
    url: "http://www.hotscripts.com/forums/javascript/3875-how-read-web-page-content-variable.html",
    dataType: "html",
    crossDomain: true,
    beforeSend: function(xhr) {
        xhr.overrideMimeType('text/plain; charset=UTF-8');
    },
    success: function(data) {
        alert(data);
        $("body").html(data);
    },
    error: function(errorStatus, xhr) {
        alert("Error", errorStatus, xhr);
    }
});
</script>
Maybe your browser doesn't support CORS (http://caniuse.com/cors), or the other domain doesn't send back the Access-Control-Allow-Origin header.
Thanks for the comments made by Quentin.
Another option is to use JSONP.
<script>
var query = "http://query.yahooapis.com/v1/public/yql?q=" +
    encodeURIComponent("SELECT * FROM html WHERE url = 'http://www.hotscripts.com/forums/javascript/3875-how-read-web-page-content-variable.html'") +
    "&format=json";
$.ajax({
    type: "GET",
    url: query,
    crossDomain: true,
    beforeSend: function(xhr) {
        xhr.overrideMimeType('text/plain; charset=UTF-8');
    },
    success: function(data) {
        alert(data);
    },
    error: function(errorStatus, xhr) {
        alert("Error", errorStatus, xhr);
    }
});
</script>
Third option is to make a request through a proxy script located on your domain.
In reply to:
"i have used jQuery.support.cors=true; still there is a prob"
As I have said, the server has to return the necessary headers.
If the other side does not allow it, nothing can be done.
Check this:
http://en.wikipedia.org/wiki/Cross-origin_resource_sharing
To initiate a cross-origin request, a browser sends the request with an Origin HTTP header. The value of this header is the domain that served the page. For example, suppose a page from http://www.example-social-network.com attempts to access a user's data in online-personal-calendar.com. If the user's browser implements CORS, the following request header would be sent to online-personal-calendar.com:
Origin: http://www.example-social-network.com
If online-personal-calendar.com allows the request, it sends an Access-Control-Allow-Origin header in its response. The value of the header indicates what origin sites are allowed. For example, a response to the previous request would contain the following:
Access-Control-Allow-Origin: http://www.example-social-network.com
If the server does not allow the cross-origin request, the browser will deliver an error to the example-social-network.com page instead of the online-personal-calendar.com response.
To allow access from all domains, a server can send the following response header:
Access-Control-Allow-Origin: *
This is generally not appropriate. The only case where this is appropriate is when a page or API response is considered completely public content and it is intended to be accessible to everyone, including any code on any site.
The value "*" is special in that it does not allow requests to supply credentials: HTTP authentication, client-side SSL certificates, and cookies are all disallowed.