Check a file on a different server - javascript

I want my user to download a file which my script generates and put it on their server (this part has already been built successfully). The goal is to verify that the user is able to upload files to the website they claim to own. I will be checking the root of the website, so an example would be http://www.google.com/file.
I then want my script to check whether the file is present on their server. I figured I could use some JavaScript to check whether the user's domain combined with the file path returns any HTTP response other than 404.
So I looked around on the internet and tried a few things. Here is the resulting function:
/* DUMMY example: send a HEAD request and report the status code */
var url = 'http://www.google.com/';
var xhr = new XMLHttpRequest();
xhr.open("HEAD", url, true);
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4) { // only report once the request has completed
        alert("HTTP Status Code: " + xhr.status);
    }
};
xhr.send(null);
The URL I used should exist, so this should result in a 200 (or something else indicating that it exists). However, for most URLs I get a status of 0 and the following error: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost' is therefore not allowed access.
Could anyone help me out with my script?

I would suggest using PHP to check this instead. If you are wondering about the error you are getting, read about CORS.
Here is a simple example:
$file = 'http://www.domain.com/somefile.jpg';
$file_headers = @get_headers($file); // @ suppresses the warning when the host cannot be reached
if ($file_headers[0] == 'HTTP/1.1 404 Not Found') {
    $exists = false;
} else {
    $exists = true;
}
From here: http://www.php.net/manual/en/function.file-exists.php#75064

You need Cross-Origin Resource Sharing (CORS) enabled on the destination server (such as google.com, where your file is).
To prevent cross-site attacks, your JavaScript cannot read responses from just any foreign server. It can only do so when the destination server is one you control and you explicitly add configuration there to allow requests from your client's origin.
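Purely as an illustration (and not something you can do to a server like google.com that you don't control): if the destination server were one you own, say a small Node.js service, the opt-in boils down to a single response header, roughly like this:
// Hypothetical Node.js server showing the CORS opt-in described above.
const http = require('http');

http.createServer(function (req, res) {
    // Allow this specific origin to read the response; use '*' only if anyone may read it.
    res.setHeader('Access-Control-Allow-Origin', 'http://localhost');
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('file found');
}).listen(8080);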

I would suggest (because it is the most portable solution) putting a proxy script on your server. Something along the lines of:
<?php
$url = filter_var($_GET['url'], FILTER_VALIDATE_URL);
if ($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: we only need the status code
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the remote body
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    echo json_encode(array('success' => 1, 'status' => $code));
} else {
    echo json_encode(array('success' => 0, 'status' => 0));
}
You can then use XMLHttpRequest and JSON.parse() on the JavaScript side to analyse the result. You can also extend the script to return additional data about the remote server, which could always be useful.
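A minimal client-side sketch of that, assuming the proxy above is saved as proxy.php on your own server (the file name and the checked URL are just examples):
// Ask our own proxy (same origin, so no CORS problem) whether the remote file exists.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'proxy.php?url=' + encodeURIComponent('http://www.example.com/file'), true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var result = JSON.parse(xhr.responseText);
        // A remote status of 200 means the file is reachable; 404 (or 0) means it is not.
        alert(result.success && result.status === 200 ? 'File exists' : 'File not found');
    }
};
xhr.send(null);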

Related

How to validate google reCaptcha on server side?

I implemented reCAPTCHA. After the "I am not a robot" checkbox is clicked, a token is generated by Google.
Client side (JS):
function checkCaptchaAndSubscribe(thisContext)
{
var captchaResponse = grecaptcha.getResponse();
if (captchaResponse == "") {
$captchaRequired.css('display', 'block');
return false;
}
grecaptcha.reset();
$captchaRequired.css('display', 'none');
jQuery.ajax({
url: "/black_newsletter2go/index/verify",
method: "POST",
async: "true",
data: {
recaptchaResponse: captchaResponse
},
success: function(response) {
$statusContainer.show();
if (response != "success") {
$status.html("<h2 class='nl2go_h2'>Die Captcha Validierung ist fehlgeschlagen!</h2>");
return false;
}
subscribe(thisContext);
}
});
}
I send the token to my server by using ajax and validate it there like this:
Server side (php):
public function verifyAction()
{
$captchaResponse = $this->getRequest()->getParam('recaptchaResponse');
if (!isset($captchaResponse) || empty($captchaResponse)) {
return "captcha response is empty";
}
$secretKey = Mage::Helper("recaptcha")->getSecretKey();
$url = 'https://www.google.com/recaptcha/api/siteverify';
$data = array(
'secret' => $secretKey,
'response' => $captchaResponse,
);
// use key 'http' even if you send the request to https://...
$options = array(
'http' => array(
'header' => "Content-type: application/x-www-form-urlencoded\r\n",
'method' => 'POST',
'content' => http_build_query($data)
)
);
$context = stream_context_create($options);
$result = file_get_contents($url, false, $context);
$result = json_decode($result);
//var_dump($result);
//exit();
if ($result->success
&& (strpos(Mage::getBaseUrl(), $result->hostname) !== false)) {
echo "success";
} else {
echo "fail";
}
}
The object $result holds the decoded response from Google's siteverify endpoint (fields such as success and hostname, which the code above checks).
The action returns success if the checks were successful, otherwise fail.
But is this enough? What if an attacker uses an HTTP proxy like Burp Suite to change the response to success? Then he could bypass my checks and always get through. Or am I wrong?
It uses a key pair to encrypt/decrypt the info, so the info is sent encrypted. That's why it can't be tampered with, but of course, that means you must make sure the private key does not get stolen.
The server knows and saves the state in its own storage, so if the client tries to claim "success" when the result was "fail", the server will know, no matter what. So a hacker changing the value on the client side is not likely to achieve much; it will depend on your code, of course. If you are using that reCAPTCHA to log the user in, then obviously the login attempt will fail on the server side if the reCAPTCHA returned "fail". So whether the client is told "success" or not, it still won't be logged in. The client should never be the keeper of such a state, since it can't be trusted (it can always have tainted data).
It works in a way similar to what you'd do between a browser and a server using HTTPS.
The communication between your client and your server should also be on HTTPS to avoid some easier man in the middle (MITM) problems. However, it is always possible to have someone who becomes a proxy, which is how most MITM work, and in that case, whatever you're doing can be changed by the MITM.
The one thing the MITM can't do is create a valid certificate for the final destination, however. In that sense, there is some protection, but many people don't verify certificates each time they connect to a website. One technique MITMs have used is to not give the client HTTPS at all: only the MITM and your server use HTTPS, and the client remains on HTTP. Although your code could detect that, obviously the MITM can also change that code. Similarly, setting a cookie with HttpOnly and Secure can enhance security, but that too can be intercepted by a MITM.
Since the MITM can completely change your scripts, there is pretty much nothing you can do on the client's side that would help detect such a problem and on the server side, you will receive hits that look like what the client sent to you. So again, no real way to detect a MITM.
There is a post that was asking that very question: could I detect an MITM from the server side? It's not impossible, but it's rather tricky. There are solutions being put in place by new implementations/extensions to the normal HTTP solution, but those require an additional application that connects to a different system and there is no reason why such could not also be proxied by a MITM once enough people use such solutions.
The result comes from a URL that is owned by Google.
If the user tampered with what is being sent to your PHP script, then the Google web service will return a failure and you won't let such a request through.
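To make the "server is the keeper of the state" point concrete, here is a rough sketch of the same server-side check in Node.js (a hypothetical stack, not the Magento code from the question); the protected action only runs when Google's siteverify endpoint confirms the token, so whatever a proxy shows the browser changes nothing:
// Rough Node.js sketch (uses the global fetch available in Node 18+).
// SECRET_KEY and subscribe() are placeholders, not part of the original code.
async function verifyAndSubscribe(recaptchaResponse, email) {
    const params = new URLSearchParams({
        secret: process.env.SECRET_KEY, // the secret key never leaves the server
        response: recaptchaResponse,    // the token posted by the browser
    });
    const res = await fetch('https://www.google.com/recaptcha/api/siteverify', {
        method: 'POST',
        body: params,
    });
    const result = await res.json();
    if (!result.success) {
        return 'fail';      // the client cannot forge this: the call is server-to-server
    }
    await subscribe(email); // the protected action happens here, on the server
    return 'success';
}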

How to get HTTP header (Content-Type) from URL with Javascript

Good afternoon! Here is what I am trying to achieve.
I have an input which allows the user to enter a URL (mostly for images), but it can also be a different type of file. I am searching for a way to verify that the URL exists and also to get the MIME type.
Here is a jsfiddle of my javascript tests.
I found a way to do it using a PHP and AJAX with a function like that:
PHP:
function get_url_content_type( $url ) {
$header = get_headers( $url, 1 );
if ( isset( $header['Content-Type'] ) ) {
return $header['Content-Type'];
}
}
I am not sure that this is the right way to do it; does anyone have better ideas?
Many thanks!
jQuery will already convert the response based on the content type (check this URL for details) if you don't specify a type on the $.ajax() call.
So all you can do is check the type of the generated response, to deduce the Content-Type that was sent:
$.get("myPage.html", { }, function(data) {
if(typeof(data) === "string") {
//html
} else {
//JSON
}
});
In this case, it doesn't work because Google explicitly disables cross-origin requests on its hosted images. If you run your tests with Firebug enabled, you will get the message:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading
the remote resource at https://www.google.com/images/srpr/logo11w.png.
This can be fixed by moving the resource to the same domain or
enabling CORS.
However, enabling CORS needs some work on the server side, which I don't think Google is willing to do.
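If the remote server does allow cross-origin requests (or the file lives on your own domain), you can read the header directly from a HEAD request; a small sketch, with a placeholder URL:
// Issue a HEAD request and read the Content-Type header (only works when CORS permits it).
var xhr = new XMLHttpRequest();
xhr.open('HEAD', 'https://your-own-domain.example/image.png', true); // placeholder URL
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
        if (xhr.status === 200) {
            alert('MIME type: ' + xhr.getResponseHeader('Content-Type'));
        } else {
            alert('URL not reachable, status ' + xhr.status);
        }
    }
};
xhr.send(null);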
EDIT:
If you want to generate a client-side preview of some files, try using something like:
https://github.com/markserbol/urlive

JavaScript - XMLHttpRequest, Access-Control-Allow-Origin errors

I'm attempting to send an XMLHttpRequest to a paste site. I'm sending an object containing all the fields that the API requires, but I keep getting this error. I have read up on it, and I thought:
httpReq.setRequestHeader('Access-Control-Allow-Headers', '*');
would fix it, but it didn't. Does anyone have any information on this error and/or how I can fix it?
Here is my code:
(function () {
'use strict';
var httpReq = new XMLHttpRequest();
var url = 'http://paste.ee/api';
var fields = 'key=public&description=test&paste=this is a test paste&format=JSON';
var fields2 = {key: 'public', description: 'test', paste: 'this is a test paste', format: 'JSON'};
httpReq.open('POST', url, true);
console.log('good');
httpReq.setRequestHeader('Access-Control-Allow-Headers', '*');
httpReq.setRequestHeader('Content-type', 'application/ecmascript');
httpReq.setRequestHeader('Access-Control-Allow-Origin', '*');
console.log('ok');
httpReq.onreadystatechange = function () {
console.log('test');
if (httpReq.readyState === 4 && httpReq.status === 'success') {
console.log('test');
alert(httpReq.responseText);
}
};
httpReq.send(fields2);
}());
And here is the exact console output:
good
ok
Failed to load resource: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://127.0.0.1:40217' is therefore not allowed access. http://paste.ee/api
XMLHttpRequest cannot load http://paste.ee/api. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://127.0.0.1:40217' is therefore not allowed access. index.html:1
test
Here is the console output when I test it locally on a regular Chromium browser:
good
ok
XMLHttpRequest cannot load http://paste.ee/api. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'null' is therefore not allowed access. index.html:1
test
I think you've missed the point of access control.
A quick recap on why CORS exists:
Since JS code from a website can execute XHR, that site could potentially send requests to other sites, masquerading as you and exploiting the trust those sites have in you (e.g. if you are logged in, a malicious site could attempt to extract information or execute actions you never wanted) - this is called a CSRF attack. To prevent that, web browsers have very stringent limitations on what XHR you can send - you are generally limited to just your own domain, and so on.
Now, sometimes it's useful for a site to allow other sites to contact it - sites that provide APIs or services, like the one you're trying to access, would be prime candidates. CORS was developed to allow site A (e.g. paste.ee) to say "I trust site B, so you can send XHR from it to me". This is specified by site A sending "Access-Control-Allow-Origin" headers in its responses.
In your specific case, it seems that paste.ee doesn't bother to use CORS. Your best bet is to contact the site owner and find out why, if you want to use paste.ee with a browser script. Alternatively, you could try using a browser extension (those should have higher XHR privileges).
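Separately from CORS, the request in the question has a few client-side problems worth fixing: Access-Control-Allow-* are response headers that the server sends (the client must not set them), send() needs the URL-encoded string rather than the plain object, and status is a number. A sketch of what the call could look like if the API did allow your origin (hypothetical, since paste.ee's CORS policy is out of your hands):
// Corrected XHR sketch: proper content type, URL-encoded body, numeric status check.
var httpReq = new XMLHttpRequest();
var url = 'http://paste.ee/api';
var fields = 'key=public&description=test&paste=' + encodeURIComponent('this is a test paste') + '&format=JSON';

httpReq.open('POST', url, true);
httpReq.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
httpReq.onreadystatechange = function () {
    if (httpReq.readyState === 4 && httpReq.status === 200) { // status is a number, not 'success'
        alert(httpReq.responseText);
    }
};
httpReq.send(fields); // send the encoded string, not the object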
I've had the same problem.
The server logs showed:
DEBUG: <-- origin: null
I investigated and it turned out that the Origin header is null when the page is opened as a file from the local drive. When I copied the file to the server and used it from there, the request worked perfectly fine:
function cors() {
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            document.getElementById("emo").innerHTML = this.responseText; // show the response directly
        }
    };
    xhttp.withCredentials = true;
    xhttp.open("GET", "http://owasp-class.lab:4444/api/get_info", true);
    xhttp.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhttp.send();
}

Struggling to communicate with the utorrent web API

I am trying to access uTorrent's web API; it uses a token authentication system which is detailed here.
The JavaScript on my page is:
<script>
$.getJSON("http://XXX.XXX.XXX.XXX/lib/token.php", function(response) {
var head = document.getElementsByTagName('head')[0];
var script = document.createElement('script');
script.type = 'text/javascript';
//script.onreadystatechange = function () {
// if (this.readyState == 'complete') utorrent();
//}
//script.onload = utorrent();
script.src = 'http://XXX.XXX.XXX.XXX:8080/gui/?list=1&token=' + response.token;
head.appendChild(script);
});
</script>
It simply retrieves the token from a PHP file and passes it along the chain. I have confirmed that the token is being passed and is not being poisoned. My PHP document is below:
<?php
header('Content-type: text/json');
$token = file_get_contents('http://[username]:[password]@XXX.XXX.XXX.XXX:8080/gui/token.html'); // Basic auth credentials go before the host with @
$token = str_replace("<html><div id='token' style='display:none;'>", "", $token);
$token = str_replace("</div></html>", "", $token);
$response = array('token' => $token);
echo json_encode($response);
?>
This gives me a confirmation of the token:
Object {token: "GMt3ryaJE64YpXGN75-RhSJg-4gOW8n8XfTGYk_ajpjNLNLisR3NSc8tn1EAAAAA"}
but then I receive a 400 error code when retrieving the list:
GET http://XXX.XXX.XXX.XXX:8080/gui/?list=1&token=GMt3ryaJE64YpXGN75-RhSJg-4gOW8n8XfTGYk_ajpjNLNLisR3NSc8tn1EAAAAA 400 (ERROR)
Any help/thoughts/ideas would be greatly appreciated.
just adding my 2 cents.
I've been doing a similar implementation in .NET MVC - I was able to get the token as you did, but the list=1 feature didn't work for me either, getting the 400 bad request code (as you have found).
The solution for me:
In the token.html response, there is a token in the div and also a GUID in the header.
To break it down:
1. Call token.html with your uTorrent credentials.
2. In the response content, parse the HTML to get the token.
3. In the response header, there is a value with the key Set-Cookie, which looks like:
Set-Cookie: GUID=<guid value>
I needed to use this value (GUID=<guid value>) in all requests being sent back, as well as the token, and it worked!
I'm not sure what the implementation is in PHP to do this however :)
Also, a quick note: I've been trying to get values through jQuery's $.getJSON and $.ajax methods without any success, because the browser (Chrome) I'm using has strict rules on cross-domain requests, and it doesn't look like uTorrent implements JSONP.
Hope this helps!
The 400 error message means the server is treating it as a bad request.
The MIME media type for JSON text is application/json.
Use text/plain or application/json, not text/json.
application/json sometimes causes issues in Chrome, so you might want to stick with text/plain in this case.
Have you tried changing the order of the query parameters?
eg: http://localhost:8080/gui/?token=<token_uuid>&list=1
Reference: https://github.com/bittorrent/webui/wiki/TokenSystem#examples
UPDATE
I ran into a similar problem trying to create an XMPP bot for the uTorrent client in Python.
@m.t.bennett was correct. You need to save the session information as well.
When you receive the response from token.html, capture the cookie information too.
Usually there are two values: GUID and sessions. You need to put them in the header for all your subsequent requests -- the List API, the Getfiles API, etc.
This should fix your problem!
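For completeness, a rough sketch of that token-plus-GUID flow in Node.js (a stand-in for whatever server-side language you use; the host, port and credentials are placeholders), following the token system described in the linked wiki:
// Rough Node.js sketch (Node 18+ fetch): get token.html, keep both the token and the
// GUID cookie, and send them back on the list request. A browser cannot read Set-Cookie
// from JavaScript, which is why this step belongs on the server.
const BASE = 'http://XXX.XXX.XXX.XXX:8080/gui';                               // placeholder host/port
const AUTH = 'Basic ' + Buffer.from('username:password').toString('base64'); // placeholder credentials

async function listTorrents() {
    const tokenRes = await fetch(BASE + '/token.html', { headers: { Authorization: AUTH } });
    const cookie = tokenRes.headers.get('set-cookie');         // contains GUID=<guid value>
    const html = await tokenRes.text();
    const token = html.replace(/<[^>]+>/g, '').trim();         // strip the surrounding div/html tags

    const listRes = await fetch(BASE + '/?list=1&token=' + encodeURIComponent(token), {
        headers: { Authorization: AUTH, Cookie: cookie },      // send the GUID back along with the token
    });
    return listRes.json();
}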

how to bypass Access-Control-Allow-Origin?

I'm doing an AJAX call to my own server from a platform which is set up to prevent these AJAX calls (but I need it to fetch data from my server, in order to display data retrieved from my server's database).
My AJAX script is working; it can send the data over to my server's PHP script for processing.
However, it cannot get the processed data back, as the response is blocked by the "Access-Control-Allow-Origin" policy.
I have no access to that platform's source/core, so I can't remove the mechanism that is disallowing me to do this.
(P.S. I used Google Chrome's console and found this error.)
The AJAX code is shown below:
$.ajax({
type: "GET",
url: "http://example.com/retrieve.php",
data: "id=" + id + "&url=" + url,
dataType: 'json',
cache: false,
success: function(data)
{
var friend = data[1];
var blog = data[2];
$('#user').html("<b>Friends: </b>"+friend+"<b><br> Blogs: </b>"+blog);
}
});
Or is there a JSON equivalent to the AJAX script above? I think JSON is allowed.
I hope someone can help me out.
Put this on top of retrieve.php:
header('Access-Control-Allow-Origin: *');
Note that this effectively disables CORS protection, and leaves your users exposed to attack. If you're not completely certain that you need to allow all origins, you should lock this down to a more specific origin:
header('Access-Control-Allow-Origin: https://www.example.com');
Please refer to the following Stack Overflow answer for a better understanding of Access-Control-Allow-Origin:
https://stackoverflow.com/a/10636765/413670
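As for the "JSON equivalent" the question asks about: what is usually meant is JSONP, which sidesteps CORS by loading the response as a script. A rough jQuery sketch, assuming retrieve.php were changed to wrap its JSON output in the callback parameter jQuery appends:
// JSONP variant: jQuery adds a callback parameter and loads the URL as a <script>,
// so the same-origin policy does not apply. The server must echo callbackName({...}).
$.ajax({
    type: 'GET',
    url: 'http://example.com/retrieve.php',
    data: { id: id, url: url },
    dataType: 'jsonp',
    cache: false,
    success: function (data) {
        var friend = data[1];
        var blog = data[2];
        $('#user').html('<b>Friends: </b>' + friend + '<b><br> Blogs: </b>' + blog);
    }
});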
Warning: Chrome (and other browsers) will complain that multiple Access-Control-Allow-Origin headers are set if you follow some of the other answers.
The error will be something like XMLHttpRequest cannot load ____. The 'Access-Control-Allow-Origin' header contains multiple values '____, ____, ____', but only one is allowed. Origin '____' is therefore not allowed access.
Try this:
$http_origin = $_SERVER['HTTP_ORIGIN'];
$allowed_domains = array(
'http://domain1.com',
'http://domain2.com',
);
if (in_array($http_origin, $allowed_domains))
{
header("Access-Control-Allow-Origin: $http_origin");
}
I fixed this problem when calling an MVC3 controller.
I added:
Response.AddHeader("Access-Control-Allow-Origin", "*");
before my
return Json(model, JsonRequestBehavior.AllowGet);
Also, my $.ajax call was complaining that it did not accept the Content-Type header, so I commented that header out, as I know it's JSON being passed to the action.
Hope that helps.
It's a really bad idea to use *, which leaves you wide open to cross-site attacks. You basically want your own domain all of the time, scoped to your current SSL settings, and optionally additional domains. You also want them all to be sent as one header. The following will always authorize your own domain in the same SSL scope as the current page, and can optionally also include any number of additional domains. It will send them all as one header, and overwrite any previous one(s) if something else already sent them, to avoid any chance of the browser grumbling about multiple access-control headers.
class CorsAccessControl
{
    private $allowed = array();

    /**
     * Always adds your own domain with the current SSL settings.
     */
    public function __construct()
    {
        // Add your own domain, with respect to the current SSL settings.
        $this->allowed[] = 'http'
            . ( ( array_key_exists( 'HTTPS', $_SERVER )
                && $_SERVER['HTTPS']
                && strtolower( $_SERVER['HTTPS'] ) !== 'off' )
                ? 's'
                : null )
            . '://' . $_SERVER['HTTP_HOST'];
    }

    /**
     * Optionally add additional domains. Each is only added one time.
     */
    public function add($domain)
    {
        if ( !in_array( $domain, $this->allowed ) )
        {
            $this->allowed[] = $domain;
        }
    }

    /**
     * Send 'em all as one header so no browsers grumble about it.
     */
    public function send()
    {
        $domains = implode( ', ', $this->allowed );
        header( 'Access-Control-Allow-Origin: ' . $domains, true ); // Send them all in one shot, so replace should be true here.
    }
}
Usage:
$cors = new CorsAccessControl();
// If you are only authorizing your own domain:
$cors->send();
// If you are authorizing multiple domains:
foreach ($domains as $domain)
{
$cors->add($domain);
}
$cors->send();
You get the idea.
Have you tried actually adding the Access-Control-Allow-Origin header to the response sent from your server? Like, Access-Control-Allow-Origin: *?
