AJAX: problem with redirect - javascript

This is the first time ever I'm using AJAX, and I want to do the following on an otherwise static page www.xyz.org/some_site.html:
Send a GET request to another url "www.xyz.org/testscript"
If the response has a status code != 200 or its content != 'ok': do nothing.
Else: show something on the page (i.e. set style="display:block" on an element that previously had "display:none").
I've implemented that successfully using basic AJAX. But:
There is an Apache redirect installed pointing from www.xyz.org/testscript to subdomain.xyz.org/testscript, the URL where the actual testscript lives (as AJAX doesn't support cross-domain calls even to subdomains afaik).
When I call www.xyz.org/testscript I get a 302 status code, and the content says "The document has moved here: subdomain.xyz.org/testscript".
How can I grab the 'final' return value?
I guess/hope any AJAX expert can give me a one-liner to solve that ...

AJAX (or XMLHttpRequest, to be accurate) won't get you past that redirect here: the redirect target lives on another (sub)domain, so the same-origin policy keeps you from reading the response. To get content from another domain you need a proxy on your own server. The following is a simple PHP proxy:
<?php
// proxy.php - fetches the remote URL server-side; only absolute http:// URLs are forwarded
if (strpos($_GET['q'], "http://") === 0) {
    echo file_get_contents($_GET['q']);
}
use it like this (note that the q parameter has to start with http:// to pass the check above):
xhr.open("GET", "http://www.xyz.org/proxy.php?q=http://subdomain.xyz.org/testscript", true);
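For completeness, here is a minimal sketch of the calling side, assuming the element to reveal has the hypothetical id hidden-box and that the testscript answers with the literal body "ok":

var xhr = new XMLHttpRequest();
// Same-origin request to the proxy; the proxy fetches the subdomain URL server-side.
xhr.open("GET", "http://www.xyz.org/proxy.php?q=http://subdomain.xyz.org/testscript", true);
xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return; // not finished yet
    // strip surrounding whitespace before comparing the response body to 'ok'
    var body = xhr.responseText.replace(/^\s+|\s+$/g, "");
    if (xhr.status === 200 && body === "ok") {
        document.getElementById("hidden-box").style.display = "block"; // reveal the element
    } // otherwise: do nothing
};
xhr.send(null);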

The answer, according to the comments above:
It's not possible to achieve what I want this way: the redirect points to another (sub)domain, and the same-origin policy keeps AJAX from reading the redirected response.
EDIT: I tried to work around it by adding another JavaScript file at subdomain.xyz.org/another.js and moving all the AJAX code from my static HTML page into it.
Then, on the static page, I included this script with an ordinary
<script src="subdomain.xyz.org/another.js">
tag. But that doesn't work either; I only cheated myself: the included script still runs in the context of my static page, so its AJAX call is still a forbidden cross-domain call.
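For what it's worth, the script-include route can work, but only if another.js does not issue an AJAX call itself and instead just calls a function that is already defined on the static page (the same pattern the JSONP answers further down rely on). A rough sketch, with a hypothetical showResult() callback and a hypothetical hidden-box element; the static page would contain:

<script type="text/javascript">
// Must be defined before the subdomain script is included,
// so it already exists when that script calls it.
function showResult(status) {
    if (status === 'ok') {
        document.getElementById('hidden-box').style.display = 'block'; // placeholder id
    }
}
</script>
<script src="http://subdomain.xyz.org/another.js"></script>

and subdomain.xyz.org/another.js, generated on the subdomain, would then consist of nothing more than a call like showResult('ok');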

Related

Block direct access to a file but allow access through jQuery's load function

I'm using jQuery to display a certain page to a user through its .load() function. I am doing this to allow user customization of the website, letting users fit it to their needs.
At the moment, I am trying to display the file feed.php inside a container within main.php.
I have come across a problem: I would like to prevent direct access to the file (i.e. going directly to its path, ./feed.php), while still allowing it to be served through the .load() function.
If I use the .htaccess deny from all method for this, I get a 403 on that specific part of the page. I can't find any other solution to this problem that lets me achieve what I want.
This is my current (simplified) script and html:
<script type="text/javascript">
$("#dock-left-container").load("feed.php"); // load feed.php into the dock-left-container div
</script>
<div class="dock-leftside" id="dock-left-container"></div> <!-- dock-left-container div -->
If anyone could suggest a solution through .htaccess, php, or even a completely different way to do this, I'd be very grateful!
Thanks in advance.
You can do this in two steps:
In jQuery's .load() call, POST a security code along with the request.
At the top of feed.php, add a PHP condition: only render the page if the posted security_code parameter is present and matches the security code stored in the session; otherwise refuse access.
The changes to your existing code would look like this:
JS
<?php
session_start(); // the session has to be started before $_SESSION can be used
$_SESSION['security_code'] = randomCode(); // randomCode(): your own random-token generator
?>
<script type="text/javascript">
$("#dock-left-container").load("feed.php", {
    security_code: '<?= $_SESSION['security_code']; ?>'
}); // passing a data object makes .load() send a POST request
</script>
PHP
Place this PHP condition at the top of feed.php:
session_start(); // needed to read the security code stored in the session
if (isset($_POST['security_code']) && $_POST['security_code'] == $_SESSION['security_code']) {
    // all of feed.php's stuff goes here
} else {
    echo "No direct access to this page is allowed.";
}
feed.php:
if (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest') {
    readfile('myfeed.xml'); // only serve the feed to requests that look like AJAX
} else {
    header('HTTP/1.0 403 Forbidden');
}
jQuery sends an X-Requested-With header by default (PHP exposes it as $_SERVER['HTTP_X_REQUESTED_WITH']). This is not, by far, anything remotely secure, since HTTP headers are easily sent/spoofed. But it will stop the occasional user trying to access the feed directly.
You can, additionally, check the $_SERVER['HTTP_REFERER'] header (but, again, this is easily spoofed) and, of course, use your normal session logic to make sure the user is logged on if that's a requirement to access the feed.
Either way: there's no way to make this 'watertight'. If your browser can (or should be able to) access the feed in some way, then it's simply a matter of opening the debugger, looking at the actual request in the network tab, and sending the exact same headers/request from, say, curl. In fact, you will see the response of the request (i.e. the actual feed) in the debugger as well.
Repeat after me: if my (or a user's) browser can access the feed 'from jQuery' (via an AJAX request or whatever), then the feed is accessible to that user as long as they are even slightly more persistent than giving up immediately. Only using a session will keep out 'unauthorized' users, because it relies on being logged in; and even after logging in, the request is visible no matter what, and that request can be 'forged' from any other application.

Hide iframes if not correctly loaded

I have several iframes pointing to external websites on my page. In case those services are interrupted or changed, I would like to hide those iframes instead of displaying an error message on my page.
Is there any way to find out in Javascript if the iframe has been loaded correctly?
I added a class to hide the iframe and then remove it with jQuery when the iframe is ready, like this:
$('#widget').ready(function () {
$('#widget').removeClass('hidden');
});
It still removes the hidden class when I put an invalid URL in the iframe src, showing the error iframe.
I have two questions:
How can I make the function run only if the iframe has loaded correctly?
Instead of using $('#widget').ready, I would like to use $('iframe').ready to target all iframes at once; if I do so, how do I refer to the specific iframe that loaded, inside the function?
Thanks!
Your question can be boiled down to:
how can I check if a URL exists and the website is alive from JavaScript?
The answer splits into two cases:
For internal URLs, use AJAX and check the response code: if it's 2xx or 3xx (e.g. 200 or 302), it's fine; if it's 4xx or 5xx (e.g. 404 or 500), it's bad. Read more in a similar answer, and see the sketch after this list.
For external URLs, you can't do it due to a security measure called same-origin policy.
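For the internal-URL case, a minimal sketch (assuming jQuery is available; the path is just a placeholder):

$.ajax({ url: "/some/internal/page", type: "HEAD" }) // a lightweight HEAD request is enough
    .done(function () {
        // 2xx responses (and followed redirects) end up here: the URL is alive
    })
    .fail(function (jqXHR) {
        // 4xx / 5xx responses end up here; jqXHR.status holds the code, e.g. 404 or 500
    });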
Since it seems you are pointing to external URLs, here is my suggestion:
Create a server-side component (a Servlet, a RESTful web service, a Struts2 action, etc... whatever you prefer, according to the server-side technology you are using) that performs the check for you and returns a streamed response with the data (if any) and the HTTP response code that you can check for errors. Then point the <iframe>s at your component's URL.
You probably can't do exactly what you want using JavaScript alone; you need server-side help, PHP for instance.
However, here are some helpful places to look for more info:
You can run code after an iframe loads with jQuery's load event (in current jQuery versions attach it with .on()):
$('#myIframe').on('load', function () {
    // your code (called once the iframe is done loading)
});
Look at this Stack Overflow question and answers about iframe loading.
To target the specific element inside the handler, use this, which refers to the iframe that fired the event:
$('iframe').on('load', function () { $(this).doSomething(); });
Look at this Stack Overflow question and answers about "this".
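Putting the two together, a minimal sketch (assuming the iframes start out with the hidden class from the question, and jQuery 1.7+ for .on()):

// One load handler for all iframes; inside it, `this` is the particular
// iframe that just finished loading, so only that one is revealed.
$('iframe').on('load', function () {
    $(this).removeClass('hidden');
});
// Caveat: if the src is unreachable, the load event may simply never fire and the
// iframe stays hidden, but some browsers also fire load for provider error pages,
// so this is not a bullet-proof "loaded correctly" check.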

PHP HttpRequest to create a web page - how to handle long response times?

I am currently using JavaScript and XMLHttpRequest on a static HTML page to create a view of a record in Zotero. This works nicely except for one thing: the page's HTML title.
I can of course also change the <title>...</title> tag, but if someone wants to post the view to, for example, Facebook, the static title of the web page will be shown there.
I can't think of any way to fix this with just a static page and JavaScript. I believe I need a page created dynamically on a server that does something similar to the XMLHttpRequest.
For PHP there is HttpRequest. Now to the problem: in the JavaScript version I can use asynchronous calls. With PHP I think I need synchronous calls. Is that something to worry about?
Is there perhaps some other way to handle this that I am not aware of?
UPDATE: It looks like those trying to answer are not at all familiar with Zotero. I should have been more clear. Zotero is a reference database located at http://zotero.org/. It has an API that can be used through XMLHttpRequest (which is what I said above).
Now, I cannot use that in my scenario, which I described above. So I want to call the Zotero server from my server instead (through PHP or something else).
(If you are not familiar with the concepts it might be hard to understand and answer the question, of course.)
UPDATE 2: For those interested in how Facebook scrapes a URL you post there, please test here: https://developers.facebook.com/tools/debug
As you can see by testing there, no JavaScript is run.
Sorry, I'm not sure I understand what you are trying to ask. Are you just wanting to change the page's title?
Why not use JavaScript?
document.title = newTitle;
Facebook expects the title (or the OpenGraph og:title tag) to be present when it fetches the page. It won't execute any JavaScript for you to fill in the blanks.
A cool workaround would be to detect the Facebook scraper with PHP by parsing the User-Agent string, and serve a version of the page with the information already filled in by PHP instead of JavaScript.
As far as I know, the Facebook scraper uses this User-Agent header: "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"
You can check whether part of that string is present in the header and load the page accordingly.
if (strpos($_SERVER['HTTP_USER_AGENT'], 'facebookexternalhit') !== false)
{
//synchronously load the title and opengraph tags here.
}
else
{
//load the page normally
}

Loading external content into a site running on localhost

I am trying to create a web application that loads content dynamically. When I do this, of course I want to do the development locally, i.e. on localhost. Some of the "functionality" is a form, and when that form is posted an e-mail is sent from the server. Because I want to access the server's e-mail functionality, I am pointing that specific page at the server. But the problem is that it is not loaded.
In my script below it works, but if I change the comments so that I am pointing at iandapp.com, then I just get an empty string. It is exactly the same page; I just copied it to the server.
$("#support").click(function () {
if(support_page==null){
//$("#section2").load("http://www.iandapp.com/smic/subscription_2.php", function(data) {
$("#section2").load("subscription_2.php", function(data) {
support_page = data;
});
}
The script is located in the main page (index.html) and the content should be loaded into a div with id="section2".
I know that (support_page==null) is true because I have a breakpoint inside where it stops.
Please let me know what the problem is and how I can fix it. I have been going at this for hours trying to get it working.
Thanks in advance!
Google "cross-domain ajax requests". Such requests are disabled at the browser level (the same-origin policy). There are ways to circumvent this, both client-side (e.g. JSONP) and server-side (e.g. CORS headers or a proxy).
It probably has something to do with it being a cross-domain request. You could use what I consider to be a "hack", http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/, but I.M.O. it's not worth it.
Have you considered sending through an SMTP server instead? If so, you'd have no problem with the file (sending the mail) being local.
And what about adding the proper headers to the server's HTTP response to allow cross-domain requests (CORS)?
Access-Control-Allow-Origin: *
Access-Control-Allow-Credentials: true
(Note: Access-Control-Allow-Credentials: true cannot be combined with a wildcard * origin; if you need credentials, echo back the specific requesting origin instead.)
Use .getJSON() instead of .load(); with a callback=? parameter in the URL it uses JSONP, which works across domains. You'll need to make sure your PHP script wraps its output in the callback, something like the following:
echo $_GET['callback'] . '(' . json_encode($results) . ')';
jQuery will append something like ?callback=callback0234 to the request URL because it wants your script to 'call' that callback function in its output. So the output of your script may look something like:
callback0234({"mydata": "<p>This is my data</p>"})
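On the JavaScript side this could look roughly as follows (the URL and the mydata key are taken from the examples above):

// The callback=? part makes jQuery use JSONP (a script include) instead of
// a plain XHR, which is what lets the cross-domain request through.
$.getJSON("http://www.iandapp.com/smic/subscription_2.php?callback=?", function (data) {
    $("#section2").html(data.mydata);
    support_page = data.mydata;
});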

Script puzzle <script src="ajaxpage.php?emp_id=23" />?

A very simple AJAX request takes an employee id and returns the user info as an HTML dump.
Request: ajax("employee/info?emp_id=3543")
Response: id = 3543, name = some name
This is just another simple JS trick to populate the UI. However, I do not understand how something like the following is equally able to execute correctly and dump the HTML code.
<script type="text/javascript" src="employee/info?emp_id=3543" />
When the page encounters this tag it executes the request just like the AJAX call and dumps the returned code into the page. The only difference is that it is no longer asynchronous, as it is in the AJAX case.
Questions:
Is this a correct approach? What are its positives and negatives?
What are the correct scenarios to use it in?
Does this also mean that any HTML tag taking a "src" attribute can be used like this?
I have used this kind of JavaScript loading for cross-domain scripting, where it is very useful. Here is an example to show what I mean.
[Keep in mind that the browser does not allow cross-domain AJAX calls from JavaScript, due to built-in security restrictions.]
On the domain www.xyz.com there is a service that gives me a list of users, which can be accessed at http://xyz.com/users/list?age=20
It returns JSON, optionally wrapped in a method call, like the following.
JSON:
{"username": "user1", "age": 21}
If I request this JSON wrapped in a method call, as follows:
callMyMethod({"username": "user1", "age": 21})
then this wrapped JSON, when it loads on my page, will try to invoke a function called callMyMethod. This is allowed in a <script src="source"> kind of declaration but would not be allowed otherwise.
So what I can do is the following (callMyMethod has to be defined before the data script is included, so that it already exists when the response calls it):
<script language="javascript">
function callMyMethod(data)
{
    // do something with the JSON passed in as the data variable
}
</script>
<script language="javascript" src="http://xyz.com/users/list?age=20"></script>
This allows me to do stuff with JSON coming from another domain, which I wouldn't have been able to do otherwise. So you see how I can achieve cross-domain scripting that would have been a tough nut to crack otherwise.
This is just one of the uses.
Other reasons why someone would do this are:
To version their JS files with releases.
To cache-bust the JS files: a changed parameter in the URL makes the client fetch the latest JS as soon as the file changes, so new changes are reflected on the client immediately.
To generate conditional JS on the server.
The usage you have shown in your example probably wouldn't serve much purpose; it would just delay the loading of the page if the server-side processing takes time. An asynchronous AJAX call would be much preferred; see the sketch below.
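For comparison, the asynchronous alternative is a plain AJAX call that inserts the returned HTML itself; a sketch, assuming jQuery and a placeholder #employee-info container:

// Fetch the same URL asynchronously and put the returned HTML into the page.
$.get("employee/info", { emp_id: 3543 }, function (html) {
    $("#employee-info").html(html);
});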
Is this a correct approach? What are its positives and negatives?
It depends on whether you want the asynchronous (AJAX) way or not; it isn't really a matter of positives and negatives.
The script-tag method takes more time, though.
What are the correct scenarios to use it in?
For a case like yours, the AJAX way is the correct method.
Does this also mean that any HTML tag taking a "src" attribute can be used like this?
src is used to specify the source path; that is what it is meant to do.
