phonegap jquery cross domain html - javascript

First of all: sorry for my bad English...
Also, I'm new to JS, CSS, HTML, jQuery, PhoneGap, etc.
Cross-domain requests should not be a problem because PhoneGap apps run from the file system (file:///), where the same-origin restriction doesn't apply, right?
I know that I need to whitelist the server in my config.xml, as described here:
http://docs.build.phonegap.com/en_US/configuring_access_elements.md.html#Access%20Elements
But what is the best way to do it? A plain HTTP request? A plugin? AJAX? (I've heard that AJAX isn't good with file:/// requests.) I am not supposed to use:
a proxy
JSONP
CORS (Cross-Origin Resource Sharing)
Here you can see what the HTML code from the server will look like:
<html>
<head>
<title>Memorycards</title>
<link rel="stylesheet" href="/stylesheets/style.css">
</head>
<body>
<h1>Memorycards</h1>
<p>here are all Memorycards</p>
<ul>
<li>memory1</li>
<li>memory2</li>
</ul>
new Memorycard <br>
</body>
</html>
I want to put the names (memory1, memory2) into one list and the links (link1, link2) into another list.
Hope you guys can help me!

Something I've done in the past in ASP.NET is to wrap all external JSON/XML requests with a local resource (like a proxy). It makes the connection and simply returns the result, which you can then use from any JavaScript-based application without resorting to JSONP.
Javascript <-> Local App <-> Remote Resource
For example, let's say you need to access the results at http://remotehost.com/app.php. What you can do is write a local application that gets deployed with your JavaScript (let's call it remoteapp.php) that simply calls this resource and echoes the response. You then call remoteapp.php from your JavaScript using a simple AJAX GET, and you have the results of http://remotehost.com/app.php.
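A minimal sketch of the client side, assuming remoteapp.php from above sits on the same origin as the page:
// Hedged sketch: remoteapp.php is the local proxy described above; it relays
// the response of http://remotehost.com/app.php. Because the request targets
// our own origin, no JSONP or CORS is needed.
$.get('remoteapp.php', function (data) {
    console.log(data); // the remote resource's response, relayed by the proxy
});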
Given your limitation of not being able to use CORS, and probably not wanting to rely on a third-party service such as corsproxy.com for cross-site requests, this seems like your best bet.

Related

How to load a cross-domain page using JavaScript

I have a page on my site (let's say on domain A) and I would like to pull some more content into it from another page, say, on domain B. By default, this functionality is blocked by browsers for security reasons.
As far as I've found, there are a few ways to do this.
CORS: As I understand it, this method requires contributions from both the server and the client. The server needs to add a header to its response (i.e. Access-Control-Allow-Origin: [DOMAINS], as described at http://enable-cors.org/server.html). The client, on the other hand, needs to adjust its requests (e.g. http://www.html5rocks.com/en/tutorials/cors/).
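For reference, the client side of such a request could look roughly like this (the URL and element id are placeholders):
// Hedged sketch of a CORS request: a plain XMLHttpRequest to another origin.
// It only succeeds if domain B responds with an Access-Control-Allow-Origin
// header that permits this page's domain.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://domain-b.example/some-content', true);
xhr.onload = function () {
    document.getElementById('target').innerHTML = xhr.responseText;
};
xhr.send();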
If using jQuery, there is a small plug-in which uses the Yahoo API (i.e. http://james.padolsey.com/snippets/cross-domain-requests-with-jquery/). The advantage of this is that the client can use it on its own to get pages from other domains. The catch is that Yahoo limits the number of requests per hour per IP, and for commercial use Yahoo's permission is needed.
I've also read about JSONP but I haven't done much digging.
My question is: are there other possibly better options that I might be overlooking?
For the record, the site I'm working with is a huge commercial site with millions of users every day.
You can do JSONP, permit CORS and use plain JSON, use a DIY JSONP wrapper, or use a JSONP Proxy service. Here are the solutions in detail: JSONP with remote URL does not work
The easiest option in your situation is to roll your own JSONP proxy service. Here's a barebones demo PHP wrapper that gets past the cross-domain restriction when fetching a JSON string. There's no catch and no limits, unlike Yahoo's YQL.
<?php
// Fetch the remote JSON server-side (the same-origin policy does not apply here)
// and wrap it in the callback name supplied by the client for JSONP.
$callback = isset($_GET["callback"]) ? $_GET["callback"] : "?";
$json = file_get_contents('http://somedomain.com/someurl/results.json');
header('Access-Control-Allow-Origin: *');
header("Content-type: application/json");
echo $callback . "(" . $json . ");";
?>
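For illustration, the client could call this wrapper with jQuery's JSONP support (the filename proxy.php is an assumption; use whatever you name the wrapper):
// Hedged sketch: assumes the PHP wrapper above is hosted on your own domain as
// proxy.php. jQuery generates the callback name and passes it along as the
// callback query parameter.
$.ajax({
    url: 'proxy.php',
    dataType: 'jsonp',
    success: function (data) {
        console.log(data); // the JSON fetched by the wrapper
    }
});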
Are you trying to get content, or code? If you're trying to get content, is it possible to just use an iframe?
If you want code, I think the options you outlined are pretty much what you have available. JSONP might be your best bet due to browser support; IE, for example, only added CORS support as of version 10. If you're on a site with millions of users per day, my guess is there are some folks on older versions of IE (unfortunately).
Edit: Depending on the content, another option is to introduce your own local proxy. For example, I've done things where I need to call WebServiceX on some other provider. I call WebServiceX in server-side code and implement my own web service that my JavaScript accesses. This means I'm not going cross-domain, because the cross-domain access happens server-side, not client-side. It also allowed me to introduce caching and other things (depending on the type of data) that improved performance.
An approach for cross-domain data passing: create a script element and assign it a source from another domain. Here is a quick and dirty example:
File test.html:
<html>
<body>
Test done
<script>
// Inject a script tag pointing at another domain; script tags are not subject
// to the same-origin policy, so test.js could be hosted anywhere.
var s = document.createElement("script");
s.type = 'text/javascript';
s.src = 'test.js';
document.body.appendChild(s);
</script>
</body>
</html>
and test.js
abc={a:'A',b:'B',c:'C'};
alert(abc.a);
test.js could be hosted on any domain, and the alert() call could be any function.
There are more elegant ways to attach or run such a script, but this one is sufficient to understand the idea.
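A hedged variant of the same idea: instead of calling alert(), the remote script can call a callback function defined in the main page, which is essentially how JSONP works (handleData is a made-up name):
// Defined in the main page before the script tag is injected.
function handleData(data) {
    console.log(data.a, data.b, data.c);
}
// test.js on the other domain would then contain:
// handleData({a: 'A', b: 'B', c: 'C'});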

How do you get content from another domain with .load()?

Requesting data from any location on my domain with .load() (or any jQuery ajax functions) works just fine.
Trying to access a URL in a different domain doesn't work though. How do you do it? The other domain also happens to be mine.
I read about a trick you can do with PHP: making a proxy that gets the content, and then you use jQuery's ajax functions on that PHP location on your server. But that's still using jQuery ajax on your own server, so that doesn't count.
Is there a good plugin?
EDIT: I found a very nice plugin for jQuery that allows you to request content from other pages using any of the jQuery functions, in just the same way you would make a normal ajax request to your own domain.
The post: http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
The plugin: https://github.com/jamespadolsey/jQuery-Plugins/tree/master/cross-domain-ajax/
This is because of the cross-domain policy which, in short, means that using a client-side script (a.k.a. JavaScript...) you cannot request data from another domain. Luckily for us, this restriction does not exist in most server-side scripts.
So...
Javascript:
$("#google-html").load("google-html.php");
PHP in "google-html.php":
echo file_get_contents("http://www.google.com/");
would work.
Different domains = different servers as far as your browser is concerned. Either use JSONP to do the request or use PHP to proxy it. You can use jQuery.ajax() to make a cross-domain JSONP request.
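A minimal sketch of such a request (the URL is a placeholder; the remote endpoint has to support JSONP):
// Hedged sketch: http://other-domain.example/data is a placeholder endpoint
// that wraps its JSON response in the callback name jQuery generates.
$.ajax({
    url: 'http://other-domain.example/data',
    dataType: 'jsonp',
    success: function (data) {
        console.log(data);
    },
    error: function () {
        console.log('JSONP request failed');
    }
});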
One really easy workaround is to use Yahoo's YQL service, which can retrieve content from any external site.
I've successfully done this on a few sites following this example which uses just JavaScript and YQL.
http://icant.co.uk/articles/crossdomain-ajax-with-jquery/using-yql.html
This example is a part of a blog post which outlines a few other solutions as well.
http://www.wait-till-i.com/2010/01/10/loading-external-content-with-ajax-using-jquery-and-yql/
I know of another solution which works.
It does not require that you alter jQuery, but it does require that you can stand up an ASP page on your domain. I have used this method myself.
1) Create a proxy.asp page like the one on this page http://www.itbsllc.com/zip/proxyscripts.html
2) You can then call jQuery's load function and feed it proxy.asp?url=.......
There is an example at that link of exactly how to format it.
Anyway, you feed the foreign page URL and your desired MIME type as GET variables to your local proxy.asp page. The two MIME types I have used are text/html and image/jpg.
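For illustration, the call might look roughly like this (the parameter names url and mime are assumptions; check the proxy.asp script you deploy for the exact names it expects):
// Hedged sketch: loads a foreign page through the local proxy.asp described
// above; the parameter names are assumptions, not a documented API.
var target = 'http://www.example.com/page.html';
$('#result').load('proxy.asp?mime=text/html&url=' + encodeURIComponent(target));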
Note, if your target page has images with relative source links those probably won't load.
I hope this helps.

How to get the content of a remote page with JavaScript?

I have the URL of a remote page on a different domain which I have to download, parse, and use to update the DOM of the current page. I've found examples of doing this using new ActiveXObject("Msxml2.XMLHTTP"), but that's limited to IE, I guess, and using new java.net.URL, but I don't want to use Java. Are there any alternatives?
The same-origin policy is going to get you.
1) Proxy through your server: browser -> your server -> their server -> your server -> browser.
2) Use Flash or Silverlight. The third party has to give you access. The bridge between JavaScript and Flash isn't great for large amounts of data, and there are bugs. Silverlight isn't ubiquitous like Flash...
3) Use a <script> tag. This really isn't safe... It only works if the third-party content is valid JavaScript.
What about loading a PHP script via AJAX which does file_get_contents()? That should work for a different domain, if I understand correctly.
Writing a server-side script that will retrieve the page's content for you is the way to go. You can use the XMLHttpRequest object to make an AJAX call to that script, which will simply pass all the HTML through to you.
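A minimal sketch of that call, assuming a hypothetical fetch.php on your own domain that retrieves the remote page (e.g. with file_get_contents) and echoes it back:
// Hedged sketch: the XHR never leaves our origin; fetch.php does the
// cross-domain work server-side and relays the HTML.
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById('target').innerHTML = xhr.responseText;
    }
};
xhr.open('GET', 'fetch.php', true);
xhr.send();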
Still, I advise against it. I don't know exactly how much you trust the other site, but the same origin policy exists for a reason. What is it exactly you are trying to do? Usually, there is a workaround.
I don't think you can do this, given the constraints of the same-origin policy. To communicate between two domains using iframes you can use JS code, but both domains need to contain the communicating code, and a child frame can only contact its grandparent frame (window). That doesn't apply here, since you are referring to some other URL altogether.
The only way is to use your server-side code to access the content on the other domain.
Just use PHP:
<?php
// include_once() with a URL fetches the remote page server-side (this requires
// allow_url_fopen / allow_url_include to be enabled in php.ini); output
// buffering captures whatever it prints.
$url = "http://www.domaintoretrieve.com";
ob_start();
include_once( $url );
$html = ob_get_contents();
ob_end_clean();
?>
$html contains the entire page to manipulate as needed.
The XMLHTTPRequest object is common to most modern browsers and is what powers AJAX web applications.

How does Google Friend Connect accomplish cross domain communication without needing to upload a file to the client domain?

Previously, Google's Friend Connect required users to upload a couple of files to their websites to enable cross-domain communication, and Facebook Connect still requires you to upload a single file to enable it.
Now, Friend Connect doesn't require any file upload... I was wondering how they were able to accomplish this.
Reference:
http://www.techcrunch.com/2009/10/02/easy-does-it-google-friend-connect-one-ups-facebook-connects-install-wizard/
There are multiple methods of communicating between documents on different domains, among them HTML5 postMessage, NIX, FIM (hash/fragment), frameElement, and the window.name property.
These are available in different browsers and in different versions, but collectively they allow you to do reliable XDM (cross-domain messaging).
One project that did this early on is Apache Shindig, which probably pioneered quite a few of these; more recently the easyXDM project has unified all of these approaches behind a common API, making it easy to create complex applications using XDM and RPC.
You can read in depth about the various methods of transporting the data in this article at Script Junkie.
Now, to answer your question directly: earlier on it was quite common to believe that only postMessage and FIM (Fragment Identifier Messaging) were available, and for the latter to work efficiently, one often had to upload a special file to your domain. As more methods have been discovered, FIM has largely been deprecated as a technique, and hence there is no more need for the file.
Just for the record; I'm the author of both the Script Junkie article, and the easyXDM library (that is what Twitter, Disqus and quite a few more are using by the way).
Edit: It's difficult to remember/verify now, but I believe my answer here was probably incorrect. Sean Kinsey's answer above should be the definitive answer to this question. If you're reading this, please upvote his answer and ignore mine.
The Google Friend Connect widget works like most ads/gadgets do, using a copy/pasted snippet of HTML to reference a JavaScript include on the host's server which then creates an iframe containing the desired content. By opening the iframe with your site ID in the URL, Google's server is able to generate the appropriate HTML document to represent a Friend Connect gadget for your particular site/settings.
There isn't any cross-site communication happening beyond that initial step of creating an iframe with the appropriate URL target. Everything inside the gadget's dynamically generated iframe is more like the user visited a separate page on Google's server, but what would have been displayed is then embedded/isolated in a block on your page instead.
I'm not sure how it works in this particular instance but cross-domain messaging can be accomplished either by the postMessage() API or by changing the hash part of the URL and monitoring that.
The hash change method works because both the enclosing and the enclosed pages have access to the enclosed page's URL.
Of course, hopefully the postMessage() API call becomes more standard over time.
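A rough sketch of the hash technique (names and the polling interval are illustrative):
// Parent page (any domain): navigating an iframe to a new fragment is allowed
// even cross-origin, so the message travels in the URL fragment.
var frame = document.getElementById('childFrame');
frame.src = 'http://other-domain.example/child.html#' + encodeURIComponent('hello');

// Child page (other-domain.example/child.html): poll its own hash for changes.
var lastHash = '';
setInterval(function () {
    if (window.location.hash !== lastHash) {
        lastHash = window.location.hash;
        console.log('received: ' + decodeURIComponent(lastHash.slice(1)));
    }
}, 100);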
JSONP allows cross-domain JavaScript:
Due to browser security restrictions, most "Ajax" requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, or protocol. Script and JSONP requests are not subject to the same origin policy restrictions.
There is no method other than somewindow.postMessage() for communication between cross-domain iframes.
Before somewindow.postMessage(), you had to upload a file in order to establish communication between the iframes.
example:
<html>
<!-- this is the main domain, www.example.com -->
<head>
</head>
<body>
<iframe src="http://www.exampleotherdomain.com/"></iframe>
<div id="ifr"></div>
<script>
function sendMsg(a) {
    // Create a temporary iframe whose URL carries the message in its hash;
    // the receiving page reads it from window.location.hash.
    var f = document.createElement('iframe'),
        k = document.getElementById('ifr');
    f.setAttribute('src', 'http://www.example.com/xdreciver.html#myValueisSent');
    k.appendChild(f);
    // give the receiver page time to load and read the hash before cleanup
    setTimeout(function () { k.removeChild(f); }, 1000);
}
</script>
</body>
</html>
And now the HTML content of http://www.example.com/xdreciver.html:
<html>
<!-- this is http://www.example.com/xdreciver.html -->
<head>
<script>
function getMsg() {
return window.location.hash;
}
</script>
</head>
<body onload="var msg = getMsg(); alert(msg);">
</body>
</html>
As for using .postMessage(), it's enough to call top.postMessage('my message to the other domain document, which is also the main document', 'http://www.theotherdomain.com');
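For completeness, a rough sketch of the receiving side (the origin check here is illustrative):
// The target (main) document listens for messages sent with postMessage and
// checks the sender's origin before using the data.
window.addEventListener('message', function (event) {
    if (event.origin !== 'http://www.exampleotherdomain.com') {
        return; // ignore messages from unexpected origins
    }
    alert(event.data);
}, false);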

Ajax page part load and Google

I have a div on my page that is loaded from the server by AJAX, but in this scenario Google and other search engines don't index the content of the div. The only solution I see is to recognize when the page is requested by a search robot and return the complete page without AJAX.
1) Is there a simpler way?
2) How do I distinguish humans from robots?
You could also provide a link to the non-AJAX version in your sitemap, and when you serve that file (to the robot), make sure to include a canonical link element pointing to the "real" page you want users to see:
<html>
<head>
[...]
<link rel="canonical" href="YOUR_CANONICAL_URL_HERE" />
[...]
</head>
<body>
[...]
YOUR NON_AJAX_CONTENT_HERE
</body>
</html>
Edit: if this solution is not appropriate (some comments below point out that this solution is non-standard and only supported by the "big three"), you might have to rethink whether you should make the non-AJAX version the standard solution and use JavaScript to hide/show the information instead of fetching it via AJAX. If it is business-critical information that is fetched, you have to realize that not all users have JavaScript enabled, and thus they won't be able to see this information. A progressive enhancement approach might be more appropriate in this case.
Google gets antsy if you are trying to show different things to your users than to crawlers. I suggest simply caching your query, or whatever it is that needs AJAX, and then using AJAX to replace only what you need to change. You still haven't really explained what's in this div that only AJAX can provide. If you can do it without AJAX then you should, not just for SEO but for Braille readers, mobile devices, and people without JavaScript.
You can specify a sitemap in your robots.txt. That sitemap should be a list of your static pages. You should not be giving Google a different page at the same URL, so you should have different URLs for static and dynamic content. Typically, the static URL is .../blog/03/09/i-bought-a-puppy and the dynamic URL is something like .../search/puppy.
