I have been reading up here about how to hide the source URL of an iframe using JavaScript and jQuery, but so far I haven't found a solution that works with WordPress.
So, short and sweet: I am trying to hide a URL for competitive reasons. I'm about to launch a new web page that uses an iframe to display content from another website, whose services I subscribe to and pay quite a bit of money for. I don't want my competition to find them and copy what I do.
What would be the best way to mask the source URL in the iframe? My site runs on the latest version of WordPress.
Alternatively, I am open to more advanced solutions.
Thanks in advance.
I'm not stealing anything. I'm a businessman who pays taxes, and I pay $1400 per year for the web service that I use; I don't want my competitors STEALING MY setup.
Ok, I'll take you at face value. As has been pointed out in the comments, it's impossible to do this with plain HTML or JavaScript. It simply cannot be done. There is no way to prevent a browser user from seeing the URIs that are contacted. Period.
That being said, you could:
Retrieve the HTML from the third-party site on your web server, using something like cURL in PHP.
Serve the result of the above from a URL of your own choosing.
Create HTML with an <iframe> (or whatever else you want) to display the content from the third party, and point it at your new URL on your web server.
The above is not trivial to do correctly and will require some knowledge and work on your part. One issue with this method is passing along the JavaScript, images, CSS, etc. correctly, so that they appear to come from your server.
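As a rough illustration, here's a minimal sketch of the first two steps in PHP with cURL. The remote URL is a placeholder for the service you subscribe to, and a real proxy would still need to handle the asset rewriting mentioned above:

<?php
// proxy.php -- minimal same-origin proxy sketch; the remote URL is a placeholder.
$remote = 'https://third-party.example.com/app/';

$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects on the remote side
$html = curl_exec($ch);
curl_close($ch);

// Caveat: relative paths for images, CSS and JS in $html still point at the
// third party's paths; rewriting them is the non-trivial part mentioned above.
header('Content-Type: text/html; charset=utf-8');
echo $html;

Your <iframe> would then point at something like https://your-site.example/proxy.php, so the visitor's browser only ever sees your domain as the source URL.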
I'm looking to get some data from a Facebook page of a restaurant, but I'm kinda stuck. I want to load some divs from the Facebook page of the restaurant, then get the IDs of the divs, since they contain some information I would like to use. I've tried using the .load function from jQuery, but no luck. All of the answers I've seen include a URL that's something like somefile.html. Is it possible to load the div IDs and some innerHTML from a live page like Facebook? Are they somehow downloading the HTML to a file and then using that? Keep in mind I know nothing about PHP, so any solutions? Thanks!
The right way to do it would be through Facebook's Graph API. Take a look at this page and see if it offers the information you need: https://developers.facebook.com/docs/graph-api/reference/page/
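As a hedged sketch of what such a call could look like server-side (the page ID, the fields, and the access token are all placeholders; you would get a real token through a Facebook app):

<?php
// Hypothetical Graph API lookup; page ID, fields and token are placeholders.
$pageId = 'some-restaurant-page';
$token  = 'YOUR_ACCESS_TOKEN';
$url    = "https://graph.facebook.com/{$pageId}?fields=name,about&access_token={$token}";

$page = json_decode(file_get_contents($url), true);
echo $page['name'];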
NOTE
As comments have pointed out, "web scraping" is FORBIDDEN on Facebook.com by Facebook policy. http://www.facebook.com/apps/site_scraping_tos_terms.php
Technically, I don't think this is possible with Facebook and just JavaScript.
In general, using just JavaScript, one approach would be to load the external site (like Facebook) into an iframe and then search the DOM that's loaded into the iframe for the divs. However, Facebook (and many other sites) send a response header called X-Frame-Options, which prevents the page from loading into an iframe at all, and even if it did load, the same-origin policy would stop your script from reading the framed document. As far as I know, this cannot be hacked around in the browser; the only option is to use another language to pull the HTML server-side (as with PHP).
Sources:
Facebook Forbids iFrames
JavaScript, Load Page in iFrame
How can I copy the source code from a website (with JavaScript)? I want to copy the text showing the temperature from this website: http://www.accuweather.com/
I want to copy only the number displaying the temperature. Is there a way to copy that exact line from the website's source code? I've heard about HTML scraping. If not JavaScript, what would be the simplest way of doing it? Just copying the temperature and displaying it on my web page.
Well, a simple way to do something like that is to load the site into a hidden HTML element via AJAX and then search the DOM for the element you want.
jQuery even has a method that does this directly. It would be something like:
<div id="temp"></div>
<script>
// Load only the fragment matching the selector into the #temp element.
$('#temp').load('https://www.accuweather.com/ #popular-locations-ul .large-temp');
</script>
#popular-locations-ul .large-temp is a CSS selector for the specific elements that contain the temperature.
However, for some time now the web has had a security feature called CORS (cross-origin resource sharing). To load something from another site via AJAX, the target site has to send CORS headers explicitly allowing it. This particular site doesn't send those headers, so any attempt to load something from it via AJAX will be blocked by the browser.
You can only use a command like the one above on a site you control, where you can enable the CORS headers yourself, or on a site that already sends them.
But as people have told you, this is fragile from the start, due to websites' impermanent nature. Things change a lot. Even if you could get a value this way from some other site, sometime later the site would change and your code would break.
The reason I answered is that you are just learning and need guidance, not trying to do "serious work". Serious work would mean using an API, as people told you.
A web API is a special URL you access (something like https://www.accuweather.com:1234/api/temperature/somecity), normally with some kind of authentication, that responds with exactly the result you need. For this kind of service CORS is allowed, because you are accessing the data in a secure and "official" way.
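To make that concrete, consuming such an API from your own server could look something like the sketch below; the endpoint, the API-key header, and the 'temperature' field in the JSON are all invented for illustration:

<?php
// Invented weather API; endpoint, key header and response field are assumptions.
$ch = curl_init('https://api.example.com/temperature/somecity');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Api-Key: YOUR_KEY'));  // typical API authentication
$data = json_decode(curl_exec($ch), true);
curl_close($ch);

echo 'Current temperature: ' . $data['temperature'];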
Hope I clarified a bit.
I have a difficult one
I am trying to show an external website within my page (currently using an iframe)
The trick is I want to be able to edit the site
e.g. I really want to show a preview of another website that I can modify.
Not sure if there are any tools out there; I was thinking of using a jQuery request to create a page within my domain that I can edit?
Any advice?
If that were possible from within the browser, it would be a bug.
The way you should do it is to create some sort of proxy on your own domain/server and fetch the site through that. Then you're allowed to access the iframe and edit whatever you like.
Of course, the site won't run like it normally would, since it can't use its own cookies, AJAX calls, etc.
I think this JS library might give you some inspiration!
easyXDM
Okay, so I'm trying to figure something out. I am in the planning stages of a site, and I want to implement "fetch data on scroll" via jQuery, much like Facebook and Twitter, so that I don't pull all the data from the DB at once.
But I have some problems regarding SEO: how will Google be able to see all the data? Because the page fetches more data automatically when the user scrolls, I can't include any links in the style of "go to page 2"; I want Google to just index that one page.
Any ideas for a simple and clever solution?
Put links to page 2 in place.
Use JavaScript to remove them if you detect that your autoloading code is going to work.
Progressive enhancement is simply good practice.
You could use PHP (or another server-side script) to detect the user agents of the web crawlers you specifically want to target, such as Googlebot.
For a web crawler, you would have to use non-JavaScript techniques to pull down the database content and lay out the page. I would recommend not paginating the search-engine-targeted content, assuming that you are not paginating the "human" version. The URLs discovered by the web crawler should be the same as those your (human) visitors will visit. In my opinion, the page should deviate from the "human" version only by having more content pulled from the DB in one go.
A list of webcrawlers and their user agents (including Google's) is here:
http://www.useragentstring.com/pages/Crawlerlist/
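A minimal sketch of that detection in PHP; the user-agent substrings below are a small sample, not the full list from the page above:

<?php
// Rough crawler check by user-agent substring (sample bots only).
function isWebCrawler() {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    foreach (array('Googlebot', 'Bingbot', 'Slurp') as $bot) {
        if (stripos($ua, $bot) !== false) {
            return true;
        }
    }
    return false;
}

if (isWebCrawler()) {
    // Render the full, non-JavaScript version of the content here.
}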
And yes, as stated by others, don't rely on JavaScript for content you want seen by search engines. In fact, the technique is quite frequently used the other way around: when a developer doesn't want something to appear in search engines.
All of this comes with the rider that it assumes you are not paginating at all. If you are, then you should use a server-side script to paginate your pages so that they are picked up by search engines. Also, remember to put sensible limits on the amount of your DB that you pull for the search engine; you don't want the request to time out before it gets the page.
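Something along these lines would do the pagination server-side; the table and column names and the DB credentials are placeholders:

<?php
// Server-side pagination sketch; table/column names and credentials are placeholders.
$perPage = 20;                                        // sensible per-request limit
$page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');
// $perPage and $offset are already cast to int, so interpolating them is safe here.
$rows = $pdo->query("SELECT title, body FROM posts ORDER BY id DESC LIMIT $perPage OFFSET $offset");

foreach ($rows as $row) {
    echo '<h2>' . htmlspecialchars($row['title']) . '</h2>';
    echo '<p>'  . htmlspecialchars($row['body'])  . '</p>';
}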
Create a Google Webmaster Tools account, generate a sitemap for your site (manually, automatically, or with a cronjob, whatever suits), and tell Google Webmaster Tools about it. Update the sitemap as your site gets new content; Google will crawl it and index your site.
The sitemap will ensure that all your content is discoverable, not just the stuff that happens to be on the homepage when the googlebot visits.
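Generating the sitemap can be a small script run by hand or from the cronjob mentioned above; getAllContentUrls() below is a stand-in for however you enumerate your pages (a DB query, most likely):

<?php
// Hypothetical sitemap generator; getAllContentUrls() is a placeholder for a DB query.
function getAllContentUrls() {
    return array('http://example.com/', 'http://example.com/posts/1');
}

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach (getAllContentUrls() as $url) {
    $xml .= '  <url><loc>' . htmlspecialchars($url) . '</loc></url>' . "\n";
}
$xml .= "</urlset>\n";

file_put_contents('sitemap.xml', $xml);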
Given that your question is primarily about SEO, I'd urge you to read this post from Jeff Atwood about the importance of sitemaps for Stack Overflow and the effect it had on traffic from Google.
You should also add paginated links that get hidden by your stylesheet and act as a fallback for when your endless scroll is disabled or someone isn't using JavaScript. If you're building the site right, these will just be partials that your endless scroll loads anyway, so it's a no-brainer to make sure they're on the page.
I'm just looking for clarification on this.
Say I have a small web form, a 'widget' if you will, that gets data, does some client-side verification on it or other AJAX-y nonsense, and on clicking a button would direct to another page.
If I wanted this to be an 'embeddable' component, so other people could stick this on their sites, am I limited to basically encapsulating it within an iframe?
And are there any limitations on what I can and can't do in that iframe?
For example, the button that would take you to another page - this would load the content in the iframe? So it would need to exist outwith the iframe?
And finally, if the button the user clicked were to take them to an https page to verify credit-card details, are there any specific security no-nos that would stop this happening?
EDIT: For an example of what I'm on about, think about embedding either googlemaps or multimap on a page.
EDIT EDIT: Okay, I think I get it.
There are two ways.
One: embed in an iframe, but this is limited.
Two: create a JavaScript API and ask the consumer to link to it. But this is a lot more complex for both the consumer and the creator.
Have I got that right?
Thanks
Duncan
There are plus points for both methods. I, for one, wouldn't use another person's JavaScript on my page unless I was absolutely certain I could trust the source. It's not hard to make a malicious script that submits the values of all input boxes on a page. If you don't need access to the contents of the hosting page, then using an iframe would be the best option.
Buttons and links can be "told" to navigate the top or parent frame using the target attribute, like so:
<a href="http://some.url/with/a/page" target="_top">This is a link</a>
<form action="http://some.url/with/a/page" target="_parent"><button type="submit">This is a button</button></form>
In this situation, since you're navigating away from the hosting page, the same-origin policy wouldn't apply.
In similar situations, widgets are generally iframes placed on your page. iGoogle and Windows Live Gadgets are (to my knowledge) hosted in iframes, and for very good reason: security.
If you are using AJAX, I assume you have a server written in C#, Java, or some other OO language.
It doesn't really matter which language; only the syntax will vary.
Either way, I would advise against the iframe methods.
They open up way too many holes and problems; for example, HTTP with HTTPS (or vice versa) in an iframe will show a mixed-content warning.
So what do you do?
Do a server-side call to the remote site
Parse the response appropriately on the server
Return via AJAX what you need
Display returned content to the user
You know how to do the AJAX; just add a server-side call to the remote site.
Java:
URL url = new URL("http://www.WEBSITE.com");
URLConnection conn = url.openConnection();
// Read the remote HTML from conn.getInputStream() and extract what you need.
or
C#:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://www.WEBSITE.com");
WebResponse res = req.GetResponse();
// Read res.GetResponseStream() to get the remote HTML.
I think you want to get away from using inline frames if possible. Although they are sometimes useful, they can cause issues with navigation and bookmarking. Generally, if you can do it some other way than an iframe, that is the better method.
Given that you make an AJAX reference, a JavaScript pointer would probably be the best bet, i.e. embed what you need to do in script tags. Note that this is how Google embeds things such as Google Analytics and Google Ads. It also has the benefit of being pullable from a URL hosted by you, so you can update the code and, voilà, it is active in all the web pages that use it. (Google usually uses version numbers as well, so they don't switch everyone when they make changes.)
Re the credit card scenario: JavaScript is bound by the 'same-origin policy'. For clarification, see http://en.wikipedia.org/wiki/Same_origin_policy
Added: Google Maps works in the same way, with some caveats such as a user/site key that explicitly identifies who is using the code.
Look into using something like jQuery and create a "plugin" for your component. It's just one way, and just a thought, but if you want to share the component with other folks, this is one of the ways it can be done.