Trace URLs that link to my iframe/widget - javascript

I have created a widget.html page with, for example, a "Powered by Example.com" box/widget. And I have an HTML iframe that links to that specific page (widget.html) on my site.
<iframe src="http://example.com/widget.html"></iframe>
I share that iframe code with website owners who want to use my widget on their sites.
I want to be able to see every single site that uses my iframe. I would prefer code that writes every such website URL to a txt file, or even a MySQL table.
I basically want to track the sites that embed my widget as an iframe. How do I do that? With JavaScript? PHP? MySQL?
P.S. I'm not sure if an iframe is the best way to offer widgets off my site, but I'm open to your suggestions. Thanks in advance.

Use jQuery. From the widget page, load a request page through jQuery, for example $("#div1").load("demo_test.txt");, and send a request URI parameter along with it. On the server you will then see the current URL using the widget, and you can also read the parameter.
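A more direct way to capture the embedding site is server-side: when a browser requests the iframe's src, it usually sends the embedding page's URL in the Referer header. A minimal sketch in PHP, assuming you serve the widget through PHP (e.g. a widget.php, or a small logging script the widget page requests); the file names are hypothetical, and the header can be empty or spoofed, so treat it as a hint rather than a guarantee:
<?php
// log_referrer.php - include this from the script that serves the widget.
// The Referer header holds the URL of the page embedding the iframe.
$referrer = $_SERVER['HTTP_REFERER'] ?? '';
if ($referrer !== '') {
    // Append one URL per line; swap this for an INSERT into a MySQL
    // table if you prefer structured storage.
    file_put_contents('widget_referrers.txt', $referrer . PHP_EOL, FILE_APPEND | LOCK_EX);
}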

Related

How to replace YouTube thumbnail links with my own site's link

YouTube videos and thumbnails are blocked in my country, and I want to use the YouTube API to get this data. The problem is that the thumbnail links point at YouTube's image host, and I want to serve them through my own site, from my own server, like:
https://img.example.com/vi/T0Jqdjbed40/default.jpg
I tried to replace the URL with a string replace function, but it does not work. Please tell me how to serve thumbnails through my own site, the way sites like playit, vube.pk, and flix do.
As a purely theoretical thing, you can replace sections of a string using the str_replace function. Note the argument order: search string, replacement, then the subject string:
str_replace(
    "://i.ytimg.com", "://img.example.com",
    "https://i.ytimg.com/vi/T0Jqdjbed40/default.jpg");
This will keep the http:// or https:// protocol in place, since the search string omits it.
For your use case, again purely theoretically, you would want to download the resource yourself, and then provide it to the end user. To get the data using file_get_contents for example:
$data = file_get_contents('http://www.example.com/');
You could then potentially embed the image data directly into the page itself - see this SO question: Can I embed a .png image into an html page?
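Putting those pieces together, a minimal proxy sketch along those lines (the script name, parameter, and validation are illustrative assumptions, not a definitive implementation):
<?php
// thumb.php?id=T0Jqdjbed40 - hypothetical proxy that re-serves a YouTube
// thumbnail from your own domain.
$id = $_GET['id'] ?? '';
// Accept only plausible 11-character video IDs so the script cannot be
// abused to proxy arbitrary URLs.
if (!preg_match('/^[A-Za-z0-9_-]{11}$/', $id)) {
    http_response_code(400);
    exit;
}
$data = file_get_contents("https://i.ytimg.com/vi/{$id}/default.jpg");
if ($data === false) {
    http_response_code(404);
    exit;
}
header('Content-Type: image/jpeg');
echo $data;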
You should probably check YouTube's terms of service before attempting anything...

Trouble scraping from a page

Referring to one of my previous questions, I have to scrape the reviews (all reviews) of a hotel, for example this hotel.
Using BeautifulSoup, what I have done is first get all the review page links from the pagination within the div having class BVRRPager BVRRPageBasedPager, and then scrape the reviews from all pages.
The problem with BeautifulSoup is that the content in div.BVRRRatingSummary does not come along (try loading that page with JS disabled).
I have scraped the reviews using Selenium, but my client does not want to use Selenium because it loads the full page with JS and images.
I want to know what kind of process they might be using to load the reviews. And is there any way I can scrape the content in div.BVRRRatingSummary with BeautifulSoup?
You could try using Firefox with the Firebug addon. Open up Firebug when loading the web page, go to Net, and then click on XHR. That will show you which JSON files are being loaded. You can then fetch those files directly and work with them using a library like simplejson.
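The idea is language-agnostic: once the Net/XHR panel reveals the JSON endpoint, request it directly instead of the rendered page. A sketch of that idea in PHP (the endpoint URL is a placeholder for whatever Firebug shows you; in Python you would do the same with urllib plus simplejson):
<?php
// Fetch the reviews JSON endpoint directly - no browser, no JS, no images.
$url = 'http://www.example.com/reviews.json?page=1'; // hypothetical endpoint
$json = file_get_contents($url);
if ($json !== false) {
    $reviews = json_decode($json, true); // decode into a PHP array
    print_r($reviews);
}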

Capturing part of a web page for mobile devices

I have an Android app where I want to show a page to users inside a WebView, but the problem I am facing is that I can't use the web page as it is: the page is not responsive on mobile devices, and the user needs to scroll horizontally and vertically a lot. The web page is:
http://www.ielts.org/test_centre_search/search_results.aspx
I just need the drop-down search functionality from that page. I tried copying the HTML source locally to replicate the page, but since the HTML form's action has to be http://www.ielts.org/test_centre_search/search_results.aspx to fetch the results, when I select an option on my local version it goes to that URL and displays their version of the page.
I came across this page:
http://www.ieltsessentials.com/test_centre_search.aspx
which implements the same functionality. How can I replicate it and add it inside a local .html document?
I think the easiest way to implement this is to inject your own CSS into their HTML and hide or restyle the elements that are not responsive. That way you don't have to analyze any of their logic, since everything stays safely at the CSS level.
The only thing you have to figure out is how to re-inject your CSS into the WebView after the page is reloaded. There is a way to do that by injecting a JavaScript call into their page, as shown here: https://stackoverflow.com/a/5010864/467198
To detect that the page has reloaded, I think you can use onPageFinished.
You could use ASP to proxy the page you want to cannibalize, then use jQuery to traverse the proxied page, pull out the pieces you want, and build your new, responsive document from the items scraped from the original page.
I'm not an ASP.NET developer, so I've used PHP in my example. Here's a link to an example of how ASP.NET could be used to proxy a page:
Simplest Possible ASP .NET AJAX Proxy Page
<?php echo file_get_contents($_GET['u']); // NOTE: whitelist 'u' in production - echoing arbitrary URLs makes this an open proxy
Then, in jQuery, use $.ajax() to read the proxied page as HTML and scrape it as needed:
<script>
$(function(){
    $.ajax({
        url: 'proxy.php?u=http://www.ielts.org/test_centre_search/search_results.aspx',
        dataType: 'html',
        success: function(data){
            console.log($('#header', data));
        }
    });
});
</script>
In this example I'm just reading the contents of #header, but you could scrape whatever you need from the original page, then insert it into your target DOM or pass it to a template. To get what you're looking for, use '#Template_TestCentreSearch1_SearchTable' where I use '#header'; that will retrieve your drop-down markup.

How does the Facebook comments plugin work?

I would like to understand how the Facebook comments plugin, which is inserted into a web page using a JavaScript snippet and a div tag, works.
If I try to guess (and using Firebug), the JavaScript part loads an iframe that is inserted inside the div.
My problem is that anybody can embed a comment feed that is not necessarily related to the web page content! Suppose I am example.com; how can I be sure that example.org is not using my example.com Facebook comments?
Does Disqus work similarly?
Thanks in advance.
The Facebook URL that is loaded within the iframe has access to the referrer URL (which is the embedding page in this case) and uses it to check the domain.
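Facebook's actual implementation is not public, but a hypothetical sketch of that kind of referrer check in PHP might look like this (the domain list and messages are made up for illustration):
<?php
// Hypothetical gatekeeper for an embeddable comments iframe: only serve
// the widget when the embedding page's domain is on the allow-list.
$allowed = ['example.com', 'www.example.com'];
$referrer = $_SERVER['HTTP_REFERER'] ?? '';
$host = parse_url($referrer, PHP_URL_HOST);
if (!in_array($host, $allowed, true)) {
    http_response_code(403);
    exit('This comments widget is not enabled for your domain.');
}
// ...otherwise render the comments feed keyed to the referring page...
Note that the Referer header can be absent or forged, so a production system would combine a check like this with an app ID or signed tokens.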

Show div from another website

Is there a way to use an iframe, or some other method, to show a named div from another website?
I want to pull some data from a government website into a Google map, and when users click a point, I want the information from one of the divs on that page to be displayed.
Using jQuery, you should be able to do exactly that with the load function.
Here is a small example that loads a container with id "container" from a page called Test.html:
$('#contentDiv').load('/Test.html #container');
You can visit the jQuery documentation for more info. Note that this only works when the target page is on your own domain; the same-origin policy blocks it for another website, which is where the proxy approaches below come in.
I assume you are sure of the div's ID on the other website.
If so, use jQuery Ajax to pull the site's content into a hidden iframe on your site, then fetch the content of the div in question into a variable and use it for your purposes (parse HTML table data or whatever you need).
Discard the iframe's content afterwards so you don't keep unnecessary items in your page's DOM.
The flow looks like this (a sketch of the scraping steps follows the list):
1. Ajax call
2. In-house service scrapes the HTML from the page
3. Select the div with XPath / an SGML parser
4. Return it to the Ajax call handler
5. Replace the content of your div
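A minimal sketch of steps 2 and 3 in PHP, using DOMDocument and DOMXPath (the URL and div id are placeholder assumptions):
<?php
// scrape_div.php - hypothetical in-house service: fetch a remote page
// and return just one div's HTML to the Ajax caller.
$html = file_get_contents('http://www.example.com/data-page.html');
$doc = new DOMDocument();
@$doc->loadHTML($html);  // the @ suppresses warnings from messy real-world HTML
$xpath = new DOMXPath($doc);
$nodes = $xpath->query('//div[@id="target-div"]');
if ($nodes->length > 0) {
    // saveHTML() with a node argument serializes just that fragment
    echo $doc->saveHTML($nodes->item(0));
}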
However, there are other problems: scraping someone's site for their content without their permission is bad.
They may or may not care, but you should still seek permission, or one day you could find your web server blacklisted from their site, or worse. Especially with a government site.
You should probably figure out how to properly obtain the data you need (perhaps there's an API somewhere) and then render your own version.
You would have to make use of either JSONP or a middle-agent to retrieve the data (e.g. a PHP script using the cURL library).
JSONP functionality is included in numerous JavaScript libraries such as MooTools and jQuery. That is the method I would use.
MooTools: http://mootools.net/docs/more/Request/Request.JSONP
jQuery: http://docs.jquery.com/Release:jQuery_1.2/Ajax
Once you have retrieved the body of the page the div resides on, you could use a regex to extract the information from the specific div element (though a DOM parser, as sketched above, is usually more robust).
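For the middle-agent option, a minimal cURL sketch in PHP (the URL is a placeholder):
<?php
// fetch.php - hypothetical middle-agent: retrieve a remote page body with
// cURL so that client-side code on your own domain can process it.
$ch = curl_init('http://www.example.com/page-with-div.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any redirects
$body = curl_exec($ch);
curl_close($ch);
if ($body !== false) {
    echo $body; // hand the raw HTML back to the Ajax caller
}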
