Facebook like and share button slowing website - javascript

I have setup a dynamic competition page where the query string determines what content you see.
For example, http://nectarfinance.com.au/dc=korinadrogan will show Korina's content, while no query string will show generic head office content.
The site, as it is, is loading slowly, and I know it is because of the dynamic Facebook 'like and share' scripts on the page.
I was wondering if there is any way to combine these scripts into one, any way to speed up their loading, or any way to reduce their size?
I'm not sure how to work around it as the files are externally hosted by Facebook.
I'll post the GTMetrix report in the answer below, as I can't post two links.
Thanks for your help
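One common mitigation, not from the original thread and only a sketch: the Facebook JavaScript SDK only needs to be included once per page, and it can be loaded with async/defer so it does not block rendering; each button is then plain markup that the single SDK instance renders. The SDK URL, version string and data attributes below follow Facebook's commonly documented snippet and may differ from what the page currently uses:

    <!-- Load the Facebook SDK once, asynchronously, so it does not block rendering. -->
    <div id="fb-root"></div>
    <script async defer
            src="https://connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.5"></script>

    <!-- Each button is plain markup; the single SDK instance renders all of them. -->
    <div class="fb-like"
         data-href="http://nectarfinance.com.au/"
         data-layout="button_count"
         data-share="true"></div>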

Related

HTML template including

My question is about the most efficient way of including HTML templates in multiple HTML pages. I'm currently developing a website with a lot of pages. Simple things like the header and footer will be the same on every page. I wonder how we can implement these templates efficiently so that Googlebot sees each page with all of its content, including the templates.
Most CMSs use the PHP functions include() and include_once(), or their variants require() and require_once(), but there are different ways of using them.
Theoretically, you can nest multiple sub-pages this way: a main page calling a header, content and footer; the content page itself being made of further includes, e.g. a home-slider page or a corporate-overview page; the corporate overview in turn being made of one page per team member, and so on indefinitely.
It's up to you to find the right nesting level to achieve your goals while staying simple.
In the most common cases, you can just write your page content (home, corporate overview, team, contact) and include_once the header at the top and the footer at the bottom.
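A minimal sketch of that classic layout, with hypothetical file names (header.php, footer.php):

    <?php
    // about.php -- page-specific content between a shared header and footer.
    include_once 'header.php';   // shared <head>, top navigation, etc.
    ?>
    <main>
        <h1>About us</h1>
        <p>Page-specific content goes here.</p>
    </main>
    <?php
    include_once 'footer.php';   // shared footer and closing tags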
As for Googlebot: like users, it won't see the sub-pages you are including this way as separate pages, so there is little chance of your lone footer being indexed on its own.
To be sure, use robots.txt!

How can I break a third party iframe on my site, with code?

I am not a programmer.
Someone has scraped my site's home page source code and placed their iframe over it, so that when the page is fetched it displays their content.
The iframe is not immediately apparent, but it's there, just well hidden. These sites are all hosted on hacked servers running WordPress. They still display our site links and architecture, which are being delivered by our server. There are currently over 160 such sites built using the same method.
I believe that they have disabled JS, so that may not be an option.
I know that we can break out of an iframe if it's our site in the frame.
Is there any way, either on the server side or on the page to break their iframe and force our page to the top?
If we can break it, then our code becomes worthless and with a bit of luck they may stop using it.
Update:
Just wanted to add a few points for anyone who has any ideas.
1. They already have the code; the only things being served from our server are the images and CSS files, because those are the only links they have left in the page.
2. They are showing their site by floating it with a z-index on top of everything, which is why, when you view the source, you see the site described above and not the site that is floated in the iframe.
3. The iframe is visible if you inspect the element with Firefox; scroll to the top of the page and you can see the iframe they are using.
Based upon the additions (currently in an answer): since they have your code, there's not much you can do about breaking out of the iframe.
Depending upon your server environment you could try determining what page is requesting your images and CSS, and then display modified versions to those accessing the scraped versions. The key word for your searches is 'hotlinking.'
Possible modifications could include not serving the assets (images/CSS) at all, or returning a CSS file that simply applies display:none; to the HTML elements you want hidden.
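As a rough sketch of that idea, assuming Apache with mod_rewrite enabled, that your domain is example.com (substitute your own), and a hypothetical hide-everything.css file, an .htaccess along these lines refuses images to other referrers and swaps in a hiding stylesheet:

    # .htaccess sketch: treat requests whose Referer is another site as hotlinking.
    RewriteEngine On

    # Refuse images when a Referer is present and it is not our own domain.
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteRule \.(jpe?g|png|gif)$ - [F,NC,L]

    # Hand hotlinkers a stylesheet that hides everything instead of the real CSS.
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteCond %{REQUEST_URI} !^/hide-everything\.css$
    RewriteRule \.css$ /hide-everything.css [NC,L]

Note that an empty Referer is allowed through, since many browsers and proxies strip it.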
It might be a fool's errand, but trying to contact the hosts of the hacked servers could also be worthwhile; I can't honestly say it will get you very far, and it may be a waste of time for the majority of them.

Trouble in scraping from a page

Referring to one of my previous questions, I have to scrape the reviews (all reviews) of a hotel, for example this hotel.
Using BeautifulSoup, what I have done is first get all the review page links from the pagination within the div having the class BVRRPager BVRRPageBasedPager, and then scrape the reviews from all the pages.
The problem with BeautifulSoup is that the content in div.BVRRRatingSummary does not come along (try loading that page with JS disabled).
I have scraped the reviews using Selenium, but my client does not want to use Selenium because it loads the full page with JS and images.
I want to know what kind of process they might be using to load the reviews, and whether there is any way I can scrape the content in div.BVRRRatingSummary with BeautifulSoup.
You could try using Firefox with the Firebug add-on. Open up Firebug when loading the web page, go to Net, and then click on XHR. That will show you which JSON files are being loaded. You can then try to fetch those files directly and work with them using a library like simplejson.
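If that works, the Python side is straightforward. A sketch, assuming the requests and simplejson libraries and a made-up endpoint URL (replace it with whatever the XHR panel actually shows):

    # Fetch the JSON the review widget loads via XHR, instead of rendering the page.
    import requests            # pip install requests
    import simplejson as json  # or the standard-library json module

    # Placeholder URL -- use the real request you see under Net > XHR in Firebug.
    xhr_url = "https://example.com/reviews.json?productId=12345&page=1"

    response = requests.get(xhr_url, headers={"User-Agent": "Mozilla/5.0"})
    response.raise_for_status()

    data = json.loads(response.text)
    # Inspect the structure first (e.g. print(data.keys())), then pull out the reviews.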

jQuery + AJAX + Help Hiding This Content

First of all: I hope the following question is not too generic.
I have a small problem, I can't think of a good solution, and I was hoping someone here is able to help me.
This is my situation:
I am using AJAX to dynamically load pages. My main site is index.php and once I click on a navigation link, the AJAX script replaces the content of index.php with new content and adds a hash tag to the URL. For instance:
I click on the link to about.php, the script adds #about.php to the URL and loads content from about.php into index.php. It works great :) However, there is a small issue that I would like to resolve:
Let's say we navigate to index.php#about.php directly: this means the content of index.php is visible for 2-3 seconds and then gets replaced with the content from about.php. I would like to avoid that.
I came up with a few ideas, but they are all not really great:
1) Hide content -> then make the AJAX call -> once the AJAX call completes, show the content again
Downside: The content is still visible for a second.
2) Hide content with CSS and show it after AJAX call
Downside: This would work perfectly, but users without JavaScript (and Googlebot) will only see an empty page.
3) Use an empty index.php and put the content of it in main.php and automatically load main.php via AJAX on page load.
Downside: Would work too, but again, users without JS and Googlebot will just see an empty page when they visit index.php.
That's all I can think of, and none of the three solutions is good, because I am worried the SEO value will dramatically decrease if I have an empty index.php (I could accept that users without JS get nothing to see).
P.S. I read somewhere that if you put display:none in an external CSS file and block that file with robots.txt, Googlebot won't know the difference, but I am worried that may not be the case. Has anyone got some experience with this?
Edit: I guess my whole question comes down to this:
Do you think hiding the whole content of index.php with CSS (and then showing it with JS) will be a huge no-go for SEO, or will it be okay with Googlebot (after all, the content is still in the source, just not visible to the user)?
If you used query strings instead of the hash, you could have index.php load the correct content at the server level.
A plugin like history.js can help you push URLs to the browser so that you still get your AJAX browsing.
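A rough sketch of that combination, using the native History API directly rather than the history.js wrapper (the selectors, URLs and the #content container are hypothetical):

    // Load pages over AJAX but keep real, server-resolvable URLs in the address bar.
    $('a.nav-link').on('click', function (e) {
        e.preventDefault();
        var url = this.href;  // e.g. index.php?page=about, also handled server-side
        $('#content').load(url + ' #content > *', function () {
            history.pushState({ page: url }, '', url);  // update the address bar
        });
    });

    // Restore the right content when the user uses back/forward.
    window.addEventListener('popstate', function (e) {
        if (e.state && e.state.page) {
            $('#content').load(e.state.page + ' #content > *');
        }
    });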
Wow, where to start... First of all, the page 'blink', as I'll call it, is 2-3 seconds for you, but it is completely dependent on the user's computer, how fast it executes the JavaScript, and how fast the AJAX call returns, so you could have a much larger delay.
Second, I wouldn't worry about Googlebot seeing any of the AJAX content. While it's true Googlebot does try to fiddle with some JavaScript, it won't make the AJAX call like a normal browser would. I'd be very surprised if Googlebot ever saw any of your AJAX-loaded data.
Googlebot does a fantastic job of figuring out what content is delivered via html/css to a user when they visit your page. It also figures out if something is displayed or not and does a good job of deciding if that content is just stuffing or is something that really matters.
You're worried about what someone without JavaScript will see, yet the entirety of navigating your site is based on JavaScript. These two concerns don't seem to reconcile.
You've got PHP available. My suggestion is to forgo the AJAX stuff you're trying to do and do it in PHP. You can just as easily script the same behavior in PHP as you can in AJAX.
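A sketch of that PHP approach, with hypothetical page names and file layout, so that index.php?page=about serves the full page without any client-side swap:

    <?php
    // index.php -- pick the content server-side from the query string.
    $page    = isset($_GET['page']) ? $_GET['page'] : 'home';
    $allowed = array('home', 'about', 'contact');   // whitelist, never include arbitrary input
    if (!in_array($page, $allowed)) {
        $page = 'home';
    }
    include 'header.php';
    include 'pages/' . $page . '.php';   // e.g. pages/about.php
    include 'footer.php';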
SEO NOTES:
If you're looking for solid SEO results, I suggest making the static (non-JavaScript) version of the page as SEO-friendly as possible. I like to 'pick the low-hanging fruit', like making sure the page has one and only one H1 and that it contains the most important keywords. SEOmoz is one of the best sites I've found for SEO information.

Loading external content with jquery or iframe?

Hiho,
There's an existing website that I need to include into another site. It lives at
a.mysite.com
and I need to fetch content from this site into my
www.mysite.com
website...
As I need to access the content of the iframe, the same-origin policy is a problem here.
What I did was configure mod_proxy on Apache to proxy-pass all requests from
www.mysite.com/a
to
a.mysite.com
This works fine... but my problem is that I'm not sure what the best way would be to include those pages.
1. Idea
As the content of the iframe is a full-featured site with a top navigation, left navigation, etc., I would need to change the page template to show only the content box in order to integrate that page in the iframe.
2. Idea
I could just load the DIV where the content lies via jQuery.load() and integrate it into my site.
What is the best way to accomplish such a task? How bad are both ideas from an SEO point of view?
Unless it involves significant rework, the best solution is to combine the two into a single HTML page on the server side (using server-side includes).
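A sketch of what that could look like with Apache server-side includes (this assumes mod_include is enabled and the page is processed as SSI; whether the included path goes through the mod_proxy mapping from the question depends on your setup, and the path itself is only an example):

    <!-- Main page on www.mysite.com, assembled on the server. -->
    <div id="external-content">
        <!--#include virtual="/a/some-page.html" -->
    </div>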
Advantages:
No problems with SEO, as it's delivered as a single page. Content in iframes and content loaded via AJAX (with an associated link in the HTML) is traversed, but only the link, not the content itself, is associated with the main page. See: http://www.straightupsearch.com/search-marketing/best-practices/seo_iframes_a_g/
Faster page load - either of your suggestions will cause the main page to be loaded first before the other content is loaded.
No reliance on JavaScript - your second method will fail completely if JavaScript is not supported or turned on.
Include all JS and CSS only once - your first method will require these to be duplicated in the <head> of each page. This is more of a long term advantage if you wish to achieve full integration of site "a". However, it can be a disadvantage initially, see below.
Disadvantage:
May cause conflicts with scripts and CSS between the two pages. However, this same problem exists with your second method.
If you must choose between either of the two options you proposed, I would not select the second as others have suggested. Significant amounts of static content should never be loaded via Ajax, and in this scenario gives you no additional benefits. At least iFrames guarantee no JS and CSS conflicts.
Use the 2nd approach (jQuery.load), and if you're working with HTML5, then for browsers that support the History API you can change the URL to match whatever content is in that div.
Check out https://github.com/blog/760-the-tree-slider for an example of how github did it for their tree slider.
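For reference, a minimal sketch of the jQuery.load() idea, pulling only the content DIV from the proxied path (the selector and path are hypothetical):

    // Fetch the proxied page and insert only its #content element into this page.
    $('#placeholder').load('/a/some-page.html #content', function (response, status) {
        if (status === 'error') {
            // Fall back gracefully if the fragment cannot be fetched.
            $('#placeholder').text('Sorry, this content could not be loaded.');
        }
    });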
EDIT:
I am not sure how using an iframe whose src points to your own domain affects search rankings, but at best it's a grey area. I would assume that some PageRank might trickle from the parent to the child, but I have no clue how it would work if, for instance, a blogger linked to your page with the iframe that pointed to another page. This would be a pretty good question to ask at the Webmaster Help Forum.
Always say no to iframes. jQuery+Ajax all the way.
