Issue with Google PageSpeed Insights - Prioritize Visible Content - javascript

While developing a full-fledged website for a client who is slightly obsessed with Google tools and suggestions, I've come across the following issue:
No matter what I do, I cannot achieve a perfect score for the homepage of the site. All other pages are 100/100 on both mobile and desktop, but the landing page gets 91 on mobile and 97 on desktop.
I have tried all relevant steps I could find, including correctly structuring the HTML code and asynchronously loading everything else, plus lazy-loading the images.
A dummy representation of the code would be this:
    html head, meta tags, etc.
    inline blocks of CSS (to "fix" the render-blocking issues)
    the HTML content, in correct order with above-the-fold first
    deferred JS load
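Fleshed out, that structure is roughly the sketch below (the file name and contents are placeholders, not the real code):

    <!DOCTYPE html>
    <html>
    <head>
      <meta charset="utf-8">
      <title>Home</title>
      <style>/* critical above-the-fold CSS inlined here */</style>
    </head>
    <body>
      <!-- above-the-fold content first -->
      <!-- below-the-fold content, images lazy-loaded -->
      <script src="main.js" defer></script> <!-- deferred JS load -->
    </body>
    </html>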
My question is, what am I missing? What else can I do to achieve the perfect score?

For future reference and anyone facing the same problem:
I've solved this issue by reducing the file size of my homepage PHP file. This was nowhere in the guidelines or in the responses around the web, but it was the thing that bumped me up to 100/100.
In my particular instance, I removed every last bit of unnecessary CSS I had included. That alone reduced the size and removed the error.
I consider the issue erroneous, however, and will report it to Google, as "prioritize visible content" has nothing to do with "reduce your file size".

Related

How can I break a third-party iframe on my site, with code?

I am not a programmer.
Someone has scraped my site home page source code and placed their iframe over it, so that when the page is fetched it displays their content.
The iframe is not immediately apparent, but it's there, just well hidden. These sites are all hosted on hacked servers running WordPress. They still display our site's links and architecture, which are being delivered by our server. There are currently over 160 such sites built using the same method.
I believe that they have disabled JS, so that may not be an option.
I know that we can break out of an iframe if it's our site in the frame.
Is there any way, either on the server side or on the page to break their iframe and force our page to the top?
If we can break it, then our code becomes worthless and with a bit of luck they may stop using it.
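For reference, the classic frame-busting script I mean is something like the following; it only helps when our page is the one actually running inside the frame, with JavaScript enabled, which does not seem to be the case here:

    // Classic frame-buster: if this page has been framed, force it to the top.
    if (window.top !== window.self) {
        window.top.location = window.self.location.href;
    }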
Update:
Just wanted to add a few points for anyone who has any ideas.
1. They already have the code; the only things being served by us are the images and CSS files, because those are the only links they have left in the page.
2. They are showing their site by floating it with a z-index on top of everything, which is why, when you view the source, you see the site underneath and not the site floated on top in the iframe.
3. The iframe is visible if you inspect the element with Firefox; scroll to the top of the page and you can see the iframe they are using.
Based upon the additions (currently in an answer): since they have your code, there's not much you can do about breaking out of the iframe.
Depending upon your server environment you could try determining what page is requesting your images and CSS, and then display modified versions to those accessing the scraped versions. The key word for your searches is 'hotlinking.'
Possible modifications could include not serving the assets (images/CSS) at all, or returning a CSS file that simply applies display: none; to the page's elements to hide them.
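For example, here is a minimal sketch of such a referer check, assuming a Node/Express server purely for illustration (the question doesn't say what the site runs on, and example.com is a placeholder for your domain):

    const express = require('express');
    const app = express();

    // Refuse to serve images/CSS when the Referer header points at a
    // foreign domain. Empty referers are let through, since some browsers
    // and proxies strip the header for legitimate visitors.
    app.use(['/images', '/css'], (req, res, next) => {
      const referer = req.get('Referer') || '';
      if (referer && !referer.startsWith('https://example.com')) {
        return res.sendStatus(403); // or serve a modified/blank asset instead
      }
      next();
    });

    app.use(express.static('public'));
    app.listen(3000);

On Apache or Nginx the same idea is usually implemented with referer-based rewrite rules rather than application code.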
It might be a fool's errand, but contacting the hosts of the hacked servers could be worth a try; I can't honestly say it will get you very far, and it may be a waste of time for the majority of them.

How to stop render-blocking Javascript and CSS?

I have tested my page on Google PageSpeed Insights, and it keeps saying "Eliminate render-blocking JavaScript and CSS in above-the-fold content".
It says I have 17 blocking scripts and 11 blocking CSS resources.
I have tried moving all of the JS to the bottom of the page so it loads last; however, Google still says I have render-blocking JS...
Does anyone know how I can solve this?
Thank you in advance for any help.
You need to dig deeper into the Critical Rendering Path.
Simple rule: in your webpage, load only the things that are really needed to show the user the initial view; everything else can be loaded after the page load is done.
While JS is being downloaded, it stops DOM construction and hence blocks other resources from downloading. Putting JS at the bottom will help, but it will still block rendering while the script downloads. To overcome this, it is suggested to add the async attribute to your script tag, although async can lead to other issues, mainly around execution order.
The same case applies to CSS, but with the advantage that CSS will not block DOM construction.
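As a rough sketch of that pattern (file names are placeholders):

    <head>
      <style>/* inline only the critical, above-the-fold CSS */</style>
      <!-- one common trick to load the full stylesheet without blocking
           rendering: media="print" defers it, onload switches it back on -->
      <link rel="stylesheet" href="styles.css" media="print" onload="this.media='all'">
      <!-- defer: download in parallel, execute after HTML parsing -->
      <script src="app.js" defer></script>
      <!-- async: execute as soon as downloaded; order is not guaranteed -->
      <script src="analytics.js" async></script>
    </head>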
Hi, I tried it, but it slowed the website down and I'm not sure why, so I put it back; Google PageSpeed Insights is saying "Eliminate render-blocking JavaScript and CSS" again.
If anyone knows why removing the blocking CSS slowed down the website, please let me know.

Google DoubleClick for Publishers fails too quietly

I'm trying to implement DoubleClick on a client's site and having a heck of a time. Part of the problem is that when things don't work, things just don't work. Nothing is logged to the console, no alert boxes appear, no uncaught exceptions are thrown, nothing. Nevertheless, through experimentation, I've managed to get things to the point where iframe tags are being inserted in the positions where the ads should be on the page; it just seems that the iframes aren't having src attributes given to them, so they're just appearing as blank areas on the page.
It would be wonderful if someone had an answer on how to solve that exact problem, but failing that, I'd settle for a way to coax the DoubleClick script into just making a bit more noise when something goes awry. When I look at the minified/obfuscated script being loaded from Google's servers, it does look like there's plain English strings woven in there representing various error cases, so I presume there's a way to make it display those strings to me…
From my understanding, the iframes will not have src attributes... I think they are used as a form of sandboxing, so that ads cannot interfere with the parent page in any way... an iframe allows different CSS, scripts, etc. without conflicts.
If the ad units on your page are appearing blank, then it is most probably because there are no line items that match... make sure you are using the tag-building tool inside DFP... that hasn't failed me yet... also check out the debugging console; it should display any errors that you have with your page.
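If it helps, the publisher console can also be opened programmatically; this assumes the standard GPT library (gpt.js) is already on the page:

    // Open the Google Publisher Console overlay, which reports per-slot
    // status and delivery diagnostics once gpt.js has loaded.
    window.googletag = window.googletag || { cmd: [] };
    googletag.cmd.push(function () {
      googletag.openConsole();
    });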

Javascript Charts don't load until FWD then BACK

I have a problem with the JavaScript amCharts library, but the situation likely stems from a more generic JavaScript issue. Unfortunately it's behind a development door, so no direct links. But, if I can paint the picture:
1) The main index page uses the jQuery .load() function to load a page of analytics and information.
2) Once loaded, jQuery .getScript() is used to fetch the graph data.
3) Nothing appears.
The same code works when it's all together in one single static HTML page (I'm using example data just to get it running right now). It doesn't even work when I put it all in the .load()'d file together.
I've taken it further up the loading tree, so it's all in the main index page, with no .load() bits. Still nothing.
However, if I go forward in the browser, then back, the graph is there... ready to go. Any ideas what might be leading to this behaviour?
Any thoughts or directions of investigation are very, very welcome.
I've overcome this by using an iframe to load the graphs; not ideal, but a workable solution.
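For anyone hitting the same symptom: a common cause is initialising the chart before both the injected markup and the charting script are ready. A rough sketch of sequencing the loads; the selector, file names, div id, and config below are placeholders:

    // Wait for the markup, then for the library, then build the chart.
    $('#main').load('analytics.html', function () {
      $.getScript('/js/amcharts.js', function () {
        // AmCharts.makeChart is the amCharts v3 entry point.
        AmCharts.makeChart('chartdiv', {
          type: 'serial',
          dataProvider: [{ category: 'A', value: 1 }],
          categoryField: 'category',
          graphs: [{ type: 'column', valueField: 'value' }]
        });
      });
    });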

Page injected via jQuery/AJAX does not show properly in any Chromium browser

The CSS related to the injected page is apparently not being loaded by Chromium. However, it works well in IE8, Opera 10.x, and Firefox 3.6.x.
Hence the question: is it my stupidity in the HTML coding, a Chromium bug, or a jQuery bug? Those are the explanations I could think of.
This is the page in question, with all non-essential JS eliminated: http://logistik-experte.gmxhome.de/test.html - navigate to the resume and see the difference. It is basically driving me nuts, as I'm missing the point somewhere, so any sound advice/help would be highly appreciated.
Cheers
I agree with Buggabill: works for me in Chrome 5. (At least on the server; there may be issues with loading files from a local filesystem.)
However there are problems with your approach. By having page content loaded by script only, you have made your page inaccessible to non-JavaScript users, which includes all search engines. Also you can't use the back button and the pages are unbookmarkable, un-open-in-new-tab-able, and so on.
Basically you've reinvented all the problems of <frameset>, the reasons why no-one uses frames any more. You shouldn't really deploy this kind of solution until you are familiar with the ways accessibility and usability can be served. At the very least, you need to point the navigation links to the real pages containing their content. Then consider allowing hash-based navigation, so the dynamically-loaded pages have a unique URL which can be navigated between, and which will re-load the selected page at load time when the URL is first entered.
Also if you are loading content into the page you should take care to load only the content you want, for example using load('portfolio.html #somewrapperdiv'). Otherwise you are inserting the complete HTML, including <!DOCTYPE> and <head> and all that, which clearly makes no sense.
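Putting those two suggestions together, a rough sketch (the container selector, file names, and wrapper id are placeholders):

    // Load only the wrapper div from the page named in the hash, so the
    // views are bookmarkable and the back button works.
    function showPage() {
      // In real code, validate `page` against a known list of pages.
      var page = location.hash.replace('#', '') || 'home';
      $('#content').load(page + '.html #somewrapperdiv');
    }
    $(window).on('hashchange', showPage);
    $(showPage); // also run once on the initial page load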
To be honest, as it currently is, I don't see the point of the dynamic loading. You have spent a bunch of time implementing an unusual navigation scheme with many disadvantages over simple separate navigable pages, but no obvious advantage.
