When Google crawls a web page, does Googlebot crawl:
content with code like style="display:none"
content with code like style="display:block"
I'm asking because I have a website of FAQs. For users, I want answers to be displayed only when they click the "answer/solution" link. For Googlebot, I want the solution section to be crawled as well; otherwise my page's content becomes too thin.
Yes, Google will see content whether it is display: none or display: block. Your FAQ section will be seen by Google's bots.
See these articles:
Webmaster Guidelines: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
Hidden text and links: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66353
Hiding text or links in your content can cause your site to be perceived as untrustworthy since it presents information to search engines differently than to visitors. ... If you do find hidden text or links on your site, either remove them or, if they are relevant for your site's visitors, make them easily viewable.
It is debatable whether Google crawls hidden elements; you'll find "experts" who argue one way or the other, but most of it is pure conjecture. What I like to do in these situations is apply the display: none via JavaScript / jQuery on $(document).ready(). That way the user gets the experience you are looking for, while Google indexes the page like so:
<div class="question">Does Google see this?</div>
<div class="answer">Yes is JS is used!</div>
$(document).ready(function(){
$('.answer').hide();
});
It is important to note that this method is also debatable, as some have indicated that Google has started executing JS as part of the crawl. That being said, I've had good results using this technique.
I hope this helps!
Spent the past couple of days pulling my hair out and on the road to a brain aneurysm over this.
I've created an HTML + CSS template for eBay which looks great, except that the CSS isn't rendered until you refresh the page after you initially open it.
To keep things simple, I'll post the exact test code I'm using for eBay:
After it's posted and the page is initially opened, you see no CSS; then a refresh makes the CSS render.
I have a professional page for a product I'm selling, and this obviously won't fly.
This completely baffles me. I've tried everything: all browsers, hosting the CSS on another site and pointing to it. Nothing works.
Really, anything that can in some way help would be deeply appreciated. I'll keep checking on this post every few hours.
Try using inline styles instead.
<p style="border: 1px groove black;">Thingy</p>
There is a known issue I have come across that causes problems:
When you first load an un-cached item, eBay loads the item's code directly within the eBay page. Upon refresh, it loads the item within an iframe (the normal behaviour).
The first load causes issues because it carries CSS from eBay's main page style sheets.
Try to be more specific: include a wrapper div and then style with .wrapper p {}, as shown in the sketch below. Also, link to an external style sheet to make your life easier when updating!
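For instance, here's a minimal sketch of that idea (the .wrapper class name is just an example):

<!-- Hypothetical wrapper: scoping selectors to a container makes them
     more specific, so eBay's own page styles are less likely to override them. -->
<div class="wrapper">
    <p>Thingy</p>
</div>

<style>
    .wrapper p { border: 1px groove black; }
</style>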
It's down to the format/wording used in the inline style. Although both versions work, that doesn't explain why one needs a refresh while the other does not.
You said:
it does not have the CSS rendered until you refresh the page after you initially open it.
This is maybe related to an eBay page-load behaviour. When you come from an external site (like pasting the eBay link into your browser), eBay loads the page and adds its own CSS to your HTML tags. When you reload the page, eBay jumps into iframe mode and your style elements are used.
Check topic #6 here: http://www.fix-css.com/2014/06/ebay-templates-coding-guide/
Try adding a border width: p { border: 1px groove; }
I built a JavaScript menu list from an XML file and have used it as the navigation menu in over 20 pages. I used jQuery's Ajax functionality to implement this; the reason was that if there is an update to the menu, I only have to edit the XML file for the changes to be reflected everywhere. I only realized later that this technique is not SEO friendly, since search engines don't index dynamic JavaScript content. That said, I have provided a fallback for users who have disabled their JavaScript by linking the XML file to an object tag inside a noscript tag:
<noscript>
    <div>
        <object data="menu/Menu.xml" type="all"></object>
    </div>
</noscript>
I'm not too sure if this is SEO friendly.
So my question really is: how does one go about creating a menu list that is user friendly and can be updated easily? If questions similar to mine have been answered before, please point me to the links. I have done some searching and was not happy with the results I found, but I'm still looking for answers.
JavaScript is not SEO friendly. In any case, you should be using a server-side mechanism such as PHP includes or Apache Server Side Includes to do this.
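For example, a minimal sketch using Apache Server Side Includes (this assumes SSI is enabled on your server and the page is served as .shtml; the file path is hypothetical):

<!-- menu.html holds the plain <ul> navigation markup; the server stitches
     it into every page before sending it, so crawlers see ordinary HTML. -->
<body>
    <nav>
        <!--#include virtual="/includes/menu.html" -->
    </nav>
    <!-- rest of the page -->
</body>

Updating the menu is then still a single-file edit, but the result is indexable.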
I was browsing around the web and I saw something I've never seen before.
on this site:
http://blogof.francescomugnai.com/2009/04/mega-roundup-of-geektool-scripts-inspiration-gallery/
When you navigate down the page, the images only load when they reach the visible portion of the browser window. I have never seen this before and was wondering if anyone else has, and how exactly one would do it.
I'm guessing this is some sort of WordPress plugin (that's what he's using), but I'm not sure.
Is it JavaScript? Are the images actually loading on page load and just becoming visible later for a "snazzy" effect, or is this actually useful for quicker page load times?
"wp-content/plugins/jquery-image-lazy-loading"
Lazy Load is a jQuery plugin written in JavaScript. It delays the loading of images in (long) web pages. Images outside of the viewport (the visible part of the web page) won't be loaded before the user scrolls to them. This is the opposite of image preloading.
Using lazy load on long web pages containing many large images makes the page load faster. The browser will be in a ready state after loading the visible images. In some cases it can also help to reduce server load.
http://www.appelsiini.net/projects/lazyload
So it seems it goes through every image specified, or every image inside a given context element, replaces the src with a placeholder GIF before the images fully load, saves the original URI, and then, when an image becomes "visible", swaps the placeholder for the real image.
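As a rough sketch of that idea in plain JavaScript (the data-src attribute and placeholder markup are assumptions for illustration, not the plugin's actual internals):

// Assumed markup: <img src="grey.gif" data-src="real-photo.jpg">
// IntersectionObserver stands in for the plugin's scroll-event handling.
document.addEventListener('DOMContentLoaded', function () {
    var observer = new IntersectionObserver(function (entries) {
        entries.forEach(function (entry) {
            if (entry.isIntersecting) {
                var img = entry.target;
                img.src = img.getAttribute('data-src'); // swap in the real image
                observer.unobserve(img);                // each image loads once
            }
        });
    });
    document.querySelectorAll('img[data-src]').forEach(function (img) {
        observer.observe(img);
    });
});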
LazyLoad is no longer available, according to its website. Apparently the code no longer works in new browsers and the author doesn't have time to update it.
The "appear" plug-in is working well for me.
http://plugins.jquery.com/appear/
It allows you to specify a callback function for an element. The callback function is called when the element appears into view. From the site:
$('#foo').appear(function() {
$(this).text('Hello world');
});
If you look at the source of the page you referenced, it contains this bit of code:
jQuery(document).ready(function($){
    jQuery(".SC img").lazyload({
        effect: "fadeIn",
        placeholder: "http://blogof.francescomugnai.com/wp-content/plugins/jquery-image-lazy-loading/images/grey.gif"
    });
});
I suspect that's how they're accomplishing the effect. It uses the jQuery LazyLoad plugin, which can be found here:
http://www.appelsiini.net/projects/lazyload
As Sanjay pointed out, the jQuery LazyLoad plugin from Appelsiini no longer works. Here is another jQuery plugin that I found, just another option in addition to jQuery Appear.
http://plugins.jquery.com/project/LazyLoadOnScroll
http://ivorycity.com/blog/2011/04/19/jquery-lazy-loader-load-html-and-images-on-scroll/
I have been using a noscript tag to show a warning when users have JavaScript disabled or are using script-blocking plugins like NoScript. The website will not function properly if JavaScript is disabled, and users may not figure out why it is not working without the warning.
After the latest Google algorithm shuffle, I have seen daily traffic drop to about a third of what it was in the previous months. I have also seen pages that were ranking #1 or #2 in the SERPs drop out of the results. After doing some investigating in Webmaster Tools, I noticed that "JavaScript" is listed as #16 in the keywords section. This makes no sense, because the site has nothing to do with JavaScript and the only place that word appears is in the text between the noscript tags.
It seems that Google is now including and indexing the content between the noscript tags. I don't believe that this was happening before. The warning is three sentences. I'd imagine that having the same three sentences appearing at the top of every single page on the site could have a damaging effect on the SEO.
Do you think this could be causing a problem with SEO? And, is there any other method to provide a warning to users who have JavaScript disabled in a way that won't be indexed or read by search engines?
Put the <noscript> content at the end of your HTML, and then use CSS to position it at the top of the browser window. Google will no longer consider it important.
Stack Overflow itself uses this technique - do a View Source on this page and you'll see a "works best with JavaScript" warning near the end of the HTML, which appears at the top of the page when you switch off JavaScript.
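A minimal sketch of that layout (the class name is a placeholder):

<body>
    <!-- ...all the real page content first... -->

    <!-- Last in the source, so search engines treat it as low priority -->
    <noscript>
        <div class="js-warning">This site works best with JavaScript enabled.</div>
    </noscript>
</body>

And in the style sheet, pin it to the top of the window:

.js-warning { position: fixed; top: 0; left: 0; width: 100%; }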
<noscript> is not meant for meaningless warnings like:
<noscript>
Oh, no! You don't have JavaScript enabled! If you don't enable JS, you're doomed. [Long explanation about how to enable JS in every browser ever made]
</noscript>
It's meant for you to provide as much content as you can, along with a polite mention that enabling JS will provide access to certain extra features. You'll find that basically every popular site follows this guideline.
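For example, a sketch of that approach (the wording is illustrative):

<noscript>
    <!-- Serve the actual content, plus a polite note about what JS adds. -->
    <p>All answers are shown expanded below. Enabling JavaScript adds
       collapsible sections and live search.</p>
</noscript>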
I don't think using <noscript> is a good idea. I've heard that it is ineffective when the client is behind a JavaScript-blocking firewall: if the client's browser has JavaScript enabled, the <noscript> tag won't activate, because as far as the browser is concerned, JavaScript is fully operable within the document.
A better method IMO, is to have all would-be 'noscript' content hidden by JavaScript.
Here's a very basic example:
...
<body>
    <script>
        // Runs only when JS is enabled, flagging the body so the CSS
        // rule below can hide the fallback content.
        document.body.className += ' js-enabled';
    </script>
    <div id="noscript">
        Welcome... here's some content...
    </div>
And within your style sheet:
body.js-enabled #noscript { display: none; }
More info:
Replacing <noscript> with accessible, unobtrusive DOM/JavaScript
Reasons to avoid NOSCRIPT
Somebody on another forum mentioned using an image for the warning. The way I see it, this would have three benefits:
There wouldn't be any irrelevant text for search engines to index.
The code to display a single image is less bulky than a text warning (which gets loaded on every page).
Tracking could be implemented to determine how many times the image is called, to give an idea of how many visitors have JavaScript disabled or blocked.
If you combine this with something like the non-noscript technique mentioned by J-P, it seems to be the best possible solution.
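A minimal sketch of the image approach (the image path is hypothetical; hits on that file in the server's access log double as the tracking mentioned above):

<noscript>
    <!-- One image request instead of repeated warning text; each request
         for no-js-warning.png is a visitor without JavaScript. -->
    <img src="/images/no-js-warning.png"
         alt="This site requires JavaScript to function properly.">
</noscript>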
Just wanted to post an interesting tidbit related to this. For a site of mine I ended up doing something similar to what Stack Overflow uses, but with the addition of a "find out more" link, as my users are not as technical as this site's.
The interesting part is that, following the advice of people above, my solution ditched the noscript tag, instead opting to hide the message divs with JavaScript. But I found that if Firefox is waiting for its master password, this hiding of the message is interrupted, so I think I will go back to noscript.
If you choose a solution based on replacing the div content (if js is enabled, then the div content gets updated) rather than using a noscript tag, be careful about how google views this practice:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66353
I'm not sure Google will consider it deceptive, but it's something to consider and research further. Here's another Stack Overflow post about this: noscript google snapshot, the safe way