JavaScript isn't good for SEO, is it?

If I decided to use some JavaScript in my website, like
$('#body').load(URL);
or
$.get(URL, {param:value}, function(){ ... });
or
document.title = 'TEXT';
Is this good for SEO? Or is it recommended that I use pure PHP to generate the data on the page for SEO purposes?

The question of whether JavaScript is good or bad for SEO misses the point. We should pretty much assume that any content which is only available through JavaScript will not be crawled by the search engines. Google at least claims to be able to crawl some JavaScript-only content, but is fairly tight-lipped about what exactly it can crawl. Other search engines probably don't crawl it, and it's certainly the case that not all do. So assume it doesn't get crawled.
That doesn't mean it's bad for SEO.
If the content would contribute to your SEO, then hiding it behind JavaScript is bad for SEO. If the content is neutral to SEO, then JavaScript is neutral for SEO. So the answer to your question really depends on the nature of your content. If the content is part of your SEO campaign, then stick with server-side HTML generation, be it PHP or some other method. Otherwise the question of SEO has no bearing on the decision to use JavaScript or not. Accessibility is another thing to take into account: JavaScript-only content is terrible for that.
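For example, a minimal sketch of keeping the SEO-relevant copy server-rendered while still using JavaScript (assuming jQuery is loaded; the /latest.html URL and the markup are hypothetical):
<div id="body">
  <p>Server-rendered content that crawlers can read without JavaScript.</p>
</div>
<script>
  // progressive enhancement: refresh this block client-side for users only
  $('#body').load('/latest.html');
</script>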

The larger search engines can and do render limited amounts of JavaScript. However, for SEO purposes your best bet is rendering the content via HTML rather than JavaScript. A good rule of thumb is to use HTML for content and for expressing content structure (e.g. paragraph text = p, lists = ul/ol, headings = h1/h2/h3, etc.), CSS for presentation, and JavaScript for client-side programming. With that being said, always ensure a good user experience first. If you can do the above while providing a great user experience, great! If you can't, users come first. It's likely you can keep both users and bots happy 95% of the time if you take the time to do so.
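A minimal sketch of that division of labor (file names are hypothetical):
<h1>Product overview</h1>            <!-- heading: indexable content -->
<p>Key copy lives in plain HTML.</p> <!-- paragraph text -->
<ul>
  <li>Feature one</li>
  <li>Feature two</li>
</ul>
<link rel="stylesheet" href="site.css"> <!-- presentation in CSS -->
<script src="site.js"></script>         <!-- behavior in JS -->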
Further reading (sorry, I can only post one link as a new user):
Matt Cutts Interview (Check out #26 on Google Javascript Rendering)
A spider's view of Web 2.0

I think first you should consider what SEO means. It means "Search Engine Optimization" ... how does a search engine get data in the first place for it to be optimized?
It does a GET on the page and whatever data is returned in the GET is processed. No JS engine. No POST data. So you should be optimizing for whatever data is returned on a GET.
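You can see for yourself what such a crawler receives; a minimal Node.js sketch (example.com is a placeholder):
const https = require('https');
https.get('https://example.com/', (res) => {
  let html = '';
  res.on('data', (chunk) => { html += chunk; });
  // no JS engine runs and no DOM is built: this raw markup is all a simple crawler sees
  res.on('end', () => console.log(html));
});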
Additionally, you tagged this with PHP, but the question has nothing to do with PHP.
Have you seen any of the questions on this list?
https://stackoverflow.com/search?q=javascript+seo

No sir, Google does not parse Flash and JavaScript properly, so it may not crawl areas built with JavaScript or Flash content. I suggest you keep your website simple, but if flashy/JavaScript content is necessary, then you should keep a text-based backup.

The first thing you should be asking is not what is good for SEO, but what is good for users. For users, loading data with JavaScript will give them an interactive page, where they can start seeing the page immediately while it is still loading and where the page can update without having to reload it.
From Google's Webmaster Guidelines and its article on cloaking, you should not assume that crawlers can understand JavaScript. This does not mean that you should not use JavaScript on your website, but rather that you should provide the textual equivalent in noscript tags, both for users with JavaScript disabled and for crawlers. Bear in mind that the content of these noscript tags should be roughly equivalent to what is shown with JavaScript enabled; showing different content to users and to search engines is called "cloaking" and is frowned upon, to say the least.
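For example, a minimal sketch of such a fallback (assuming jQuery; the /news.html URL is hypothetical):
<div id="news"></div>
<script>
  // content loaded for JS-capable browsers
  $('#news').load('/news.html');
</script>
<noscript>
  <!-- roughly the same content the script would have loaded -->
  <p>Latest news: ...</p>
</noscript>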

Google doesn't (yet) execute a page's Javascript (JS). So if your JS replaces/creates content on a page then the content would normally be invisible to the crawlers (not good).
But the Googlers have implemented a URL hack that enables your server to create pages (from the server, not from JS) with all the different variants of your JS page's content.
This solves the SEO problem of AJAX-powered pages, at least for Google searches...
See Crawlable AJAX.
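In outline, that scheme (since deprecated by Google) maps hashbang URLs to server-renderable ones; example.com is a placeholder:
// URL the user sees:       http://example.com/#!page=about
// URL Googlebot requests:  http://example.com/?_escaped_fragment_=page=about
// The server answers the second form with a static HTML snapshot of the page.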

JavaScript, or any scripts for that matter, should never be used to house your site's content, ever! The entire web is driven by HTML and CSS, and in rare cases XML languages; everything else is a headache when it comes to SEO. Ask yourself this question: what exactly is SEO, and what is it that search engines are indexing? JavaScript and other programming/scripting languages are not content standards as defined by the W3C, which means they are essentially worthless when it comes to indexing content. On the other hand, HTML, CSS, and XML are real standards developed for the web! It's OK to use scripts to add additional functionality to your pages, embed apps like social networking plugins, etc., but you should never use them to hold your website's HTML, CSS, or actual content, ever, for any reason. Here's a link to a good article that will explain why you should be using HTML and CSS and not a million scripts: optimizing webpages using proper html markup.
Scripts cause other problems besides code that is hard for search engines to decipher. For one, they are harder for browsers to process, causing pages to load much slower than "static" pages made with HTML and CSS would. Pages made with PHP tend to create "dynamic" URLs that users and search engines cannot read. This is why Google recommends that people who use JSP or PHP for their webpages include a sitemap; otherwise your links will never be found and might as well not exist.
Stick to the conventions! Let's face it, we have standards for a reason. If every electronic component in your home had a different type of plug that required a special socket, and all those devices had differing voltage and amperage requirements, what would happen? You would essentially burn down your house! And you'd be spending 5 hours a day at the hardware store looking for special adapters to fit your wall sockets.
If you plan on designing a website, use scripts for embedding apps or connecting with a database only, and use HTML and CSS to build "static" webpages. Also, use text links, as they are both human- and search-engine-readable, and easy to index and make sense of. Never use scripts for your links. Programming and scripting can be fun, but not on the internet.
Search engines index HTML, CSS, and content (multimedia, graphics, videos, text, that's it!); everything else is pointless and annoying to both users and search engines alike. For best results use XML and design a custom language.

Google can crawl, index and rank javascript generated content.
But... it uses an old Chrome version (42) with an old JavaScript rendering engine.
The consequence is that your JavaScript code needs to work in older browsers and older Chrome versions (older than 42). So no fancy ES6 features: you need polyfills or a transpiler such as Babel.
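For example, a minimal sketch of a Babel setup (assuming @babel/preset-env is installed; the target mirrors that old Chrome version):
// .babelrc
// { "presets": [["@babel/preset-env", { "targets": "chrome 42" }]] }

// source (ES6):
const double = (n) => n * 2;
console.log([1, 2, 3].map(double));

// roughly what Babel emits (ES5):
// var double = function (n) { return n * 2; };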
Although you can do a lot with JavaScript (like click events or injecting your mobile menu), it's still recommended to use a plain a-href link instead of a button with a JavaScript event whose handler navigates to a new page.
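A minimal sketch of the difference (the /contact URL is hypothetical):
<!-- crawlable: the href gives Googlebot a URL to discover and follow -->
<a href="/contact">Contact</a>
<!-- not crawlable: there is no URL for the crawler to extract -->
<button onclick="window.location.href = '/contact';">Contact</button>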
You can check the mobile-friendly testing tool from Google: https://search.google.com/test/mobile-friendly and check the errors/warnings/logs. If the rendered output looks as intended, Google will see your content.
In Search Console you can also ask to have the page indexed. Sometimes the JavaScript crawler comes first, sometimes the 'classic' crawler.
Double-check it a few days afterward by googling a sentence or paragraph from your page.
There's no single answer as to whether it's better or not. Content is content, and Google should rank your website, SPA, PWA, AMP site, PDF document, online doc, wiki page, and so on based on its content, not on the underlying technique.
If you are familiar with JavaScript, give it a go.
Regards, Peter

Related

Will Google search results reflect website text that I replace with a script? [duplicate]

I am just wondering if Google or other search engines execute JavaScript on your web page. For example, if you set the title tag using JavaScript, does the Google search engine see that?
There have been some experiments performed for SEO purposes which indicate that at least the big players (Google, for example) can and do follow some simple JavaScript. They avoid sneaky redirects and such, but some basic content manipulation does seem to get through. (I don't have a link handy for Google themselves confirming or denying this, it's just various posts I've come across when dealing with this before.)
However, this is generally considered unreliable. If SEO is being done for any important purpose, don't rely on the spiders indexing much dynamic content.
There's actually a very good (in my opinion, anyway) answer here to a very similar question. What I like about that answer is how it breaks down the steps for generating good, indexable, and best of all maintainable web pages with concerns properly separated. Adhering as much as possible to this process generally results in good SEO, good accessibility, and good design skills in general.
Yes, Google executes Javascript. How much is a moving target.
Google executed some Javascript as early as 2011: http://searchengineland.com/google-can-now-execute-ajax-javascript-for-indexing-99518
This article from 2012 documents some experiments on what Javascript Google did and did not run at the time: http://moz.com/ugc/can-google-really-access-content-in-javascript-really
In May 2014, Google said publicly that they execute Javascript: http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html Although that post says that Google has been getting better, there are no publicly available details on what Javascript Google does and does not execute -- but presumably they are at least as good at it as they were in 2012.
I'm pretty sure they don't. However, you can see for yourself: Google has a tool that will show you your page as it sees it, at http://www.google.com/webmasters/
If the text is in the on-page JavaScript, Google will see the text, but it will not be seen as the text of the title element.
But hey, this is quite easy to test. Just do it, wait two days, and then google your site with site:.... and look at what's in the headline. If it's in there, the answer is yes: Google sees it; if not, no, Google doesn't. It's easily testable.
(P.S.: my money is on no.)
We need to remember that JavaScript is a client-side language and always executes on the client. If all of your titles or content are produced via JavaScript, then they are output client-side, and I doubt they'll show up in Google search (whereas if they are output in the .html, then yes).
If I'm correct, as of late meta tags are "fuel for search engines" and have ties to SEO, since it's common for robots to be scripted to crawl your site.

SEO for an HTML single-page site via quasi-HTML content

Suppose I have a JavaScript-heavy single-page web application. My JavaScript renders the DOM directly from the model/data source (JSON).
I came up with an approach: generate simple HTML from the data source (on the backend). This HTML is required only for search engines to index. After the page loads, JavaScript replaces this quasi-HTML with the proper UI. The quasi-HTML can be removed from the layout with display:none to avoid a performance penalty in the browser.
Will it work?
I am also concerned about the legitimacy of the approach.
Thoughts?
It should work, giving the search engines content to crawl even if they don't read JavaScript. That said, bots evolve, and they read quite a bit of JavaScript nowadays: I've created a page that has only 2 sentences before onBeforeLoad and uses AJAX to get the rest of the content, and I see Google indexing a lot of the keywords delivered by AJAX. A problem would be misleading the search bot, like putting in content irrelevant to the rest of your page, something the bot might pick up at some point and penalize you for. "I am concerned about legitimacy of the approach": I wouldn't be. Keep your code valid and ride on.

Search engine indexing of single page applications

Alright, so I've been writing Backbone.js apps for over a year now and I love the framework model. I've learned how to avoid all the pitfalls and such, but there's one area where I'm still quite weak as a single-page app developer: how to do SEO for a public-facing app.
I'm working on a blog project, and the easiest solution to my mind is to have a server-generated list of all blog entries visible as a link from the /blog section that is rendered on page load, and to ensure that when hitting a /blog/:id URL, the server loads the blog content into the very first div on the page, which will be set as display:none.
My question is whether this would be sufficient for a good search engine index. SEO is still my weakest skill as a developer. Are there techniques for making sure a search engine crawls this content first and is able to use that content for its more complex indexing?
Also, is there a way to blacklist the generated app content on the page as I know Google has been testing crawling JavaScript apps? In my mind that could never be done at the level it needs to be without some sort of standard browser level event that can be triggered on a full page render or after all data has been loaded.
Anyways, this is more of an ambiguous ticket I know, but it could end up being useful to people in the future if we get a collection of good answers here.
Most of the major search engines (including Google) render the content they receive from the website, in our (Google's) case with something close to a headless browser, so whatever you do for the users, the search engines will also get it. Serving different stuff to search engines, however, will get you into a dangerous area named cloaking.
Hiding the content with a display:none might backfire on you. We are giving hidden content way less weight in ranking.

Okay to demand Javascript?

I know from a historical perspective that it has always been frowned upon not to degrade gracefully when a visitor with JavaScript disabled visits a site, but how relevant is this now?
I'm asking since an application I'm building is going to make heavy use of Javascript for charting and user interaction with the charts.
Most of the most popular sites on the web are rendered more or less useless without it enabled. I can't even login to twitter without it enabled, core bits of Facebook stop working and trying to visit Google maps just redirects to their search home page.
The kind of interaction I'm planning would be next to impossible to do any other way, and even if there was a way there isn't the resources to develop two versions of things.
The problem I have is that the site also needs to be as accessible as possible. It's an inherently visual site which makes this problematic to say the least. I'm not even sure there is a solution to this particular problem. If it's visual it's not going to be suitable for blind people.
This depends entirely on your needs and what you want your demographic to be. A lot of sites use exclusively Flash or Silverlight; does that make them poorly designed? No. All it means is that the site caters to a very specific audience.
As long as you accept that there may be an (albeit small) populace that isn't going to use the site (or can't, due to missing support), then it's fine.
Side note: there are means of conveying graphics other than JavaScript. Using server-side image generators is one way. You may also decide to use Flash/Silverlight plugins or ActiveX/applets. My advice would be to design the site as you see fit.
I would bet that JavaScript support these days is in the majority, so creating the site first and then working on the discrepancies later wouldn't be a bad avenue. You're still going to receive a high saturation of visitors just on the assumption that most clients support it.
Heck, look at sites using HTML5: it's not a standard, and no browser has to support it, but this still doesn't stop people from taking advantage of article, header, and footer elements. Granted, there is html5shiv to gracefully downgrade, but even that depends on JS being present.
Javascript has become so thoroughly integrated with web development that it is now users that browse without Javascript that have become a novelty.
What you should do is put up a friendly message that informs the user that the specific service your page provides (charts etc.) cannot be provided without JS enabled, and leave it at that.
What I would worry about is thoroughly detecting feature compatibility, because while people with JS disabled aren't that common, people with ancient browsers are.
Since you tagged this with accessibility, I will cover that perspective. As others have said, JavaScript is part of something that we live with. While it isn't expected that the application can be made fully accessible to people with visual impairments, you have to keep in mind other disabilities as well, such as mobility impairments. WebAIM has an article about what to keep in mind when you are coding JS events to make them as accessible as possible.
Depending on what these charts do, you should put the resulting data in an accessible table at the very least, either somewhere on the page or via a link to the data/table in proximity to the graph, so that it is easy to find for those who need it.
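A minimal sketch of pairing a chart with its data (the ids and figures are hypothetical):
<canvas id="salesChart" role="img" aria-describedby="salesData"></canvas>
<table id="salesData">
  <caption>Monthly sales (the same data the chart draws)</caption>
  <tr><th>Month</th><th>Sales</th></tr>
  <tr><td>January</td><td>120</td></tr>
  <tr><td>February</td><td>150</td></tr>
</table>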
You didn't specify who you are making this for, so I will give some more information. Gez Lemon wrote an article about not using <noscript>; his summary is:
I'm surprised that the noscript element is conforming in HTML5, as there are much better techniques for ensuring that pages work with or without JavaScript. Despite early accessibility advice advocating use of the noscript element, best practice is to use unobtrusive JavaScript for progressive enhancement, rather than relying on fallback content.
This sort of echoes what I began my answer with. However, if you are doing this for a US federal or state government agency, you must still use <noscript> tags because it is a requirement of Section 508. The information within the <noscript> tags should be useful. For example:
<script type="text/javascript">
// Do something cool
</script>
<noscript>
Please perform your search again using the <a>javascript-free form</a>
to get the results in a table.
</noscript>

Factoring in SEO on a Flash Site

There have been many debates about this topic already here, but none of them fully answered my question so I figured I would pose it and hope I get one or two decent answers.
We're planning on relaunching our company website in the next few months. Our current site, for the most part, is text-driven and because of this we rank very well on Google, Yahoo, and Bing for our primary keywords. We want to increase the "Wow Factor" of the site a bit (we're an interactive agency) but still maintain a majority of our search engine footing. The option to use Flash, AJAX, and other technologies that are not considered to be search engine friendly have come up numerous times in our meetings and each time we have to evaluate what kind of impact it would have on us from an SEO perspective.
Assuming a good portion of the site content will be encapsulated within a Flash (swf) file, what would be the best course of action for maintaining current rankings? I've read numerous times that Google indexes Flash files but I am unsure as to what extent. Further, is there a method of telling Google not to index a Flash file (through a variable or otherwise)?
Finally, I had an idea that seemed sound in theory and wanted to put it out into the world and see what type of feedback I receive on it:
Again, assuming the whole page is in a Flash file living on index.html, would it be possible to build out the site as normal (set up a logical directory structure, add content to static pages within said structure, etc), specify paths to those static pages in a Google XML Sitemap file, and have the spiders crawl only those pages (which are rich in content) while the user experiences some concoction of Flash/Javascript/AJAX/etc? If this works, what would be the pros/cons of this solution? Thanks for bearing with me on this slightly off-kilter question.
Well, referencing Google, I found that they have made impressive strides in indexing Flash-based web pages. The only limitation I found from reading the article is that they are currently still limited in these three areas:
1. Googlebot does not execute some types of JavaScript. So if your web page loads a Flash file via JavaScript, Google may not be aware of that Flash file, in which case it will not be indexed.
2. We currently do not attach content from external resources that are loaded by your Flash files. If your Flash file loads an HTML file, an XML file, another SWF file, etc., Google will separately index that resource, but it will not yet be considered to be part of the content in your Flash file.
3. While we are able to index Flash in almost all of the languages found on the web, currently there are difficulties with Flash content written in bidirectional languages. Until this is fixed, we will be unable to index Hebrew language or Arabic language content from Flash files.
By the sounds of it you won't have any problems with any of the 3 'problems'. Based on this document Flash sounds like a viable option for you.
Adobe has been working on their end as well to make SWFs more search-engine friendly. So with the combined efforts of both Adobe and Google/Yahoo, even if you take a dip in ranking, within a year or two the search algorithms will be better than they are now.
As far as not indexing goes, you should be able to add a simple
User-agent: *
Disallow: /directory/
Disallow: /directory/page.html
to your robots.txt file.
Andrew,
I've had to deal with this sort of thing a few times and I'd recommend maintaining both a Flash site (for users) and an HTML site (for search engines). Here's how you do it:
With whatever server-side stuff you're using, set up some kind of switch that determines whether a particular request is for HTML or for whatever your Flash movie consumes (XML, JSON, another SWF, whatever). Every page on your site should be able to return HTML and whatever you choose to feed your Flash movie. A query string parameter like "requestType=Flash" will work just fine.
Put all of the content in your HTML pages in a div tag and make the div invisible with CSS. Use SWFObject to check if the requesting browser supports Flash and, if it does, have SWFObject replace your HTML content with your Flash movie. Search engine spiders will ignore your scripts and simply crawl your HTML pages and if you'd like to show the HTML to users with browsers that don't support Flash (like mobile browsers), just make the HTML content visible after SWFObject has determined that the browser doesn't support Flash.
Once your Flash movie has loaded, have it request whatever data it needs from the server using the same URL of the page that it was loaded on, but with the addition of the switch variable above.
Handle navigation from that point on with SWFAddress. When a user clicks a button to request a new page, pass the request through SWFAddress first, which will update the browser history using the hash mark trick, and then have your Flash movie make its request to the server.
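A minimal sketch of the SWFObject part of this setup (assuming SWFObject 2.x; the file names and element id are hypothetical):
<div id="content">
  <p>Full HTML version of the page: crawlable, and shown to non-Flash browsers.</p>
</div>
<script src="swfobject.js"></script>
<script>
  // replaces #content with the movie only when Flash 9+ is present;
  // otherwise the HTML above stays in place for spiders and mobile browsers
  swfobject.embedSWF('site.swf?requestType=Flash', 'content', '800', '600', '9.0.0');
</script>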
I'm currently working on a site for a friend that uses this technique here (I should note, to protect my pride, that the site is still very much a work in progress):
http://www.casabarbuenosaires.com/
A browser request to any page on the site will first return the HTML representation of that page (you can view source in your browser to see that). SWFObject then replaces the HTML content with a Flash movie that loads a custom XML description of the same page which the Flash movie then constructs and displays.
I've worked on sites in the past that have used this technique and gotten excellent search engine results. Since you don't need to worry too much about what your HTML site looks like to humans, you can focus solely on what it looks like to search engines.
Another added benefit of building your site this way is that you are compelled to separate your site's content/copy from its visual representation. Throwing your entire site into a single SWF is generally NOT a good way to do that. It's much easier to maintain (or re-skin or scrap) a site when your content isn't all mixed up with your code.
Hope this helps,
Scott
